CBS News
Mayorkas says some migrants “try to game” the U.S. asylum system
El Paso, Texas — In an interview with CBS News, Homeland Security Secretary Alejandro Mayorkas said some migrants coming to the U.S.-Mexico border are trying to “game” the U.S. asylum system, echoing a statement often made by Republicans but rarely expressed by Biden administration officials.
“The reality is that some people do indeed try to game the system,” Mayorkas told CBS News in El Paso last Thursday. “That does not speak to everyone whom we encounter, but there is an element of it, and we deal with it accordingly.”
Mayorkas made the comment in response to a question about concerns some Americans have expressed about the situation at the southern border, where U.S. officials have reported record levels of migrant apprehensions over the past three years. Immigration has become one of President Biden’s worst-polling issues, as well as a top concern for voters heading into November’s presidential election.
For years, Republicans, including former President Donald Trump, have accused migrants of cheating or abusing the U.S. asylum process to stay in the country indefinitely, arguing that restrictions or bans on asylum need to be enacted to deter those who don’t qualify from filing weak or non-existent cases.
When speaking of reforming the U.S. asylum system, however, Democrats and Biden administration officials like Mayorkas have mainly talked about the need to speed up the processing of claims, to quickly grant asylum to those who qualify for protection and deport those who don’t.
U.S. law allows migrants physically on American soil to request asylum, even if they enter the country unlawfully. But applicants have to prove they are fleeing persecution because of their nationality, race, religion, political views or membership in a social group. Many migrants who initially apply for asylum are ultimately unable to meet the legal threshold to receive it, government figures show.
During the interview last week, Mayorkas said a border security proposal he helped broker with a small bipartisan group of lawmakers in the Senate “would have equipped us with more tools to deal with those individuals who seek to game the system.”
The legislation, which has collapsed twice due to insufficient Republican support, would raise the threshold for passing initial asylum interviews and create a presidential power to shut down asylum processing in between ports of entry when illegal border crossings soar.
“We would drive traffic to our ports of entry in an orderly way,” Mayorkas said about the bill, which would preserve asylum processing at official border crossings even when the presidential “shutdown” authority is invoked.
The Biden administration and Mayorkas have faced a tidal wave of criticism from Republican lawmakers over the unprecedented levels of migration to the U.S. southern border in recent years. In March, Mayorkas became the first Cabinet official to be impeached since the 1870s, when House Republicans accused him of breaching the public’s trust and failing to fully enforce federal immigration laws.
Mayorkas said the accusation that Biden administration policy has encouraged desperate migrants to journey to the U.S. is “false.”
“The reasons why people leave their countries of origin are those with which we are quite familiar: extraordinary poverty, violence, extreme weather events, corruption, suppression by authoritarian regimes. Those reasons and more,” Mayorkas said.
While Mr. Biden came into office vowing to “restore” the asylum system, his administration has embraced some limits on asylum, including a rule that presumes migrants are ineligible for refuge if they failed to seek protection in a third country. Mr. Biden is also considering an executive action that would attempt to suspend asylum processing when there’s an influx in illegal border entries.
Migrant crossings along the U.S.-Mexico border have dropped by more than 50% this month from the all-time highs set in December, a trend American officials attribute mainly to Mexico’s efforts to stop migrants and to increased deportations by the Biden administration.
Kenyan workers with AI jobs thought they had tickets to the future until the grim reality set in
Being overworked, underpaid, and ill-treated is not what Kenyan workers had in mind when they were lured by U.S. companies with jobs in AI.
Kenyan civil rights activist Nerima Wako-Ojiwa said the workers’ desperation, in a country with high unemployment, led to a culture of exploitation with unfair wages and no job security.
“It’s terrible to see just how many American companies are just doing wrong here,” Wako-Ojiwa said. “And it’s something that they wouldn’t do at home, so why do it here?”
Why tech giants come to Kenya
The familiar narrative is that artificial intelligence will take away human jobs, but right now it’s also creating jobs. There’s a growing global workforce of millions toiling to make AI run smoothly. It’s gruntwork that needs to be done accurately and fast. To do it cheaply, the work is often farmed out to developing countries like Kenya.
Nairobi, Kenya, is one of the main hubs for this kind of work. Kenya is desperate for jobs: the unemployment rate among young people is as high as 67%.
“The workforce is so large and desperate that they could pay whatever and have whatever working conditions, and they will have someone who will pick up that job,” Wako-Ojiwa said.
Every year, a million young people enter the job market, so the government has been courting tech giants like Microsoft, Google, Apple and Intel. Officials have promoted Kenya as a “Silicon Savannah” — tech savvy and digitally connected.
Kenyan President William Ruto has offered financial incentives on top of already lax labor laws to attract the tech companies.
What “humans in the loop” do with AI
Naftali Wambalo, a father of two with a college degree in mathematics, was elated to find work in Nairobi in the emerging field of artificial intelligence. He is what’s known as a “human in the loop”: someone sorting, labeling and sifting through reams of data to train and improve AI for companies like Meta, OpenAI, Microsoft and Google.
Wambalo and other digital workers spent eight hours a day in front of a screen studying photos and videos, drawing boxes around objects and labeling them, teaching AI algorithms to recognize them.
Human labelers tag cars and pedestrians to teach autonomous vehicles not to hit them. Humans circle abnormalities in CTs, MRIs and X-rays to teach AI to recognize diseases. Even as AI gets smarter, humans in the loop will always be needed because there will always be new devices and inventions that’ll need labeling.
Humans in the loop are found not only in Kenya, but also in India, the Philippines and Venezuela. These are often low-wage countries with large populations that are well educated but unemployed.
Unfair labor practices
What seemed like a ticket to the future was quickly revealed to be anything but for some humans in the loop, who say they’ve been exploited. The jobs offer no stability: some contracts last only a few days, others a week or a month, Wako-Ojiwa said. She calls the workspaces AI sweatshops with computers instead of sewing machines.
The workers aren’t typically hired directly by the big tech companies – instead, they are employed by mostly American outsourcing companies.
The pay for humans in the loop is $1.50 to $2 an hour.
“And that is gross, before tax,” Wambalo said.
Wambalo, Nathan Nkunzimana and Fasica Berhane Gebrekidan were employed by SAMA, an American outsourcing company that hired for Meta and OpenAI. SAMA, based in California’s Bay Area, employed over 3,000 workers in Kenya. Documents reviewed by 60 Minutes show OpenAI agreed to pay SAMA $12.50 an hour per worker, much more than the $2 the workers actually got, though SAMA says what it paid is a fair wage for the region.
Wambalo disagrees.
“If the big tech companies are going to keep doing this business, they have to do it the right way,” he said. “It’s not because you realize Kenya’s a third-world country, you say, ‘This job I would normally pay $30 in U.S., but because you are Kenya, $2 is enough for you.'”
Nkunzimana said he took the job because he has a family to feed.
Berhane Gebrekidan lived paycheck to paycheck, unable to save anything. She said she saw people who were fired for complaining.
“We were walking on eggshells,” she said.
They say SAMA pushed workers to complete assignments faster than the client companies required, an allegation SAMA denies. If a six-month contract was completed in three months, workers could be left without pay for the remaining months. They did say SAMA would reward them for fast work.
“They used to say ‘thank you.’ They give you a bottle of soda and KFC chicken. Two pieces. And that is it,” Wambalo said.
Ephantus Kanyugi, Joan Kinyua, Joy Minayo, Michael Geoffrey Asia and Duncan Koech all worked for Remotasks, a click-work platform operated by Scale AI, another American AI training company facing criticism in Kenya. Workers signed up online and selected remote work, getting paid per task. They said they sometimes went unpaid.
“When it gets to the day before payday, they close the account and say that you violated a policy,” Kanyugi said.
Employees say they have no recourse or even a way to complain.
The company told 60 Minutes that any work done “in line with our community guidelines was paid out.” In March, as workers started complaining publicly, Remotasks abruptly shut down in Kenya, locking all workers out of their accounts.
The mental toll of AI training
Workers say some of the projects for Meta and OpenAI also caused them mental harm. Wambalo was assigned to train AI to recognize and weed out pornography, hate speech and excessive violence from social media. He had to sift through the worst of the worst content online for hours on end.
“I looked at people being slaughtered,” Wambalo said. “People engaging in sexual activity with animals. People abusing children physically, sexually. People committing suicide.”
Berhane Gebrekidan thought she’d been hired for a translation job, but she said what she ended up doing was reviewing content featuring dismembered bodies and drone attack victims.
“I find it hard now to even have conversations with people,” she said. “It’s just that I find it easier to cry than to speak.”
Wambalo said the material he had to review online has hurt his marriage.
“After countlessly seeing those sexual activities, pornography on the job, that I was doing, I hate sex,” he said.
SAMA says mental health counseling was provided by “fully-licensed professionals.” Workers say it was woefully inadequate.
“We want psychiatrists,” Wambalo said. “We want psychologists, qualified, who know exactly what we are going through and how they can help us to cope.”
Workers fight back
Wambalo and Berhane Gebrekidan are among around 200 digital workers suing SAMA and Meta over “unreasonable working conditions” that caused them psychological problems.
“It was proven by a psychiatrist that we are thoroughly sick,” Nathan Nkunzimana said. “We have gone through a psychiatric evaluation just a few months ago and it was proven that we are all sick, thoroughly sick.”
Wambalo said it’s the responsibility of the big tech companies to know how the jobs are impacting workers.
“They are the ones providing the work,” he said.
Berhane Gebrekidan feels the companies know the people they employ are struggling, but they don’t care.
“…Just because we’re Black, or just because we’re just vulnerable for now, that doesn’t give them the right to just exploit us like this,” she said.
Kenya does have labor laws, but they are outdated and don’t touch on digital labor, Wako-Ojiwa, the civil rights activist, said.
“I do think that our labor laws need to recognize it, but not just in Kenya alone,” Wako-Ojiwa said. “Because what happens is when we start to push back, in terms of protections of workers, a lot of these companies…they shut down and they move to a neighboring country.”
SAMA has terminated the harmful content projects Wambalo and Berhane Gebrekidan were working on. The company would not agree to an on-camera interview and neither would Scale AI, which operated the Remotasks website in Kenya.
Meta and OpenAI told 60 Minutes they’re committed to safe working conditions, including fair wages and access to mental health counseling.
Labelers training AI say they’re overworked, underpaid and exploited by big American tech companies
The familiar narrative is that artificial intelligence will take away human jobs: machine-learning will let cars, computers and chatbots teach themselves – making us humans obsolete.
Well, that’s not very likely, and we’re gonna tell you why. There’s a growing global army of millions toiling to make AI run smoothly. They’re called “humans in the loop”: people sorting, labeling, and sifting reams of data to train and improve AI for companies like Meta, OpenAI, Microsoft and Google. It’s gruntwork that needs to be done accurately and fast, and, to do it cheaply, it’s often farmed out to places like Africa.
Naftali Wambalo: The robots or the machines, you are teaching them how to think like human, to do things like human.
We met Naftali Wambalo in Nairobi, Kenya, one of the main hubs for this kind of work. It’s a country desperate for jobs… because of an unemployment rate as high as 67% among young people. So Naftali, father of two, college educated with a degree in mathematics, was elated to finally find work in an emerging field: artificial intelligence.
Lesley Stahl: You were labeling.
Naftali Wambalo: I did labeling for videos and images.
Naftali and digital workers like him, spent eight hours a day in front of a screen studying photos and videos, drawing boxes around objects and labeling them, teaching the AI algorithms to recognize them.
Naftali Wambalo: You’d label, let’s say, furniture in a house. And you say “This is a TV. This is a microwave.” So you are teaching the AI to identify these items. And then there was one for faces of people. The color of the face. “If it looks like this, this is white. If it looks like this, it’s Black. This is Asian.” You’re teaching the AI to identify them automatically.
Humans tag cars and pedestrians to teach autonomous vehicles not to hit them. Humans circle abnormalities to teach AI to recognize diseases. Even as AI is getting smarter, humans in the loop will always be needed because there will always be new devices and inventions that’ll need labeling.
Lesley Stahl: You find these humans in the loop not only here in Kenya but in other countries thousands of miles from Silicon Valley. In India, the Philippines, Venezuela – often countries with large low wage populations – well educated but unemployed.
Nerima Wako-Ojiwa: Honestly, it’s like modern-day slavery. Because it’s cheap labor–
Lesley Stahl: Whoa. What do you –
Nerima Wako-Ojiwa: It’s cheap labor.
Like modern day slavery, says Nerima Wako-Ojiwa, a Kenyan civil rights activist, because big American tech companies come here and advertise the jobs as a ticket to the future. But really, she says, it’s exploitation.
Nerima Wako-Ojiwa: What we’re seeing is an inequality.
Lesley Stahl: It sounds so good. An AI job! Is there any job security?
Nerima Wako-Ojiwa: The contracts that we see are very short-term. And I’ve seen people who have contracts that are monthly, some of them weekly, some of them days. Which is ridiculous.
She calls the workspaces AI sweatshops with computers instead of sewing machines.
Nerima Wako-Ojiwa: I think that we’re so concerned with “creating opportunities,” but we’re not asking, “Are they good opportunities?”
Because every year a million young people enter the job market, the government has been courting tech giants like Microsoft, Google, Apple, and Intel to come here, promoting Kenya’s reputation as the Silicon Savannah: tech savvy and digitally connected.
Nerima Wako-Ojiwa: The president has been really pushing for opportunities in AI –
Lesley Stahl: President?
Nerima Wako-Ojiwa: Yes.
Lesley Stahl: Ruto?
Nerima Wako-Ojiwa: President Ruto. Yes. The president does have to create at least one million jobs a year the minimum. So it’s a very tight position to be in.
To lure the tech giants, Ruto has been offering financial incentives on top of already lax labor laws. But the workers aren’t hired directly by the big companies. They engage outsourcing firms – also mostly American – to hire for them.
Lesley Stahl: There’s a go-between.
Nerima Wako-Ojiwa: Yes.
Lesley Stahl: They hire? They pay.
Nerima Wako-Ojiwa: Uh-huh (affirm). I mean, they hire thousands of people.
Lesley Stahl: And they are protecting the Facebooks from having their names associated with this?
Nerima Wako-Ojiwa: Yes yes yes.
Lesley Stahl: We’re talking about the richest companies on Earth.
Nerima Wako-Ojiwa: Yes. But then they are paying people peanuts.
Lesley Stahl: AI jobs don’t pay much?
Nerima Wako-Ojiwa: They don’t pay well. They do not pay Africans well enough. And the workforce is so large and desperate that they could pay whatever, and have whatever working conditions, and they will have someone who will pick up that job.
Lesley Stahl: So what’s the average pay for these jobs?
Nerima Wako-Ojiwa: It’s about a $1.50, $2 an hour.
Naftali Wambalo: $2 per hour, and that is gross before tax.
Naftali, Nathan, and Fasica were hired by an American outsourcing company called SAMA, which employs over 3,000 workers here and hired for Meta and OpenAI. In documents we obtained, OpenAI agreed to pay SAMA $12.50 an hour per worker, much more than the $2 the workers actually got – though, SAMA says, that’s a fair wage for the region.
Naftali Wambalo: If the big tech companies are going to keep doing this– this business, they have to do it the right way. So it’s not because you realize Kenya’s a third-world country, you say, “This job I would normally pay $30 in U.S., but because you are Kenya $2 is enough for you.” That idea has to end.
Lesley Stahl: OK. $2 an hour in Kenya. Is that low, medium? Is it an OK salary?
Fasica: So for me, I was living paycheck to paycheck. And I have saved nothing because it’s not enough.
Lesley Stahl: Is it an insult?
Nathan: It is, of course. It is.
Fasica: It is.
Lesley Stahl: Why did you take the job?
Nathan: I have a family to feed. And instead of staying home, let me just at least have something to do.
And not only did the jobs not pay well – they were draining. They say deadlines were unrealistic, punitive – with often just seconds to complete complicated labeling tasks.
Lesley Stahl: Did you see people who were fired just ’cause they complained?
Fasica: Yes, we were walking on eggshells.
They were all hired per project and say SAMA kept pushing them to complete the work faster than the projects required, an allegation SAMA denies.
Lesley Stahl: Let’s say the contract for a certain job was six months, OK? What if you finished in three months? Does the worker get paid for those extra three months?
Male voice: No –
Fasica: KFC.
Lesley Stahl: What?
Fasica: We used to get KFC and Coca Cola.
Naftali Wambalo: They used to say thank you. They give you a bottle of soda and KFC chicken. Two pieces. And that is it.
Worse yet, workers told us that some of the projects for Meta and OpenAI were grim and caused them harm. Naftali was assigned to train AI to recognize and weed out pornography, hate speech and excessive violence, which meant sifting through the worst of the worst content online for hours on end.
Naftali Wambalo: I looked at people being slaughtered, people engaging in sexual activity with animals. People abusing children physically, sexually. People committing suicide.
Lesley Stahl: All day long?
Naftali Wambalo: Basically- yes, all day long. Eight hours a day, 40 hours a week.
The workers told us they were tricked into this work by ads like this that described these jobs as “call center agents” to “assist our clients’ community and help resolve inquiries empathetically.”
Fasica: I was told I was going to do a translation job.
Lesley Stahl: Exactly what was the job you were doing?
Fasica: I was basically reviewing content which are very graphic, very disturbing contents. I was watching dismembered bodies or drone attack victims. You name it. You know, whenever I talk about this, I still have flashbacks.
Lesley Stahl: Are any of you a different person than they were before you had this job?
Fasica: Yeah. I find it hard now to even have conversations with people. It’s just that I find it easier to cry than to speak.
Nathan: You continue isolating you– yourself from people. You don’t want to socialize with others. It’s you and it’s you alone.
Lesley Stahl: Are you a different person?
Naftali Wambalo: Yeah. I’m a different person. I used to enjoy my marriage, especially when it comes to bedroom fireworks. But after the job I hate sex.
Lesley Stahl: You hated sex?
Naftali Wambalo: After countlessly seeing those sexual activities pornography on the job that I was doing, I hate sex.
SAMA says mental health counseling was provided by quote “fully licensed professionals.” But the workers say it was woefully inadequate.
Naftali Wambalo: We want psychiatrists. We want psychologists, qualified, who know exactly what we are going through and how they can help us to cope.
Lesley Stahl: Trauma experts.
Naftali Wambalo: Yes.
Lesley Stahl: Do you think the big company, Facebook, ChatGPT, do you think they know how this is affecting the workers?
Naftali Wambalo: It’s their job to know. It’s their f***ing job to know, actually– because they are the ones providing the work.
These three and nearly 200 other digital workers are suing SAMA and Meta over “unreasonable working conditions” that caused psychiatric problems.
Nathan: It was proven by a psychiatrist that we are thoroughly sick. We have gone through a psychiatric evaluation just a few months ago and it was proven that we are all sick, thoroughly sick.
Fasica: They know that we’re damaged but they don’t care. We’re humans just because we’re black, or just because we’re just vulnerable for now, that doesn’t give them the right to just exploit us like this.
SAMA – which has terminated those projects – would not agree to an on-camera interview. Meta and OpenAI told us they’re committed to safe working conditions including fair wages and access to mental health counseling. Another American AI training company facing criticism in Kenya is Scale AI, which operates a website called Remotasks.
Lesley Stahl: Did you all work for Remotasks?
Group: Yes.
Lesley Stahl: Or work with them?
Ephantus, Joan, Joy, Michael, and Duncan signed up online, created accounts and clicked for work remotely, getting paid per task. Problem is: sometimes the company just didn’t pay them.
Ephantus: When it gets to the day before payday, they close the account and say that “You violated a policy.”
Lesley Stahl: They say, “You violated their policy.”
Voice: Yes.
Lesley Stahl: And they don’t pay you for the work you’ve done—
Ephantus: They don’t.
Lesley Stahl: Would you say that that’s almost common, that you do work and you’re not paid for it?
Joan: Yeah.
Lesley Stahl: And you have no recourse, you have no way to even complain?
Joan: There’s no way.
The company says any work that was done “in line with our community guidelines was paid out.” In March, as workers started complaining publicly, Remotasks abruptly shut down in Kenya altogether.
Lesley Stahl: There are no labor laws here?
Nerima Wako-Ojiwa: Our labor law is about 20 years old, it doesn’t touch on digital labor. I do think that our labor laws need to recognize it– but not just in Kenya alone. Because what happens is when we start to push back, in terms of protections of workers, a lot of these companies, they shut down and they move to a neighboring country.
Lesley Stahl: It’s easy to see how you’re trapped. Kenya is trapped: They need jobs so desperately that there’s a fear that if you complain, if your government complained, then these companies don’t have to come here.
Nerima Wako-Ojiwa: Yeah. And that’s what they throw at us all the time. And it’s terrible to see just how many American companies are just-just doing wrong here– just doing wrong here. And it’s something that they wouldn’t do at home, so why do it here?
Produced by Shachar Bar-On and Jinsol Jung. Broadcast associate, Aria Een. Edited by April Wilson.