CBS News
Who is Michelle Troconis? What we know about suspect on trial for allegedly covering up Jennifer Dulos’ murder
Michelle Troconis, the woman accused in the mysterious killing of Jennifer Dulos in 2019, has pleaded not guilty to the myriad criminal charges brought against her for allegedly helping to cover up the Connecticut mother’s murder.
Troconis is currently on trial at Connecticut Superior Court in Stamford, as hearings that began in January trudge onward and the prosecution presents its case in meticulous detail. The trial was originally scheduled to end before March 1, but its slow pace so far, and the fact that there is no planned testimony for at least three weekdays in February, have cast doubt on whether that deadline can be met.
What happened to Jennifer Dulos?
Dulos vanished on May 24, 2019, after she was last seen dropping off her five children at school. She was at the time going through a turbulent divorce and custody battle with their father, her estranged husband Fotis Dulos, who was about $7 million in debt, according to police warrants. Fotis Dulos was the primary suspect in his wife’s killing until his death following a suicide attempt in 2020, shortly after his arrest for her murder.
When Jennifer Dulos went missing, Troconis was in a relationship with Fotis Dulos and living with her daughter at his home in Farmington, where his wife had previously lived. She became the focus of the case after his death and gained worldwide notoriety as “the other woman” as the scandal consumed headlines, later drawing more publicity as the subject of the 2021 Lifetime movie “Gone Mom.”
State police accused Fotis Dulos of driving an employee’s truck to New Canaan and riding a bicycle to Jennifer Dulos’ home on the day of her disappearance. They said he attacked his wife in the garage after she returned home from their children’s school before putting her back into her car and driving away. Authorities do not know what transpired after that. Jennifer Dulos’ SUV was found empty at a park in New Canaan about three miles from her house.
Was Jennifer Dulos ever found?
Despite searching that park and other locations, including her husband’s home, authorities were never able to find a body and Jennifer Dulos was later presumed dead. Police said they discovered enough bloodshed in her garage to suggest that she could not have survived injuries sustained in an attack there. They also said evidence suggested a potential attempt to clean up the scene.
What is Michelle Troconis accused of?
A 49-year-old dual citizen of the United States and Venezuela, Troconis is charged in Connecticut with conspiracy to commit murder, tampering with evidence and hindering prosecution for her alleged role in Jennifer Dulos’ death. Troconis has maintained her innocence since the investigation into Jennifer Dulos’ disappearance began, and was seen telling police in excerpts from early interrogation videos, “I can do whatever you want but I didn’t do it.” The footage was obtained by “48 Hours.”
The night of the killing, Fotis Dulos was captured on public surveillance cameras in Hartford disposing of garbage bags at multiple locations. He drove his truck to each stop and stepped out of it to throw the bags away, while Troconis sat inside.
The trash bags were collected days later by police, after they seized Fotis Dulos’ phone and used his location data along with Hartford video footage to find the disposal sites. Investigators said that inside the bags were clothes and zip ties, among other items, that had Jennifer Dulos’ DNA on them, and others that had her husband’s DNA. Some of the items found in those bags contained blood, according to police.
Authorities also said one of the recovered bags had DNA from Jennifer Dulos, Fotis Dulos and Troconis, who later claimed she was unaware of the contents of the bags being thrown away that night.
In the arrest warrant charging Troconis with conspiracy to commit murder, authorities accused her of giving different answers to police in three different interviews during which they asked her whether she had seen Fotis Dulos at home on the day Jennifer Dulos was killed. Fotis Dulos had planned that morning to meet with his friend and attorney Kent Mawhinney, who later told police that they did not meet in the end. Mawhinney was subsequently charged with conspiracy to commit murder as well, and, like Troconis, has pleaded not guilty. He is awaiting trial.
Who is Michelle Troconis’ attorney?
Troconis is being represented by Connecticut attorney Jon Schoenhorn, who requested, soon after Fotis Dulos’ death, that his client have a Spanish translator in court. Schoenhorn noted that Spanish is Troconis’ first language, though some observers viewed the request as a legal maneuver.
As the trial got underway, his arguments largely focused on the lack of physical evidence connecting Troconis to Jennifer Dulos’ murder. The defense has acknowledged that Troconis was in the truck with her former boyfriend as he disposed of garbage bags around Hartford that night, but Schoenhorn said she was mostly on her phone and not really paying attention, thinking that she and Fotis Dulos were “going to Starbucks.”
“This is really the trial of Fotis Dulos,” Schoenhorn said in court last week, as prosecutors spent recent weeks presenting to the jury detailed evidence collected from those garbage bags, items that, according to the defense attorney, did not contain Troconis’ DNA and bore only traces of blood. He said prosecutors first need to prove that Fotis Dulos murdered his wife in order to have grounds to convict Troconis on a conspiracy charge.
What’s going on in the criminal trial?
The prosecution has also called several witnesses. One of them, Pawel Gumienny, a Polish immigrant who worked for Fotis Dulos’ building company, described Fotis Dulos as “very hard-headed” and told the court that he and Troconis were upset with Jennifer Dulos at the time of her murder because she would not allow her children to visit the terminally ill family dog at their father’s home.
Gumienny testified that Troconis used aggressive language and slurs when talking about Jennifer Dulos in the context of that situation. “She said, ‘That ***** should be buried right next to the dog,’” Gumienny said. He also testified that, a few days after the disappearance, Troconis said of Jennifer Dulos, “I’m gonna kill that ******* ***** when she turns up.”
It was Gumienny’s vehicle that Fotis Dulos drove to New Canaan on the day his wife was last seen, and prosecutors believe that he deliberately shaved his head and adopted Gumienny’s style in order to turn suspicion on the worker. Prosecutors have said they do not believe Gumienny was involved in the killing, but he did receive immunity for his testimony.
Troconis reiterated that she is innocent in a post shared on X, formerly Twitter, in the months leading up to her trial.
“It’s been 4 1/2 years since I was wrongly accused,” Troconis wrote in October. “I have faith in Justice and the jury system.”
Her family has also publicly defended her. Weeks before the trial began, her parents released a statement asking “everyone involved, especially the investigators, to see the pain and injustice being inflicted upon Michelle and to reconsider the direction of their pursuit.”
“It’s agonizing for us to watch our innocent daughter suffer under the weight of a system that seems to be turning a blind eye to justice,” the family’s statement read. “Our family’s plea is simple: seek the truth and fairness that Michelle so rightly deserves.”
The Associated Press contributed to this report.
CBS News
Kenyan workers with AI jobs thought they had tickets to the future until the grim reality set in
Being overworked, underpaid, and ill-treated is not what Kenyan workers had in mind when they were lured by U.S. companies with jobs in AI.
Kenyan civil rights activist Nerima Wako-Ojiwa said the workers’ desperation, in a country with high unemployment, led to a culture of exploitation with unfair wages and no job security.
“It’s terrible to see just how many American companies are just doing wrong here,” Wako-Ojiwa said. “And it’s something that they wouldn’t do at home, so why do it here?”
Why tech giants come to Kenya
The familiar narrative is that artificial intelligence will take away human jobs, but right now it’s also creating jobs. There’s a growing global workforce of millions toiling to make AI run smoothly. It’s gruntwork that needs to be done accurately and fast. To do it cheaply, the work is often farmed out to developing countries like Kenya.
Nairobi, Kenya, is one of the main hubs for this kind of work. Kenya is desperate for jobs: the unemployment rate is as high as 67% among young people.
“The workforce is so large and desperate that they could pay whatever and have whatever working conditions, and they will have someone who will pick up that job,” Wako-Ojiwa said.
Every year, a million young people enter the job market, so the government has been courting tech giants like Microsoft, Google, Apple and Intel. Officials have promoted Kenya as a “Silicon Savannah” — tech savvy and digitally connected.
Kenyan President William Ruto has offered financial incentives on top of already lax labor laws to attract the tech companies.
What “humans in the loop” do with AI
Naftali Wambalo, a father of two with a college degree in mathematics, was elated to find work in Nairobi in the emerging field of artificial intelligence. He is what’s known as a “human in the loop”: someone sorting, labeling and sifting through reams of data to train and improve AI for companies like Meta, OpenAI, Microsoft and Google.
Wambalo and other digital workers spent eight hours a day in front of a screen studying photos and videos, drawing boxes around objects and labeling them, teaching AI algorithms to recognize them.
Human labelers tag cars and pedestrians to teach autonomous vehicles not to hit them. Humans circle abnormalities in CTs, MRIs and X-rays to teach AI to recognize diseases. Even as AI gets smarter, humans in the loop will always be needed because there will always be new devices and inventions that’ll need labeling.
Humans in the loop are found not only in Kenya, but also in India, the Philippines and Venezuela. They’re often countries with low wages and large populations that are well educated but unemployed.
Unfair labor practices
What seemed like a ticket to the future was quickly revealed to be anything but for some humans in the loop, who say they’ve been exploited. The jobs offer no stability: some contracts last only a few days, others a week or a month, Wako-Ojiwa said. She calls the workspaces AI sweatshops with computers instead of sewing machines.
The workers aren’t typically hired directly by the big tech companies – instead, they are employed by mostly American outsourcing companies.
The pay for humans in the loop is $1.50 to $2 an hour.
“And that is gross, before tax,” Wambalo said.
Wambalo, Nathan Nkunzimana and Fasica Berhane Gebrekidan were employed by SAMA, an American outsourcing company that hired for Meta and OpenAI. SAMA, based in the California Bay Area, employed over 3,000 workers in Kenya. Documents reviewed by 60 Minutes show OpenAI agreed to pay SAMA $12.50 an hour per worker, much more than the $2 the workers actually got, though SAMA says what it paid is a fair wage for the region.
Wambalo disagrees.
“If the big tech companies are going to keep doing this business, they have to do it the right way,” he said. “It’s not because you realize Kenya’s a third-world country, you say, ‘This job I would normally pay $30 in U.S., but because you are Kenya, $2 is enough for you.'”
Nkunzimana said he took the job because he has a family to feed.
Berhane Gebrekidan lived paycheck to paycheck, unable to save anything. She said she saw people who were fired for complaining.
“We were walking on eggshells,” she said.
They say SAMA pushed workers to complete assignments faster than the companies required, an allegation SAMA denies. If a six-month contract was finished in three months, workers could be left without pay for the remaining months. They said SAMA did, however, reward them for fast work.
“They used to say ‘thank you.’ They give you a bottle of soda and KFC chicken. Two pieces. And that is it,” Wambalo said.
Ephantus Kanyugi, Joan Kinyua, Joy Minayo, Michael Geoffrey Asia and Duncan Koech all worked for Remotasks, a click-work platform operated by Scale AI, another American AI training company facing criticism in Kenya. Workers signed up online and picked up tasks remotely, getting paid per task. They said they sometimes went unpaid.
“When it gets to the day before payday, they close the account and say that you violated a policy,” Kanyugi said.
Employees say they have no recourse or even a way to complain.
The company told 60 Minutes that any work done “in line with our community guidelines was paid out.” In March, as workers started complaining publicly, Remotasks abruptly shut down in Kenya, locking all workers out of their accounts.
The mental toll of AI training
Workers say some of the projects for Meta and OpenAI also caused them mental harm. Wambalo was assigned to train AI to recognize and weed out pornography, hate speech and excessive violence from social media. He had to sift through the worst of the worst content online for hours on end.
“I looked at people being slaughtered,” Wambalo said. “People engaging in sexual activity with animals. People abusing children physically, sexually. People committing suicide.”
Berhane Gebrekidan thought she’d been hired for a translation job, but she said what she ended up doing was reviewing content featuring dismembered bodies and drone attack victims.
“I find it hard now to even have conversations with people,” she said. “It’s just that I find it easier to cry than to speak.”
Wambalo said the material he had to review online has hurt his marriage.
“After countlessly seeing those sexual activities, pornography on the job, that I was doing, I hate sex,” he said.
SAMA says mental health counseling was provided by “fully-licensed professionals.” Workers say it was woefully inadequate.
“We want psychiatrists,” Wambalo said. “We want psychologists, qualified, who know exactly what we are going through and how they can help us to cope.”
Workers fight back
Wambalo and Berhane Gebrekidan are among around 200 digital workers suing SAMA and Meta over “unreasonable working conditions” that caused them psychological problems.
“It was proven by a psychiatrist that we are thoroughly sick,” Nathan Nkunzimana said. “We have gone through a psychiatric evaluation just a few months ago and it was proven that we are all sick, thoroughly sick.”
Wambalo said it’s the responsibility of the big tech companies to know how the jobs are impacting workers.
“They are the ones providing the work,” he said.
Berhane Gebrekidan feels the companies know the people they employ are struggling, but they don’t care.
“…Just because we’re Black, or just because we’re just vulnerable for now, that doesn’t give them the right to just exploit us like this,” she said.
Kenya does have labor laws, but they are outdated and don’t touch on digital labor, Wako-Ojiwa, the civil rights activist, said.
“I do think that our labor laws need to recognize it, but not just in Kenya alone,” Wako-Ojiwa said. “Because what happens is when we start to push back, in terms of protections of workers, a lot of these companies…they shut down and they move to a neighboring country.”
SAMA has terminated the harmful content projects Wambalo and Berhane Gebrekidan were working on. The company would not agree to an on-camera interview and neither would Scale AI, which operated the Remotasks website in Kenya.
Meta and OpenAI told 60 Minutes they’re committed to safe working conditions, including fair wages and access to mental health counseling.
CBS News
Labelers training AI say they’re overworked, underpaid and exploited by big American tech companies
The familiar narrative is that artificial intelligence will take away human jobs: machine-learning will let cars, computers and chatbots teach themselves – making us humans obsolete.
Well, that’s not very likely, and we’re gonna tell you why. There’s a growing global army of millions toiling to make AI run smoothly. They’re called “humans in the loop:” people sorting, labeling, and sifting reams of data to train and improve AI for companies like Meta, OpenAI, Microsoft and Google. It’s gruntwork that needs to be done accurately, fast, and – to do it cheaply – it’s often farmed out to places like Africa –
Naftali Wambalo: The robots or the machines, you are teaching them how to think like human, to do things like human.
We met Naftali Wambalo in Nairobi, Kenya, one of the main hubs for this kind of work. It’s a country desperate for jobs… because of an unemployment rate as high as 67% among young people. So Naftali, father of two, college educated with a degree in mathematics, was elated to finally find work in an emerging field: artificial intelligence.
Lesley Stahl: You were labeling.
Naftali Wambalo: I did labeling for videos and images.
Naftali and digital workers like him, spent eight hours a day in front of a screen studying photos and videos, drawing boxes around objects and labeling them, teaching the AI algorithms to recognize them.
Naftali Wambalo: You’d label, let’s say, furniture in a house. And you say “This is a TV. This is a microwave.” So you are teaching the AI to identify these items. And then there was one for faces of people. The color of the face. “If it looks like this, this is white. If it looks like this, it’s Black. This is Asian.” You’re teaching the AI to identify them automatically.
Humans tag cars and pedestrians to teach autonomous vehicles not to hit them. Humans circle abnormalities to teach AI to recognize diseases. Even as AI is getting smarter, humans in the loop will always be needed because there will always be new devices and inventions that’ll need labeling.
Lesley Stahl: You find these humans in the loop not only here in Kenya but in other countries thousands of miles from Silicon Valley. In India, the Philippines, Venezuela – often countries with large low wage populations – well educated but unemployed.
Nerima Wako-Ojiwa: Honestly, it’s like modern-day slavery. Because it’s cheap labor–
Lesley Stahl: Whoa. What do you –
Nerima Wako-Ojiwa: It’s cheap labor.
Like modern day slavery, says Nerima Wako-Ojiwa, a Kenyan civil rights activist, because big American tech companies come here and advertise the jobs as a ticket to the future. But really, she says, it’s exploitation.
Nerima Wako-Ojiwa: What we’re seeing is an inequality.
Lesley Stahl: It sounds so good. An AI job! Is there any job security?
Nerima Wako-Ojiwa: The contracts that we see are very short-term. And I’ve seen people who have contracts that are monthly, some of them weekly, some of them days. Which is ridiculous.
She calls the workspaces AI sweatshops with computers instead of sewing machines.
Nerima Wako-Ojiwa: I think that we’re so concerned with “creating opportunities,” but we’re not asking, “Are they good opportunities?”
Because every year a million young people enter the job market, the government has been courting tech giants like Microsoft, Google, Apple, and Intel to come here, promoting Kenya’s reputation as the Silicon Savannah: tech savvy and digitally connected.
Nerima Wako-Ojiwa: The president has been really pushing for opportunities in AI –
Lesley Stahl: President?
Nerima Wako-Ojiwa: Yes.
Lesley Stahl: Ruto?
Nerima Wako-Ojiwa: President Ruto. Yes. The president does have to create at least one million jobs a year the minimum. So it’s a very tight position to be in.
To lure the tech giants, Ruto has been offering financial incentives on top of already lax labor laws. But the workers aren’t hired directly by the big companies. They engage outsourcing firms – also mostly American – to hire for them.
Lesley Stahl: There’s a go-between.
Nerima Wako-Ojiwa: Yes.
Lesley Stahl: They hire? They pay.
Nerima Wako-Ojiwa: Uh-huh (affirm). I mean, they hire thousands of people.
Lesley Stahl: And they are protecting the Facebooks from having their names associated with this?
Nerima Wako-Ojiwa: Yes yes yes.
Lesley Stahl: We’re talking about the richest companies on Earth.
Nerima Wako-Ojiwa: Yes. But then they are paying people peanuts.
Lesley Stahl: AI jobs don’t pay much?
Nerima Wako-Ojiwa: They don’t pay well. They do not pay Africans well enough. And the workforce is so large and desperate that they could pay whatever, and have whatever working conditions, and they will have someone who will pick up that job.
Lesley Stahl: So what’s the average pay for these jobs?
Nerima Wako-Ojiwa: It’s about a $1.50, $2 an hour.
Naftali Wambalo: $2 per hour, and that is gross before tax.
Naftali, Nathan, and Fasica were hired by an American outsourcing company called SAMA, which employs over 3,000 workers here and hired for Meta and OpenAI. In documents we obtained, OpenAI agreed to pay SAMA $12.50 an hour per worker, much more than the $2 the workers actually got – though, SAMA says, that’s a fair wage for the region.
Naftali Wambalo: If the big tech companies are going to keep doing this– this business, they have to do it the right way. So it’s not because you realize Kenya’s a third-world country, you say, “This job I would normally pay $30 in U.S., but because you are Kenya $2 is enough for you.” That idea has to end.
Lesley Stahl: OK. $2 an hour in Kenya. Is that low, medium? Is it an OK salary?
Fasica: So for me, I was living paycheck to paycheck. And I have saved nothing because it’s not enough.
Lesley Stahl: Is it an insult?
Nathan: It is, of course. It is.
Fasica: It is.
Lesley Stahl: Why did you take the job?
Nathan: I have a family to feed. And instead of staying home, let me just at least have something to do.
And not only did the jobs not pay well – they were draining. They say deadlines were unrealistic, punitive – with often just seconds to complete complicated labeling tasks.
Lesley Stahl: Did you see people who were fired just ’cause they complained?
Fasica: Yes, we were walking on eggshells.
They were all hired per project and say SAMA kept pushing them to complete the work faster than the projects required, an allegation SAMA denies.
Lesley Stahl: Let’s say the contract for a certain job was six months, OK? What if you finished in three months? Does the worker get paid for those extra three months?
Male voice: No –
Fasica: KFC.
Lesley Stahl: What?
Fasica: We used to get KFC and Coca Cola.
Naftali Wambalo: They used to say thank you. They give you a bottle of soda and KFC chicken. Two pieces. And that is it.
Worse yet, workers told us that some of the projects for Meta and OpenAI were grim and caused them harm. Naftali was assigned to train AI to recognize and weed out pornography, hate speech and excessive violence, which meant sifting through the worst of the worst content online for hours on end.
Naftali Wambalo: I looked at people being slaughtered, people engaging in sexual activity with animals. People abusing children physically, sexually. People committing suicide.
Lesley Stahl: All day long?
Naftali Wambalo: Basically- yes, all day long. Eight hours a day, 40 hours a week.
The workers told us they were tricked into this work by ads like this that described these jobs as “call center agents” to “assist our clients’ community and help resolve inquiries empathetically.”
Fasica: I was told I was going to do a translation job.
Lesley Stahl: Exactly what was the job you were doing?
Fasica: I was basically reviewing content which are very graphic, very disturbing contents. I was watching dismembered bodies or drone attack victims. You name it. You know, whenever I talk about this, I still have flashbacks.
Lesley Stahl: Are any of you a different person than they were before you had this job?
Fasica: Yeah. I find it hard now to even have conversations with people. It’s just that I find it easier to cry than to speak.
Nathan: You continue isolating you– yourself from people. You don’t want to socialize with others. It’s you and it’s you alone.
Lesley Stahl: Are you a different person?
Naftali Wambalo: Yeah. I’m a different person. I used to enjoy my marriage, especially when it comes to bedroom fireworks. But after the job I hate sex.
Lesley Stahl: You hated sex?
Naftali Wambalo: After countlessly seeing those sexual activities pornography on the job that I was doing, I hate sex.
SAMA says mental health counseling was provided by quote “fully licensed professionals.” But the workers say it was woefully inadequate.
Naftali Wambalo: We want psychiatrists. We want psychologists, qualified, who know exactly what we are going through and how they can help us to cope.
Lesley Stahl: Trauma experts.
Naftali Wambalo: Yes.
Lesley Stahl: Do you think the big company, Facebook, ChatGPT, do you think they know how this is affecting the workers?
Naftali Wambalo: It’s their job to know. It’s their f***ing job to know, actually– because they are the ones providing the work.
These three and nearly 200 other digital workers are suing SAMA and Meta over “unreasonable working conditions” that caused psychiatric problems.
Nathan: It was proven by a psychiatrist that we are thoroughly sick. We have gone through a psychiatric evaluation just a few months ago and it was proven that we are all sick, thoroughly sick.
Fasica: They know that we’re damaged but they don’t care. We’re humans just because we’re black, or just because we’re just vulnerable for now, that doesn’t give them the right to just exploit us like this.
SAMA – which has terminated those projects – would not agree to an on-camera interview. Meta and OpenAI told us they’re committed to safe working conditions including fair wages and access to mental health counseling. Another American AI training company facing criticism in Kenya is Scale AI, which operates a website called Remotasks.
Lesley Stahl: Did you all work for Remotasks?
Group: Yes.
Lesley Stahl: Or work with them?
Ephantus, Joan, Joy, Michael, and Duncan signed up online, created accounts and clicked for work remotely, getting paid per task. Problem is: sometimes the company just didn’t pay them.
Ephantus: When it gets to the day before payday, they close the account and say that “You violated a policy.”
Lesley Stahl: They say, “You violated their policy.”
Voice: Yes.
Lesley Stahl: And they don’t pay you for the work you’ve done—
Ephantus: They don’t.
Lesley Stahl: Would you say that that’s almost common, that you do work and you’re not paid for it?
Joan: Yeah.
Lesley Stahl: And you have no recourse, you have no way to even complain?
Joan: There’s no way.
The company says any work that was done “in line with our community guidelines was paid out.” In March, as workers started complaining publicly, Remotasks abruptly shut down in Kenya altogether.
Lesley Stahl: There are no labor laws here?
Nerima Wako-Ojiwa: Our labor law is about 20 years old, it doesn’t touch on digital labor. I do think that our labor laws need to recognize it– but not just in Kenya alone. Because what happens is when we start to push back, in terms of protections of workers, a lot of these companies, they shut down and they move to a neighboring country.
Lesley Stahl: It’s easy to see how you’re trapped. Kenya is trapped: They need jobs so desperately that there’s a fear that if you complain, if your government complained, then these companies don’t have to come here.
Nerima Wako-Ojiwa: Yeah. And that’s what they throw at us all the time. And it’s terrible to see just how many American companies are just-just doing wrong here– just doing wrong here. And it’s something that they wouldn’t do at home, so why do it here?
Produced by Shachar Bar-On and Jinsol Jung. Broadcast associate, Aria Een. Edited by April Wilson.