TikTok agreed to settle a landmark social media addiction lawsuit just before the trial kicked off, the plaintiff’s attorneys confirmed.
The social video platform was one of three companies — along with Meta’s Instagram and Google’s YouTube — facing claims that their platforms deliberately addict and harm children.
A fourth company named in the lawsuit, Snapchat parent company Snap Inc., settled the case last week for an undisclosed sum.
Details of the settlement with TikTok were not disclosed, and the company did not immediately respond to a request for comment.
At the core of the case is a 19-year-old identified only by the initials “KGM,” whose case could determine how thousands of other, similar lawsuits against social media companies will play out.
She and two other plaintiffs have been selected for bellwether trials — essentially test cases for both sides to see how their arguments play out before a jury and what damages, if any, may be awarded, said Clay Calvert, a nonresident senior fellow of technology policy studies at the American Enterprise Institute.
Joseph VanZandt, co-lead counsel for the plaintiff, said in a statement Tuesday that TikTok remains a defendant in the other personal injury cases and that the trial against Meta and YouTube will proceed as scheduled.
Jury selection starts this week in the Los Angeles County Superior Court. It’s the first time the companies will argue their case before a jury, and the outcome could have profound effects on their businesses and how they will handle children using their platforms. The selection process is expected to take at least a few days, with 75 potential jurors questioned each day through at least Thursday.
“This was only the first case — there are hundreds of parents and school districts in the social media addiction trials that start today, and sadly, new families every day who are speaking out and bringing Big Tech to court for its deliberately harmful products,” said Sacha Haworth, executive director of the nonprofit Tech Oversight Project.
KGM claims that her use of social media from an early age addicted her to the technology and exacerbated depression and suicidal thoughts. Importantly, the lawsuit claims that this was done through deliberate design choices made by companies that sought to make their platforms more addictive to children to boost profits. This argument, if successful, could sidestep the companies’ First Amendment shield and Section 230, which protects tech companies from liability for material posted on their platforms.
“Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, Defendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue,” the lawsuit says.
Executives, including Meta CEO Mark Zuckerberg, are expected to testify at the trial, which is expected to last six to eight weeks. Experts have drawn comparisons to the Big Tobacco trials that led to a 1998 settlement requiring cigarette companies to pay billions in health care costs and restrict marketing targeting minors.
“Plaintiffs are not merely the collateral damage of Defendants’ products,” the lawsuit says. “They are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.”
The tech companies dispute the claims that their products deliberately harm children, citing a bevy of safeguards they have added over the years and arguing that they are not liable for content posted on their sites by third parties.
“Recently, a number of lawsuits have attempted to place the blame for teen mental health struggles squarely on social media companies,” Meta said in a recent blog post. “But this oversimplifies a serious issue. Clinicians and researchers find that mental health is a deeply complex and multifaceted issue, and trends regarding teens’ well-being aren’t clear-cut or universal. Narrowing the challenges faced by teens to a single factor ignores the scientific research and the many stressors impacting young people today, like academic pressure, school safety, socio-economic challenges and substance abuse.”
A Meta spokesperson said in a statement Monday that the company strongly disagrees with the allegations outlined in the lawsuit and that it’s “confident the evidence will show our longstanding commitment to supporting young people.”
José Castañeda, a Google spokesperson, said Monday that the allegations against YouTube are “simply not true.” In a statement, he said, “Providing young people with a safer, healthier experience has always been core to our work.”
The case will be the first in a slew of cases beginning this year that seek to hold social media companies responsible for harming children’s mental well-being. A federal bellwether trial beginning in June in Oakland, California, will be the first to represent school districts that have sued social media platforms over harms to children.
In addition, more than 40 state attorneys general have filed lawsuits against Meta, claiming it is harming young people and contributing to the youth mental health crisis by deliberately designing features on Instagram and Facebook that addict children to its platforms. Most of the attorneys general filed their lawsuits in federal court, but some sued in their respective states.
TikTok also faces similar lawsuits in more than a dozen states.
In New Mexico, meanwhile, jury selection begins next week for a trial on allegations that Meta and its social media platforms have failed to protect young users from sexual exploitation, following an undercover online investigation. Attorney General Raúl Torrez sued Meta and Zuckerberg in late 2023; Zuckerberg was later dropped from the suit.
Prosecutors have said that New Mexico is not seeking to hold Meta accountable for its content but rather for its role in pushing out that content through complex algorithms that proliferate potentially harmful material. They say they uncovered internal documents in which Meta employees estimated that about 100,000 children every day are subjected to sexual harassment on the company’s platforms.
Meta has said it uses sophisticated technology, hires child safety experts, reports content to the National Center for Missing and Exploited Children, and shares information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.