Are social media giants like Meta liable for addiction?

By IQ TIMES MEDIA | February 2, 2026


On the Monday, February 2, 2026, episode of The Excerpt podcast: A landmark trial asks whether social media giants like Meta knowingly designed addictive platforms, and if they can be held legally responsible. Clay Calvert, nonresident senior fellow at the American Enterprise Institute, joins USA TODAY’s The Excerpt to break down the case.

Hit play on the player below to hear the podcast and follow along with the transcript beneath it. This transcript was automatically generated, and then edited for clarity in its current form. There may be some differences between the audio and the text.

Dana Taylor:

For years now, parents, children, schools, researchers, and lawmakers have raised concerns about how social media affects young people, from mental health to how much time teens spend online. Now those concerns are being tested in court. As part of a series of lawsuits, a jury will for the first time hear arguments that platforms like TikTok, Meta, and YouTube didn’t just host harmful content but deliberately designed their products in ways that made them addictive, especially for young users.

Hello, and welcome to USA TODAY’s The Excerpt. I’m Dana Taylor. Today is Monday, February 2nd, 2026.

While some companies have already settled, others are pressing their case in what legal experts are calling a landmark trial. Will social media companies be persuaded to change their algorithms? For more on this, I’m joined now by Clay Calvert, nonresident senior fellow at the American Enterprise Institute. Thanks so much for coming on, Clay.

Clay Calvert:

No, thank you very much for having me. I appreciate the opportunity.

Dana Taylor:

While thousands of social media addiction lawsuits have been filed nationwide, courts are only sending a small number to trial this year, the so-called bellwether trials. These are test cases meant to guide how the remaining lawsuits are resolved. The first case is being called the KGM case to maintain the plaintiff’s anonymity. Can you tell me about that case, Clay?

Clay Calvert:

Sure. So KGM, as you said, is the bellwether test case, really, the first of more than 1,000 personal injury lawsuits that essentially allege that social media platforms were designed to hook minors to keep their eyeballs on social media, and that that process has caused them multiple harms.

In KGM’s case, she’s alleging anxiety, depression, body dysmorphia, and other injuries. So her case is the first one up. She’s a young adult now, but she claims that when she was very young, she started watching videos on YouTube, and then basically later used Instagram, ByteDance, and the other platforms.

Now, in this particular case, what’s important is that ByteDance, which is TikTok, and Snap, which is Snapchat, have settled. So all we have left are really the two biggest companies as we would think of them. We have Meta, which is Instagram, as a defendant, and we have Google, which is YouTube, as a defendant. So those are the two defendants in this case that KGM is suing.

Dana Taylor:

The courts have dismissed many similar lawsuits in the past. What legal arguments or evidence here convinced a judge to let these cases proceed?

Clay Calvert:

That’s a really good question. There’s a key question here, and this is what the judge said: “I’m going to leave it to the jury. I’m not going to decide this myself.” And the question really is, what caused the harm that KGM says she’s suffering?

Is it the content of the videos and the posts that she has seen and watched on social media platforms? Or is it defects, alleged defects in the design of the platforms themselves? And by that, we’re talking about things like endless scroll, notifications, algorithmic delivery of content, image filters, auto-play. Those are the types of characteristics or features that she’s claiming are defective and designed to hook minors.

So why is that difference important? Here’s why it’s important. If the harm was caused by the speech itself and not the alleged defects or features of the platforms, then something called Section 230 immunity kicks in.

Section 230 is a federal statute that was adopted 30 years ago back in 1996. Essentially what it does is it says that social media platforms are immune from civil liability for what we call third party content. That means user generated content or content posted by others.

So if it’s the content of others that she watched that caused her harm, say she saw videos of people who were abnormally thin, or whatever it is she’s going to say in this case, then Section 230, the federal statute, will kick in and provide immunity from civil liability for Google and Meta.

On the other hand, if it’s the features, those design features that cause the harm, and that’s what she’s arguing it is, then there’s a possibility that indeed Google and Meta will be held liable here for causing her the harm.

So the jury’s going to have a really difficult case. How do you parse out what really caused the harm? And so causation is going to be a key question in these cases and in hers in particular.

Dana Taylor:

Clay, how significant is this shift in theory for product liability law?

Clay Calvert:

That’s very important here. Essentially, what the plaintiffs are attempting to do is plead around Section 230. Section 230 has proven to be a very formidable barrier in terms of protecting social media platforms from liability based on content posted by others.

So what plaintiffs’ attorneys have done here, in this case and in other cases, is try to find end runs or theories around that, by focusing on the product itself. And there’s a real question here: are social media platforms really products, or are they services?

And what the judge in Los Angeles has done is say, “We’re going to allow negligence causes of action to go forward.” That these platforms were negligently designed, that the platforms breached a duty of care in designing them, and that breach of the duty of care caused the minors to suffer harm. And she’s also allowing a cause of action called negligent failure to warn, that social media platforms had a duty to warn minors of the potential harms.

So it’s very important that she is focusing, as you said, on the defects essentially of the services or products in question rather than on the content. And she’s letting the jury decide that question.

Dana Taylor:

Plaintiffs are borrowing language and strategy from tobacco and opioid litigation. From a legal historian’s perspective, where do those comparisons stand firm and where do they break down?

Clay Calvert:

So I think they stand firm in the idea that the plaintiffs’ attorneys in these cases really see a potential money train rolling through town, and the hope of massive settlements like those coming from opioid cases, as you suggested, JUUL vaping cases, and cigarette and tobacco litigation. So I think that’s one thing we see.

Those of course are products that are ingested that cause alleged addiction. This is a little bit different. Nobody ingests social media. So really what we’re dealing here differently is what we might think of as behavioral addiction. Essentially, that’s the allegation that’s being made.

But where I think the analogy really breaks down is that social media platforms are speech services or speech products. They convey speech and speech is protected in the United States by the First Amendment, as long as it doesn’t fall into an unprotected category like child pornography or obscenity that we’re not talking about in this case.

Meanwhile, tobacco, cigarettes, vaping, opioids, those don’t have any First Amendment-based speech protection. So the analogy holds true in part, but not on the big picture, because really what’s at issue here is speech, and that’s protected by the First Amendment. Cigarettes aren’t products protected by the First Amendment. So I think that’s where that analogy kind of breaks down a bit.

Dana Taylor:

You touched on this, but I want to make sure that I’m clear. One of the biggest hurdles for plaintiffs here is causation. What do they have to prove to convince a jury that platform design directly contributed to harming a user?

Clay Calvert:

And that’s the question: did it directly contribute, or was it a substantial factor in causing the harm? And so there are going to be fights really, I think, over which platforms were watched and for how long they were watched. Because, as I said, we’ve got two different defendants now left in this case. We’ve got Meta and Google. We’ve got YouTube and Instagram as the platforms in question.

There’s also a question of preexisting injury and other contributing factors that may have caused KGM problems. We don’t know. But I suspect one argument the defendants might make is that she had other issues already before she used social media platforms. Or think about things like the COVID pandemic, when people, minors especially, were stressed out at a stressful time. There may be multiple variables or factors that intervene or contribute to or cause the injuries that she says she’s suffering. So I think that’s really an important issue, as you pointed out: causation.

So it’s going to be tough for the plaintiffs to do. We’re going to see battles of the experts on causation, and that always gets tricky. Both sides will have experts saying that this caused it or this did not cause it. And then that’s going to be possibly confusing for jurors.

Jurors have to weigh the credibility of that evidence and say, “Okay, I find this expert credible. I don’t find that expert credible. I find KGM credible. I don’t find Zuckerberg credible.” Whatever it is. And that’s going to be difficult for them to parse all that out.

Dana Taylor:

As you mentioned, two companies, Snap and TikTok, chose to settle this first case rather than go to trial. Legally speaking, Clay, how did those settlements shape the remaining cases, even if they don’t create precedent on paper?

Clay Calvert:

That’s a really good point. I don’t think they shaped the remaining cases. It’s important to note that they really just settled this case, the bellwether case, the test case, KGM’s case, and none of the other thousands of cases, literally. The settlements don’t affect those other cases.

And we don’t know why Snap or TikTok decided to settle this case. Was there something unique about it? Did they figure, “Well, this is one we don’t want to get involved with,” or, “Maybe this is an easy one just to peel off here and settle”? So their settlements cover simply this one case.

They’re not settling all of these. We call it a consolidated joint proceeding. They’re not settling all of the cases, but merely the first one. So I don’t think it really has much precedential value. And again, we don’t really know why. It would be interesting to figure that out, but we just can’t do that, and I really don’t want to speculate about why.

Dana Taylor:

Social media companies warn that these cases could weaken longstanding legal protections for their platforms. Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri could both be called to testify. What else are they saying?

Clay Calvert:

This is going to be really huge, isn’t it? We’ve seen Mark Zuckerberg called to testify in front of Congress. This happened a couple of years ago. They call him up, they beat him up, right? They make him say things.

This is going to be big because, in a way, he’s a rock star who’s testifying now. And so how the jurors take that will be really interesting. It’s probably going to be hard to find somebody to be on that jury who has not heard of Mark Zuckerberg, but that’s not really the test. It’s whether they can set aside their beliefs about him, what they know about him.

But his testimony, I would think, would be very powerful, along with his credibility and how the jurors perceive him. Now, the judge said that he can be compelled to testify in this particular bellwether case, the KGM case.

But Judge Kuhl also made it very clear that compelling him to testify in all the thousands of cases would probably be unduly burdensome for the CEO and founder of Meta. So whatever he says in this case would clearly have implications, I think, for the other cases later on, because it would be, “Well, you said in the KGM case this about Meta’s safety practices and all the efforts that Meta and Instagram tried to take to protect minors, but how does that apply in this later case?”

So his testimony will be very important and it will have legs beyond this particular case because as I suggested, he’s not going to be testifying in all of the cases.

Dana Taylor:

Finally, is there a chance this trial could open the door to broader regulation through the courts, effectively doing what Congress has struggled, or some might say failed, to do legislatively?

Clay Calvert:

That’s a good point, because we think about two different fronts. We have litigation and legislation. How could people go after social media platforms if they want? Do they litigate or do they legislate? I think if there are a series of plaintiff verdicts here, and of course these will be appealed, that will be a long road through the appellate process.

But if there are a series of plaintiff verdicts here and they’re upheld on appeal, this will be huge. It could open the floodgates to more litigation, although of course right now we have thousands of cases already pending. There are cases filed by public school districts, there are cases filed by the attorneys general of states, there are cases filed by local governments. We’ve also seen cases, kind of adjacent to these, alleging that chatbots have caused harm to minors or caused minors to commit suicide.

So I think we’re really at a pivotal kind of inflection point right now where all of these cases are rising up and converging at one time. So it’s really an important moment in time in terms of free speech for social media platforms, free speech for minors to access it. And on the other hand, protecting minors from harms that social media platforms might cause.

Dana Taylor:

Clay, thank you so much for joining me on The Excerpt.

Clay Calvert:

Thank you very much for having me. I appreciate the opportunity.

Dana Taylor:

Thanks to our senior producer, Kaely Monahan, for her production assistance. Our executive producer is Laura Beatty. Let us know what you think of this episode by sending a note to podcasts@usatoday.com. Thanks for listening. I’m Dana Taylor. I’ll be back tomorrow morning with another episode of USA TODAY’s The Excerpt.

This article originally appeared on USA TODAY: Can Meta be held liable for addictive algorithms? | The Excerpt


