Story Snapshot
What Happened
A jury found Meta and YouTube liable for addictive design that harmed a young woman who started using their platforms as a child, awarding $6 million in damages.
Who It Affects
Any family with a child or teen using Instagram, YouTube, TikTok, or Snapchat — and the 2,400+ families with similar cases already filed.
Why It Matters
For the first time, a jury treated social media apps the way courts treat faulty brakes or toxic chemicals — as defective products. That legal playbook now applies to thousands of pending cases.

Not the Dollar Amount. The Door It Opens.

Six million dollars.

That’s what a Los Angeles jury decided Meta and YouTube owe a young woman named Kaley — now 20 — who started using YouTube at age 6 and Instagram at age 9. She developed depression, body dysmorphia, anxiety, and self-harm behaviors. She told the court she was on social media “all day long” as a child and that she didn’t experience body dysmorphia before using beauty filters.

Six million dollars sounds like a lot until you remember that Meta is worth roughly $1.5 trillion. Proportionally, that’s like fining someone with a million-dollar salary about four dollars. Meta’s stock actually went up after the verdict.

So why does this matter?

Because the money was never the point. The point is that for the first time in history, a jury looked at Instagram and YouTube — not at anything anyone posted on them, but at how the apps themselves are built — and said: these are defective products.

That distinction just cracked open a door that 2,400 other lawsuits are about to walk through.

What Happened

On March 25, 2026, a jury in Los Angeles found Meta (Instagram) and Google (YouTube) negligent and liable for harming K.G.M. — the plaintiff identified as Kaley, from Chico, California. The jury awarded $3 million in compensatory damages and $3 million in punitive damages. Meta was assigned 70% of the liability. YouTube got 30%.

The jury deliberated for nine days — over 43 hours — and returned a 10-2 verdict on every question. They found both companies acted with “malice, oppression, or fraud.” That last part matters. Punitive damages aren’t just compensation for what happened to Kaley. They’re the jury saying: you knew what you were doing, and you did it anyway.

Some Context

This wasn't a lawsuit about a specific post or video. The entire legal strategy was built on product liability — the same framework used to sue car companies over faulty brakes and pharmaceutical companies over dangerous drugs. The argument: features like infinite scroll, autoplay, beauty filters, push notifications, and algorithmic recommendations are defective product features, not protected speech. Multiple courts have now ruled that Section 230 — the 1996 law that historically shielded tech companies from lawsuits — does not apply to claims about platform design.

The specific features the jury found harmful read like a checklist of everything that makes these apps hard to put down: infinite scroll with no natural stopping point, autoplay that queues the next video before you’ve finished deciding whether to watch it, cosmetic filters that digitally reshape your face in real time, push notifications timed to pull you back when your attention drifts, “likes” and variable-reward feedback loops that work on the same principle as slot machines, and algorithmic recommendation engines that learn what keeps you scrolling and serve you more of it.

Kaley’s attorney, Mark Lanier — the same lawyer who won multibillion-dollar verdicts in the Johnson & Johnson baby powder litigation — described the platforms as a “pocket-sized casino.” He conceded the $6 million award was lower than he’d hoped but noted that early bellwether plaintiffs in mass tort cases (like tobacco) often win modest amounts before the real momentum builds. Easier to defend on appeal, he said. Less flashy. More durable.

And this wasn’t even the only verdict this week. The day before the LA decision, a New Mexico jury ordered Meta to pay $375 million for violating state consumer protection laws and failing to protect children from sexual exploitation on Facebook and Instagram.

That case originated from a 2023 undercover operation where fake profiles of 13-year-olds were flooded with sexually explicit material. A second phase begins May 4, where New Mexico will ask the court to order actual design changes — real age verification, algorithm modifications, and an independent monitor.

Two verdicts. Two days. Two different legal theories. Both landing on the same conclusion: Meta knew, and Meta didn’t fix it.

What Came Out at Trial

The internal documents revealed during the trial are what should make parents pay attention. Not because they’re surprising — most of us already suspected this — but because now it’s on the record, under oath, in Meta’s own words.

An internal Meta document stated: “If we wanna win big with teens, we must bring them in as tweens.” A separate 2020 internal study showed that 11-year-olds were four times as likely to keep coming back to Meta’s apps compared to older users. Internal engagement targets set by Meta called for users to spend 40 minutes per day on the platform in 2023, rising to 42 in 2024, 44 in 2025, and 46 in 2026. Every year, the goal goes up.

Mark Zuckerberg testified in person — his first-ever jury testimony. He was confronted with a 2015 internal review estimating that more than 4 million children under 13 were using Instagram, despite the platform’s stated minimum age of 13. His response: “I always wish that we could have gotten there sooner.”

Former Meta public policy head Nick Clegg’s internal email was read aloud to the jury: “The fact that we say we don’t allow under-13s on our platform, yet have no way of enforcing it, is just indefensible.”

Four million kids under 13 on a platform that requires you to be 13. And the company's own head of public policy called it "indefensible." Under oath.

When confronted with evidence that 18 external experts Meta had consulted raised concerns about beauty filters causing harm to teens, Zuckerberg testified he had a “high bar” for restricting features that limit expression. The filters, he said, “aren’t massively popular features.” They remain available on Instagram today. Users just have to search for them now instead of having them recommended.

Instagram head Adam Mosseri testified that while “problematic use” of Instagram is “real,” he disagreed with the term “addiction.”

Helpful.

What It Actually Means

For parents, the implications land in two places: what the legal system is doing, and what you can actually do right now. They’re very different conversations.

What to Know Right Now
⚖️
This verdict is a template, not a conclusion.

There are approximately 2,407 pending lawsuits in the federal MDL (that's "multidistrict litigation" — thousands of similar cases grouped under one judge). More than 10,000 individual injury cases, nearly 800 school district lawsuits, and actions from 41+ state attorneys general. The legal strategy that won — suing over design, not content — is the playbook all of them will follow.

📅
Federal trials start this summer.

Six school district cases from Maryland, Georgia, Kentucky, New Jersey, South Carolina, and Arizona have been selected as federal bellwethers. A second California state trial (R.K.C. v. Meta) is also scheduled for summer 2026. California AG Rob Bonta has his own trial set for August.

🏛️
Congress is trying — but stalling.

Multiple children's online safety bills are pending: KOSA, COPPA 2.0, the KIDS Act. A House committee held a markup on March 5, but bipartisan agreement reportedly broke down. At the state level, over 45 states have children's online safety legislation in various stages. Some are already in effect — Florida now requires age verification for under-14s, and Utah's app store accountability law kicks in May 2026.

📱
Platform changes exist but have real gaps.

Instagram's "Teen Accounts" add meaningful defaults — private by default, PG-13 content filtering, 60-minute daily reminders for under-16s. But parents still can't read DMs, can't control who their teen follows, can't fully disable algorithmic recommendations, and can't block beauty filters entirely. And the age-gating still relies on kids being honest about their birthday — the same system the trial just exposed as having 4 million known underage users.

The school district cases are worth watching separately. When individual families sue, they’re seeking compensation for personal harm. When school districts sue, they’re claiming economic costs — counseling staff, special education resources, safety infrastructure they had to build because of the mental health crisis these platforms contributed to. Those damages could be dramatically larger.

Both Meta and Google have stated they will appeal the LA verdict. Meta’s statement: the issue is “profoundly complex” and can’t be tied to a single app. Google’s defense: YouTube is “a responsibly built streaming platform, not a social media site.” The jury wasn’t persuaded by either argument, but the appeals process could take years.

BY THE WAY

TikTok and Snapchat were originally co-defendants in this case. Both settled with Kaley before the trial began — Snapchat around January 22, TikTok on January 27 (the day jury selection started). Settlement amounts are undisclosed and aren’t admissions of liability. But both companies remain defendants in other pending cases. Settling before a jury hears your internal documents is… a choice.

What to Watch For

Three things will determine whether this verdict becomes a turning point or a footnote.

The New Mexico Phase 2 trial, starting May 4. This is potentially more consequential than any dollar amount. New Mexico is seeking injunctive relief — meaning a court could order Meta to implement real age verification, change its algorithms, and submit to an independent monitor. If that happens, it’s not just a payout. It’s a redesign mandate. Ordered by a judge. With oversight.

The federal bellwether trials this summer. Six school district cases go before juries. The legal theory is the same one that just won in LA, but the scale of claimed damages is entirely different. School districts are talking about systemic costs, not individual harm. A string of plaintiff wins at the federal level would create enormous pressure to settle the remaining 2,400+ cases — the same pattern that eventually produced the $200+ billion tobacco settlement.

Whether Congress manages to pass anything. The KIDS Act, which folds KOSA provisions into a broader package with age verification and chatbot restrictions, is the most comprehensive bill on the table. But bipartisan cooperation fractured at the March 5 markup. State legislatures are moving faster — California, Florida, Utah, Minnesota, and New York all have laws in effect or taking effect soon. The question is whether a patchwork of state laws creates enough pressure for a federal standard, or whether Congress continues to hold hearings, express concern, and pass nothing.

The financial penalties so far aren’t hurting these companies. The precedent and potential design mandates are the real threat.

The TecKno Take

A jury said what a lot of parents already knew: these apps aren’t just distracting. They’re designed that way. And for the first time, a court treated that design like a defect — not a feature.

The appeals will take years. Congress is still “working on it.”

But “we didn’t know” doesn’t really work anymore. Not after today.