The Conversation You Keep Having
You’ve had this fight. Maybe last night. Maybe this morning. Maybe it’s a slow-rolling cold war that’s been going on since September.
Your kid won’t put the phone down. Dinner’s ready and they don’t move. You say their name three times. You say it a fourth time, louder. They surface like someone being pulled out of a trance — annoyed, disoriented, already drifting back. You feel the frustration rising. You wonder what happened to the sweet six-year-old who used to draw pictures of your dog.
And then comes the thought. The one that settles into the back of your skull like a splinter:
Maybe I failed somewhere. Maybe I didn’t set enough rules. Maybe I gave them the phone too early. Maybe I’m just not good enough at this.
Parenting.
Here’s the thing, though. That guilt you’re carrying around? The one that shows up every time your kid zombies out on TikTok for ninety straight minutes? It’s not entirely yours to carry. Because this isn’t a discipline problem. It’s a product-design problem. And the people who designed these products knew exactly what they were building.
The Take
Stop blaming your kid for being addicted to apps that were deliberately engineered to addict them.
The inability to put the phone down is not a character flaw. It’s the intended outcome of design decisions made by teams of engineers and behavioral psychologists — using the same reward-timing mechanics that make slot machines work — aimed at a brain that is biologically incapable of resisting them the way yours can.
You’re not in a fight with your kid over screen time. You’re in a fight with a billion-dollar algorithm — and your kid is standing in the middle.
The Argument
Let’s start with how the trick works.
When your kid opens TikTok, they don’t see a list of posts from their friends in chronological order. They see an infinite scroll — a feed with no bottom, no page break, no natural stopping point. That design was introduced in 2006 by a UX designer named Aza Raskin, and its explicit purpose was to eliminate what psychologists call “decision points.” Those are the tiny moments — a page break, a loading screen, the end of a chapter — where your brain checks in and asks, Should I keep going? Infinite scroll removes every single one of those moments. The feed just keeps coming, and your brain never gets the cue to evaluate whether it’s time to stop.
Now layer on the reward system. Every scroll delivers something new. Sometimes it’s hilarious. Sometimes it’s boring. Sometimes it’s a video that feels like it was made specifically for you — because, algorithmically, it was. That unpredictability is the key. It’s called variable-ratio reinforcement, and it’s the exact reward-timing mechanism that makes slot machines the most profitable machines in a casino. You don’t know when the next great video is coming, so your brain keeps you scrolling because it might be the next one. The dopamine hit isn’t even in finding the good video. It’s in anticipating it.
Now here’s the part that turns this from a bad habit into an unfair fight.
The prefrontal cortex — the part of the brain responsible for impulse control, self-regulation, and long-term decision-making — does not fully mature until the mid-twenties. In a teenager, that region is still under active construction. The brain’s reward system, meanwhile, is fully online and running hot. Dopamine levels spike during adolescence and don’t stabilize until young adulthood, which means a teen’s reward circuitry is measurably more reactive to exactly the kind of variable-reward stimuli these apps are designed to deliver.
So when your thirteen-year-old can’t stop scrolling, they’re not choosing to ignore you. Their brain is literally less equipped to stop than yours is. The impulse-control system is still being built. The reward system is wide open. And the app is engineered to exploit exactly that gap.
A three-year study at UNC-Chapel Hill tracked 169 middle schoolers and found that teens who checked social media more than fifteen times per day showed measurable changes in brain regions governing emotional response and decision-making. Their brains became increasingly sensitized to social feedback over time. That’s not a one-off snapshot. That’s a longitudinal study published in JAMA Pediatrics, following the same kids for three years, showing the brain physically adapting to the stimulus.
And the platforms knew.
Meta’s own internal research — leaked by whistleblower Frances Haugen in 2021 and expanded through litigation discovery in 2025 — includes over two dozen internal studies. One widely reported finding: Instagram makes body image issues worse for one in three teen girls. Another: seventeen percent of teen girls said their eating disorders got worse after using Instagram. Teens themselves, in Meta’s own research, described their usage patterns the way addicts describe substance use.
Meta saw this data. They published none of it. They implemented few of the safeguards their own researchers recommended. And they kept optimizing the feed.
On March 25, 2026 — two days ago — a California jury looked at this evidence and reached a verdict. In the KGM bellwether trial, the jury found Meta and Google (YouTube) liable for negligence, concluding that both platforms were deliberately designed to be addictive and that executives knew it. The jury awarded $6 million in combined damages. TikTok and Snapchat had already settled before trial.
Six million dollars is pocket change for Meta. But this verdict isn’t about the dollar amount. It’s about what it signals. This was a bellwether trial — a test case selected from over 10,000 individual lawsuits and nearly 800 school district claims pending nationwide. The outcome tells every other jury, every other plaintiff, and every corporate legal team exactly how this evidence lands when real people hear it.
A jury of twelve people heard what these platforms did, and they called it what it is.
The Counterargument
The pushback I hear most often sounds reasonable on the surface: Social media is just the new TV. Every generation panics about the new thing. Our parents said TV would rot our brains, and we turned out fine.
I get it. But the comparison doesn’t hold up.
TV had commercial breaks. Episode endings. A bedtime when the channels went to static. Those were built-in stopping points — moments when the brain could check in and decide to do something else. Social media feeds have none. No bottom. No break. No end.
TV was also the same for everyone. You and your neighbor watched the same show. Social media algorithms learn what makes your specific child unable to stop — what they watch longest, what they replay, what makes them scroll slower — and then they serve more of it. In real time. Continuously. That’s not “the new TV.” That’s a fundamentally different kind of product, and pretending otherwise lets the platforms off the hook.
The other common objection: Just set a time limit. And yes, time limits help. I’ve written extensively about how to configure them. But a time limit doesn’t address the fact that infinite scroll removes stopping cues, autoplay makes watching the default behavior, notifications pull kids back after they’ve stopped, and Snapchat streaks punish them for staying away. You’re setting a speed limit for a car that was built without brakes.
The Close
Fifty-four percent of parents in the U.S. say their child is addicted to screens. Fifty-two percent of kids aged 11 to 18 say they want to use their devices less — but nearly half of them say they don’t know how.
Read those numbers again. The kids want to stop. They can’t. Because the product won’t let them.
So the next time you’re standing in the kitchen, calling your kid’s name for the fourth time, feeling that splinter of guilt pressing deeper — take a breath. This isn’t a failure of parenting. It’s not a failure of your kid. It’s a product working exactly as it was designed to work, on a brain that was never built to resist it.
The fight was never you versus your kid. It was always you and your kid versus the machine. And a jury just said so.
Your kid’s screen time problem isn’t a discipline problem — it’s a design problem. The apps they use were engineered by teams of behavioral psychologists to be as difficult to put down as a slot machine, aimed at a brain that won’t have the hardware to fully resist for another decade. A California jury just held Meta and YouTube accountable for exactly this. Maybe it’s time we stopped holding our kids accountable for it first.