A jury in California has just returned a verdict that could change the internet as we know it. In a landmark decision reached on 25 March 2026, tech giants Meta and Google were found liable for deliberately designing their platforms to be addictive, directly contributing to the mental health crisis of a young user. This wasn't just a slap on the wrist or a regulatory fine; it was a $3 million compensatory payout that signals a massive shift in how we hold Silicon Valley accountable for the algorithms they unleash on our kids.
The case centred on a 20-year-old woman identified as Kaley (KGM) from Chico, California. Her story is one that will sound hauntingly familiar to millions of parents across the UK and the world. She started using YouTube at just six years old. By the time she was 11, she was on Instagram. Over the years, her legal team argued, these platforms didn't just host content: they actively engineered a cycle of addiction that led to severe depression and anxiety. For the first time, a jury has looked at the code, the internal memos, and the psychological impact, and they’ve decided that the "black box" algorithms are the responsibility of the people who built them.
The blueprint of a digital trap
The core of the case rested on the idea that these platforms aren't passive tools. Kaley’s legal team managed to get their hands on internal documents that showed exactly what goes on behind the scenes at Menlo Park and Mountain View. One particularly damning internal Meta memo revealed that 11-year-olds were four times more likely to continue using Instagram compared to competing apps, even though the platform officially requires users to be at least 13. The jury heard how these companies didn't just ignore young users; they actively studied how to keep them "hooked" through features designed to trigger dopamine hits.
It is a grim reality that we've all suspected but rarely seen proven in a court of law. The argument wasn't just about "screen time." It was about the deliberate engineering of features like infinite scroll, push notifications, and AI-driven recommendations that are specifically calibrated to exploit human psychology. For a child whose brain is still developing, these features act like a digital slot machine. The jury found that both Meta and Google failed in their duty of care, specifically citing negligence and a "failure to warn" users about the addictive nature of their products. By the time a child realises they are addicted, the damage is already done.
Meta was hit the hardest in this initial phase, ordered to pay 70% of the $3 million in damages, while Google's YouTube was found responsible for the remaining 30%. But the money isn't the real story here. The real story is the precedent. For years, tech companies have hidden behind Section 230 in the US and similar "platform, not publisher" protections elsewhere, arguing they aren't responsible for how people use their sites. This verdict cuts right through that defence, focusing on the product design itself rather than the content.
A defence of home life and data gaps
Of course, the tech titans didn't go down without a fight. Their legal teams leaned heavily on the complexity of mental health. They argued that social media is just one small part of a person’s life and that it is impossible to pin depression or anxiety on a single app. In Kaley’s case, they pointed to a history of emotional and physical abuse at home, suggesting that her struggles were the result of her environment rather than her smartphone. They also noted that her own therapist had never explicitly documented social media as a primary cause of her mental health issues during her sessions.
The defence strategy was clear: muddy the waters. They argued there is no definitive scientific proof that social media causes mental health problems, only correlations. It's a line we've heard before, reminiscent of the Big Tobacco trials of the 20th century. But this time, the jury wasn't buying it. While they acknowledged the complexities of Kaley's personal life, they ultimately decided that the addictive design of the platforms was a substantial factor in her decline. The negligence wasn't in the existence of the apps, but in the lack of transparency about how they were built to manipulate the user's attention.
The timing of this verdict adds another layer of pressure to the industry. Just one day before this decision, a separate jury in New Mexico ordered Meta to pay a staggering $375 million for failing to protect young users from child predators. When you combine these two outcomes, it paints a picture of a legal system that is finally catching up to the digital Wild West. The companies have already stated they plan to appeal, but the momentum is clearly swinging towards the victims and their families. This isn't just about one girl in California anymore; it’s about a global industry that has operated without real consequences for decades.
The ripple effect across the globe
This trial was what lawyers call a "bellwether" case. It’s the first of its kind to go to a jury, and it serves as a test for approximately 2,000 other pending lawsuits brought by parents and school districts across the globe. If this verdict stands on appeal, it opens the floodgates. We are looking at a future where every "like" button, every auto-play video, and every algorithmic feed could be subject to product liability laws. It forces companies to decide: do they keep the addictive features and risk billions in damages, or do they fundamentally redesign the internet to be safer?
The next phase of this specific trial will be even more intense. The jury is set to consider punitive damages: money meant not just to compensate the victim, but to punish the companies for their conduct. This is where the numbers could get truly eye-watering. If the jury decides that Meta and Google acted with "malice or gross negligence," the $3 million payout could look like pocket change compared to what’s coming. It’s a wake-up call for shareholders and executives who have prioritised "engagement metrics" over user safety for far too long.
For those of us watching from the UK, the implications are just as significant. As we move forward with our own Online Safety Act and push for tighter regulations on how tech companies operate, this California verdict provides a roadmap. It shows that the "addiction by design" argument is legally sound and that juries are sympathetic to the plight of the "scroll-addicted" generation. The era of tech giants doing whatever they want and asking for forgiveness later is coming to a very expensive end. The real story isn't the $3 million; it's the fact that the curtain has finally been pulled back on the mechanics of digital addiction, and the world doesn't like what it sees.