Hundreds of families sue 'harmful' Big Tech firms

November 20, 2023
Taylor Little described the impact of viewing material related to body image and eating disorders

WASHINGTON — Hundreds of families are suing some of the world's biggest technology companies, claiming the firms knowingly expose children to harmful products.

One plaintiff explains why they are taking on the might of Silicon Valley.

"I literally was trapped by addiction at age 12. And I did not get my life back for all of my teenage years."

Taylor Little's addiction was social media, an addiction that led to suicide attempts and years of depression.

Taylor, who's now 21 and uses the pronoun "they", describes the tech companies as "big, bad monsters".

The companies, Taylor believes, knowingly put into children's hands highly addictive and damaging products.

Which is why Taylor and hundreds of other American families are suing four of the biggest tech companies in the world.

The lawsuit against Meta — the owner of Facebook and Instagram — plus TikTok, Google and Snap Inc, the owner of Snapchat, is one of the largest ever mounted in Silicon Valley.

The plaintiffs include ordinary families and school districts from across the US.

They claim that the platforms are harmful by design.

Lawyers for the families believe the case of 14-year-old British schoolgirl Molly Russell is an important example of the potential harms faced by teenagers.

Last year they monitored the inquest into her death via video link from Washington, looking for any evidence which they could use in the US lawsuit.

Molly's name is mentioned a dozen times in the master complaint submitted to the court in California.

Last week, the families in the case received a powerful boost when a federal judge ruled that the companies could not use the First Amendment of the US Constitution, which protects freedom of speech, to block the action.

Judge Gonzalez Rogers also ruled that Section 230 of the Communications Decency Act, which states that platforms are not publishers, did not give the companies blanket protection.

The judge ruled that the issues the families raise, such as a lack of "robust" age verification and poor parental controls, are not matters of freedom of expression.

Lawyers for the families called it a "significant victory".

The companies say the claims are not true and they intend to defend themselves robustly.

Taylor, who lives in Colorado, tells us that before getting their first smartphone, they were sporty and outgoing, taking part in dance and theatre.

"If I had my phone taken away, it felt like having withdrawals. It was unbearable. Literally, when I say it was addictive, I don't mean it was habit-forming. I mean, my body and mind craved that."

Taylor remembers the very first social media notification they clicked on.

It was someone's personal self-harm page, showing graphic images of wounds and cuts.

"As an 11-year-old, I clicked on a page and was shown that with no warning. No, I didn't look for it. I didn't ask for it. I can still see it. I'm 21 years old, I can still see it."

Taylor also struggled with content around body image and eating disorders.

"That was — is — like a cult. It felt like a cult. You're constantly bombarded with photographs of a body that you can't have without dying.

"You can't escape that."

Lawyers for Taylor and the other plaintiffs have taken a novel approach to the litigation, focusing on the design of the platforms and not individual posts, comments or images.

They claim the apps contain design features which cause addiction and harm.

Meta released a statement saying: "Our thoughts are with the families represented in these complaints.

"We want to reassure every parent that we have their interests at heart in the work we are doing to provide teens with safe, supportive experiences online."

TikTok declined to comment.

Google told us: "The allegations in these complaints are simply not true. Protecting kids across our platforms has always been core to our work."

And Snapchat said its platform "was designed to remove the pressure to be perfect. We vet all content before it can reach a large audience to prevent the spread of anything that could be harmful."

Taylor knows all about the story of Molly Russell, from north-west London, who took her own life after being exposed to a stream of negative, depressing content on Instagram.

An inquest into her death found she died "while suffering from depression and the negative effects of online content".

Taylor says their stories are very similar.

"I feel incredibly lucky to have survived. And my heart breaks in ways I can't put into words for people like Molly.

"I'm happy. I really love my life. I'm in a place I didn't think I would live to."

It makes Taylor determined to see the legal action through.

"They know we're dying. They don't care. They make money off us dying.

"All hope I have for better social media is entirely dependent on us winning and forcing them to make it — because they will never, ever, ever choose to." — BBC
