Social media platforms must face child safety lawsuits, US judge rules

A US federal court has rejected a motion by social media giants to dismiss dozens of lawsuits accusing them of running platforms that are “addictive” to kids and of spreading child sex abuse material.

Meta, ByteDance, Alphabet, and Snap are facing dozens of lawsuits that accuse their platforms of harming children.

US District Judge Yvonne Gonzalez Rogers has rejected their motion to dismiss, ruling that Meta, ByteDance, Alphabet (Google’s parent company), and Snap (Snapchat’s parent company) must face litigation alleging their social platforms have adverse mental health effects on children, The Verge reports.

“Because children still developing impulse control are uniquely susceptible to harms arising out of compulsive use of social media platforms, defendants have ‘created a youth mental health crisis’ through the defective design of their platforms,” according to the ruling.

“Further, these platforms facilitate and contribute to the sexual exploitation and sextortion of children, as well as the ongoing production and spread of child sex abuse materials (CSAM) online. To that end, defendants know that children use their products, both from public and internal data,” it read.

Millions of children use defendants’ platforms “compulsively”.

Many report that they feel they are addicted to the platforms, wish they used them less, and feel harmed by them.

School districts across the US have filed suit against Meta, ByteDance, Alphabet, and Snap, alleging the companies cause physical and emotional harm to children.

Meanwhile, 42 states sued Meta last month over claims Facebook and Instagram “profoundly altered the psychological and social realities of a generation of young Americans”. The court order addresses the individual suits and “over 140 actions” taken against the companies.

A Google spokesperson said the allegations in these complaints are “simply not true”, adding that the company has “built age-appropriate experiences for kids and families on YouTube” and “provide[s] parents with robust controls”.

Snap, Meta, and ByteDance did not immediately comment on the ruling.