Is this social media’s ‘tobacco’ moment?


Could what happened to tobacco companies come back to haunt social media giants too?

In the 1990s, tobacco firms faced a series of trials that ultimately held them accountable for selling a product they knew was harmful – they became known as the “tobacco trials”.

Three decades on, landmark trials are taking place in the US, where lawyers representing people allegedly harmed by social media are trying a new line of attack against tech companies.

These trials focus on how platforms are designed – not on what is posted on them.


That means the tech companies can’t rely on Section 230 of the Communications Decency Act, the law that shields them from liability over user-generated content – a defence that has sunk many previous cases.

There are currently more than 2,000 active cases looking at social media harm in the US. Here are three of those cases that are particularly significant.

Image: Bereaved parents created a memorial screen honouring their children near the LA courthouse

LA social media trial – featuring Mark Zuckerberg

Thousands of people have filed lawsuits accusing TikTok, Meta, Snapchat, and YouTube of designing addictive, harmful platforms.

Now, lawyers have bundled those cases together and picked what they believe are the strongest to take to court.


The first of these, currently under way, focuses on a 20-year-old Californian known as KGM.

She says addictive features on social media like infinite scrolling and photo filters gave her anxiety, depression, and body image issues.

Snapchat and TikTok settled out of court before the proceedings started, but YouTube and Meta are fighting back.

In his testimony on Wednesday, Meta CEO Zuckerberg insisted the company’s philosophy has always been “to try to build useful services that people connect to” and claimed it did not have targeted internal goals for how long people should use the apps.


Zuckerberg added: “If something is valuable, people will do it more because it’s useful to them.”

He also said that Meta had worked to remove underage accounts, and any implication to the contrary was “not true”.

He did, however, admit: “I don’t think we identified every single person who tried to get around restrictions.”

Zuckerberg later apologised to the families present at the trial: “I’m sorry for everything you have all been through.”

Instagram boss Adam Mosseri said in his testimony last week that he does not believe people can get clinically addicted to social media. Instead, he said there can be “problematic use”.

Image: Bereaved parents protest the deaths of their children outside the courthouse in LA. Pic: AP

If KGM’s lawyers can convince the jury that those social media features damaged her mental health, the case could lay the groundwork for how much compensation social media victims should receive.

But it could also see the companies forced to make changes to their platforms to stop them from being addictive or damaging.

Meta told Sky News: “The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles.

“The evidence will show she faced many significant, difficult challenges well before she ever used social media.”

British parents suing over TikTok’s algorithm

The second case to watch is being brought by five British parents.

They’re suing TikTok in Delaware over the deaths of their children, who they say all died attempting the dangerous ‘blackout’ challenge they saw on the platform.


Speaking to Sky News last month, the parents all described cheerful, happy children who had shown no mental health issues before their deaths.

“How the hell do you, as a parent, get your head around that?” asked Lisa Kenevan, whose son, Isaac, died when he was 13.

However, these parents aren’t suing over the videos themselves, but over what they describe as a harmful algorithm that “flooded them with a seemingly endless stream of harms”.

TikTok is contesting these claims, and told us: “Our deepest sympathies remain with these families.

“We strictly prohibit content that promotes or encourages dangerous behaviour.

“Using robust detection systems and dedicated enforcement teams to proactively identify and remove this content, we remove 99% that’s found to break these rules before it is reported to us.”

If the parents are successful, TikTok could be forced to change how its algorithm works – especially for young users – to prevent harmful content being promoted.

The case is still in the early stages, although we’re expecting an update before mid-April.

Scottish family taking action over sextortion

The third case involves the family of 16-year-old Murray Dowey, from Scotland.

Image: Murray Dowey. Pic: Family handout

They’re suing Meta after he took his own life while being blackmailed on Instagram by sextortionists. They’re joined by a US mother, whose son, Levi, died in similar circumstances.

It’s the first UK case where a social media company is facing legal action over sextortion on its platform. In the past, cases have focused on the perpetrators.

If it’s successful, Meta may have to strengthen protections for young users. While under-16s have restricted accounts by default, Murray’s family say older teenagers are still vulnerable.

The lawsuit also challenges Meta’s data collection practices, arguing Instagram’s recommendation systems helped sextortionists target him.


Meta is contesting these claims.

A spokesperson told Sky News: “Since 2021, we’ve placed teens under 16 into private accounts when they sign up for Instagram, which means they have to approve any new followers.

“We work to prevent accounts showing suspicious behaviour from following teens and avoid recommending teens to them.

“We also take other precautionary steps, like blurring potentially sensitive images sent in DMs and reminding teens of the risks of sharing them, and letting people know when they’re chatting to someone who may be in a different country.”

Social media companies are being bombarded with litigation – but if just one of these cases succeeds, we could see a huge difference in the way they approach online safety.
