Parents suing TikTok over children’s deaths say it ‘has no compassion’

Laura Kuenssberg, Presenter, Sunday with Laura Kuenssberg

From left to right: Parents Hollie Dance, Lisa Kenevan, Liam Walsh and Ellen Roome

The four British families suing TikTok for the alleged wrongful deaths of their children have accused the tech giant of having “no compassion”.

In an exclusive group interview for BBC One’s Sunday with Laura Kuenssberg, the parents said they were taking the company to court to try to find out the truth about what happened to their children and seek accountability.

The parents believe their children died after taking part in a viral trend that circulated on the video-sharing platform in 2022.

TikTok says it prohibits dangerous content and challenges. It has blocked searches for videos and hashtags related to the particular challenge the children’s parents say is linked to their deaths.

The lawsuit, filed in the US on Thursday, claims that Isaac Kenevan, 13, Archie Battersbee, 12, Julian “Jools” Sweeney, 14, and Maia Walsh, 13, died while attempting the so-called “blackout challenge”.

The complaint was filed in the Superior Court of the State of Delaware by the US-based Social Media Victims Law Center on behalf of Archie’s mother Hollie Dance, Isaac’s mum Lisa Kenevan, Jools’ mother Ellen Roome and Maia’s dad Liam Walsh.

In the interview, Ms Kenevan accused TikTok of breaching “their own rules”. In the lawsuit, the families claim that the platform breached the rules in a number of ways, including around not showing or promoting dangerous content that could cause significant physical harm.

Ms Dance said that the bereaved families were brushed off with “the same corporate statement” showing “no compassion at all – there’s no meaning behind that statement for them”.

Ms Roome has been campaigning for legislation that would allow parents to access the social media accounts of their children if they die. She has been trying to obtain data from TikTok that she thinks could provide clarity around her son's death.

Ms Kenevan said they were going to court to pursue “accountability – they need to look not just at us, but parents around the world, not just in England, it’s the US and everywhere”.

“We want TikTok to be forthcoming, to help us – why hold back on giving us the data?” Ms Kenevan continued. “How can they sleep at night?”

‘No faith’ in government efforts

Mr Walsh said he had “no faith” that the UK government’s efforts to protect children online would be effective.

The Online Safety Act is coming into force this spring. But Mr Walsh said: "I don't have faith, and I'm about to find out if I'm right or wrong, because I don't think it's baring its teeth enough. I would be forgiven for having no faith – two and a half years down the road and having no answers."

Ms Roome said she was grateful for the support she had received from the other bereaved parents. "You do have some days particularly bad – when it's very difficult to function," she said.

The families’ lawsuit against TikTok and its parent company ByteDance claims the deaths were “the foreseeable result of ByteDance’s engineered addiction-by-design and programming decisions”, which it says were “aimed at pushing children into maximizing their engagement with TikTok by any means necessary”.

And the lawsuit accuses ByteDance of having “created harmful dependencies in each child” through its design and “flooded them with a seemingly endless stream of harms”.

“These were not harms the children searched for or wanted to see when their use of TikTok began,” it claims.

Searches for videos or hashtags related to the challenge on TikTok are blocked, a policy the company says has been in place since 2020.

TikTok says it prohibits dangerous content or challenges on the platform, and directs those who search for hashtags or videos to its Safety Centre. The company told the BBC it proactively finds and removes 99% of content that breaks its rules before it is reported.

TikTok says it has met with Ellen Roome to discuss her case. It says the law requires it to delete personal data unless there is a valid request from law enforcement before the data is deleted.
