A senior executive at TikTok has revealed that data being sought by a group of bereaved parents, who believe their children died attempting a dangerous online challenge, may have been erased due to data protection regulations.
The parents are suing TikTok and its parent company, ByteDance, following the deaths of Isaac Kenevan, Archie Battersbee, Julian “Jools” Sweeney, and Maia Walsh, all aged between 12 and 14. The lawsuit claims that the children lost their lives attempting the “blackout challenge”, a trend in which participants intentionally deprive themselves of oxygen, sometimes with fatal consequences.
Giles Dennington, senior government relations manager at TikTok, spoke about the issue during an interview on BBC Radio 5 Live. He stated: “We always want to do everything we can to give anyone answers on these kinds of issues, but there are some things which we simply don’t have.”
His remarks came on Safer Internet Day, a global initiative aimed at increasing awareness of online harms. Mr Dennington acknowledged that TikTok had been in contact with some of the affected families and recognised that they “have been through something unfathomably tragic.”
Families accuse TikTok of lacking compassion
The grieving parents have accused TikTok of withholding critical data that could shed light on their children’s final online interactions. Speaking on the BBC’s Sunday with Laura Kuenssberg programme, they criticised the social media giant for what they see as a lack of transparency and compassion.
Ellen Roome, mother of 14-year-old Jools, has been campaigning for legislation that would grant parents access to their deceased child’s social media accounts. She believes TikTok has data that could help explain her son’s death.
Lisa Kenevan, mother of 13-year-old Isaac, questioned the company’s reluctance to provide information. “We want TikTok to be forthcoming, to help us – why hold back on giving us the data?” she said. “How can they sleep at night?”
Legal barriers to data access
Responding to concerns over access to the data, Mr Dennington explained that TikTok is subject to strict data protection laws. “This is really complicated stuff because it relates to the legal requirements around when we remove data. Under data protection laws, we are required to delete certain data quite quickly. That impacts what we can do.”
He added that while such regulations are essential for user privacy, they can also limit what information is available in cases like this. “Everyone expects that when we are required by law to delete some data, we will have deleted it. So this is a more complicated situation than us just having something we’re not giving access to.”
Mr Dennington stressed that the legal case must proceed properly to ensure that all available answers are provided to the grieving families.
Allegations against TikTok
The lawsuit, filed in the United States by the Social Media Victims Law Centre on behalf of the parents, alleges that TikTok violated its own policies by allowing harmful content to circulate on its platform. The parents claim their children were exposed to the “blackout challenge” through TikTok’s algorithm in 2022, despite the platform having rules that prohibit content that promotes dangerous behaviour.
While Mr Dennington declined to comment on the specifics of the lawsuit, he expressed empathy for the families. “I have young kids myself, and I can only imagine how much they want to get answers and understand what happened. We’ve had conversations with some of those parents already to try and help them in that.”
TikTok denies challenge was trending
Mr Dennington also rejected claims that the “blackout challenge” gained traction on TikTok, stating that the trend predated the platform. “We have never found any evidence that the blackout challenge has been trending on the platform. Since 2020, we have completely banned searches for the words ‘blackout challenge’ or any variants of it to ensure that no one comes across that kind of content.”
He reiterated that TikTok does not want harmful content on its platform and that its users do not seek such content either.
Safety measures in place
Mr Dennington emphasised that TikTok has invested more than $2 billion (£1.6 billion) in content moderation this year and employs tens of thousands of human moderators worldwide. The company has also introduced an online safety hub designed to educate users and foster open conversations between parents and their teenagers about safe social media use.
“This is a really, really tragic situation,” Mr Dennington concluded. “But we are constantly working to do everything we can to ensure that people remain safe on TikTok.”
The case continues, with the bereaved families determined to uncover the truth and push for legislative changes that grant parents greater access to their children’s digital footprints in the event of their untimely deaths.