TikTok sued by parents whose children killed themselves while participating in a ‘blackout challenge’
from the about-as-useful-as-thoughts-and-prayers dept
A few months ago, the parents of a 10-year-old child who died of asphyxiation while allegedly ‘participating’ in a ‘blackout challenge’ sued TikTok, alleging their child’s death was directly linked to the social media platform’s moderation efforts (or lack thereof) and its content recommendation algorithms. The lawsuit, filed in federal court in Pennsylvania, claimed the death had everything to do with TikTok’s decision to prioritize profits over user safety. And it tried to dodge the inevitable Section 230 issue by claiming the case had nothing to do with the third-party content the child viewed and everything to do with TikTok’s handling of that content.
A similar complaint has just been filed by the families of two more children who died under comparable circumstances.
Eight-year-old Lalani Erika Walton wanted to become “famous on TikTok”. Instead, she ended up dead.
Hers is one of two such tragedies that prompted a pair of wrongful death lawsuits filed Friday in Los Angeles County Superior Court against the social media giant. The company’s app served both Lalani and Arriani Jaileen Arroyo, 9, videos associated with a viral trend called the Blackout Challenge, in which participants attempt to choke themselves unconscious, according to the suits; both young girls died after attempting to participate.
Unlike the May lawsuit, this one [PDF] was filed in a California county court. But its allegations are much the same as those being raised in federal court on the other side of the country. The causes of action include design defects, negligence, failure to warn, and, specific to this case, violations of California consumer protection laws.
What’s not discussed at all is CDA Section 230, something that might be a little easier to avoid if the plaintiffs can keep the lawsuit in county court and the judge focuses on the alleged violations of consumer protection law. But that discussion is almost inevitable.
While the plaintiffs in both cases focus on faulty design, negligence, and other elements allegedly attributable to TikTok’s moderation efforts and content recommendation engine, the inescapable fact is that the acts giving rise to the lawsuits were prompted by content posted by other TikTok users. This is a third-party content issue. TikTok’s algorithms may have played a role in surfacing harmful content, but those algorithms are nothing without a steady stream of user-generated content and input from the TikTok users consuming that content.
Suing dozens of TikTok users for posting harmful content is impractical. Not only that, it’s a losing strategy: the tragedies behind these lawsuits were the result of individual users’ own actions, hard as that is to accept. Suing TikTok makes only a little more sense than trying to hold the TikTok users who created the harmful content responsible for self-harm prompted by their content. But making a little more sense doesn’t put plaintiffs on the road to victory in court.
TikTok may have a host of content moderation issues. It may be cutting moderation corners to maximize profitability. It may have discovered – like so many other platforms – that exponential growth creates content moderation problems that are impossible to solve. And it may very well have promoted content that proved harmful to some users – not in hopes that they would harm themselves, but in an effort to increase engagement and retain users. But none of this adds up to legal culpability.
To be clear, none of what’s said above is an attempt to blame the victims or their survivors for these tragic deaths. It’s easy to say the parents should have been more involved, especially considering the ages of the victims here. But children can often be impenetrable black boxes. Sometimes the only way to find out what should have been done is to examine the evidence after the tragedy has already happened. And while that may provide some guidance for the future, it does nothing to reverse the tragedy or ease the burden on families who have lost young children.
Unfortunately, neither will these lawsuits. And it seems unseemly, at best, for law firms to give grieving parents false hope that the justice system can deliver some sort of payout, much less closure, by suing social media platforms over the actions of their users.
Filed Under: kids, moral panic, section 230, tiktok challenges