The TikTok Challenge: Section 230 of the Communications Decency Act is Getting X’d

Social media is about 27 years old. Since its creation, it has developed an almost demonic ability to addict the young, luring them into harm through cyberbullying and enticing them to hurt themselves via tantalizing and dangerous dares, provocative challenges, and other forms of psychological manipulation. If any of us had acted this way, we would be aptly labeled psychopathic – and likely prosecuted. Until recently, however, legal challenges against the Satans of Cyberspace have mostly been stymied. Last week, the Third Circuit opened another chink in their armor, Section 230 of the Communications Decency Act.

Section 230 of the Communications Decency Act is probably anything but decent. The clause insulates social media (SM) platforms from liability when hosting third-party content – even if dangerous or offensive – under a neutral “publisher” guise. But a surfeit of teenage and pre-teenage deaths resulting from the embellishments and come-ons conjured by SM hosts is chipping away at this legislative armor. Maybe, though, not soon enough.

Tawainna Anderson brought the latest case on behalf of her daughter, Nylah. Nylah had tuned into a video called the “Blackout Challenge” on her uniquely curated TikTok “For You” page (FYP). This “child-friendly” entertainment encouraged viewers to record themselves engaging in acts of self-asphyxiation. Entranced, Nylah partook. And, yes, she unintentionally hanged herself. Nylah was ten.

Several cases pending against social media companies claim not that the content per se caused the harm, but that the platforms’ innovative algorithms, addictive design, grooming (by encouraging likes and reposts), and other come-ons subject the host to liability. The thrust of these cases is the creation of an addiction, but the crux of any successful legal action against SM is the hyping conduct of the host itself. To leapfrog the legislative barricades, the plaintiff must demonstrate the many tweaks platforms introduce that mediate, facilitate, or encourage the user’s interaction with content.

In the Anderson suit, addiction is not the precipitating cause of her daughter’s death. Instead, Ms. Anderson brings claims for negligence and product liability, though the activities involved are not direct: TikTok is being sued for creating a vehicle that served up a video inducing her child to perform a fatal act. Indeed, the means of Nylah’s death are horrendous. As described by the Court:

“Some videos that may appear on users’ FYPs are known as ‘challenges,’ which urge users to post videos of themselves replicating the conduct depicted in the videos. The ‘Blackout Challenge . . . encourages users to choke themselves with belts, purse strings, or anything similar until passing out.’ . . . TikTok’s FYP algorithm recommended a Blackout Challenge video to Nylah, and after watching it, Nylah attempted to replicate what she saw and died of asphyxiation.”

The videos “encourage[d]” viewers to record themselves … and post their videos for other TikTok users to watch. Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her. But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her “For You Page” after it “determined that the Blackout Challenge was ‘tailored’ and ‘likely to be of interest’ to Nylah.” 

TikTok's defense reads, 

“§ 230 of the Communications Decency Act . . . to permit casual indifference to the death of a ten-year-old girl.” The Court strongly objects, noting that this position has become popular among a host of purveyors of pornography, self-mutilation, and exploitation, smuggling “constitutional conceptions of a ‘free trade in ideas’ into a digital ‘cauldron of illicit loves’ that leap and boil with no oversight, no accountability, no remedy.”

The Court laments the ever-increasing use of this supposed defense.

Notwithstanding, the same approach is being used to defend SM hosts in the addiction cases brought by various states and school districts. To escape liability, those platforms rely on how the message was presented (recommended) to the user as an “expressive” product, claiming that the message “communicates to users . . . [a] curated stream of videos [which] will be interesting to them” and is therefore protected not only under Section 230 but also under the First Amendment. Those cases are still pending, but the track record of victories isn’t great.

Even the Anderson Court notes various loopholes: “traditional functions of platforms including recommendations and notifications, arranging or displaying content – even via algorithms is not enough to hold [a defendant platform] responsible as the developer or creator of that content.” The Court goes further: “Had Nylah viewed a Blackout Challenge video through TikTok’s search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.”

However, a few cases eviscerating Section 230’s immunity do exist, uniformly presenting horrific harms. The first crack in SM immunity came in 2017.

  • In Lemmon v. Snap, three teenagers died as they filmed their “fiery” high-speed “car chase” via Snapchat’s Speed Filter, speeding down a Wisconsin road at 100 mph. Their parents sued for negligent design, claiming the interplay between Snapchat’s Speed Filter and its reward system of “trophies, streaks, and social recognitions” represented a design defect that encouraged dangerous high-speed driving, thereby causing their deaths. They won – but it took until 2021 for Snapchat to remove its Speed Filter.
  • In 2020, 19-year-old Devin Norring bought a lethal dose of fentanyl on Snapchat, believing it was Percocet to alleviate his pain.
  • In 2023, 14-year-old Adriana Kuch committed suicide after intense cyberbullying.

However, the instigation here is more insidious than creating a climate of chronic addiction – it is the creation of a particular and unique algorithm that enticed Nylah to watch and copy a death sentence. Brushing away TikTok’s attempts to situate its activity within the broad protection of Section 230, and relying on Supreme Court dicta, the Court held that a claim against it could stand, finding that:

“TikTok’s algorithm [1] is not based solely on a user’s online inputs. Rather, the algorithm curates and recommends a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata.”  

The platforms’ First Amendment rights are swiftly disposed of: as the Court notes, First Amendment rights are indubitably limited where imminent harm is at hand.

The issue before the Court is not whether TikTok will be held liable for Nylah’s death, only whether the case can proceed – and the Court ruled that it can, at least in part. I would venture to guess that the product liability claim will fail. [2] However, the negligence claim stands a good chance of prevailing – if, that is, the Supreme Court doesn’t first intercede and strike down the Third Circuit’s decision, as some strong supporters of SM protection urge. Supreme Court intervention seems likely unless Congress intervenes first.

Should the case be allowed to stand, TikTok’s awareness of the absolute dangers of its challenge should haunt it, as this is not the first devilish death from the Blackout Challenge. In 2019, 12-year-old Matthew Minor also suffocated while attempting the TikTok “blackout challenge,” in which participants compete to see who can cut off their brain’s oxygen first. Nylah’s mother claims that TikTok was aware of these and similar deaths and still allowed users to post videos of themselves participating in the challenge.

Indeed, as we know from the horrific Paqui Challenge, this type of come-on proves especially alluring to children. Its adoption by social media, with its overriding ethos of cultivating addiction, is not surprising. Once courts can circumvent the liability protections afforded to SM by Section 230, these cases will turn on conventional negligence theories, with the side benefit of creating deterrence through large punitive damage verdicts. We have a way to go before this happens, but the courts seem to be on the right trajectory. As the Third Circuit summarizes:

“Section 230 only immunizes publishers or speakers for the content of the information from other providers that they make public. The CDA says nothing about immunizing publishers or speakers for their own conduct.”

 

[1] Defined by the Court as “a set of digital instructions that perform a task.”         

[2] Product liability claims attach only to products, not to services. While it can certainly be argued that an absolutely dangerous commodity should be actionable notwithstanding its social utility, TikTok’s actions are more akin to providing a service than a product.
