
Left: Nylah Anderson (Anderson v. TikTok). Right: Tawainna Anderson (WPVI/YouTube).
The mother of a 10-year-old Pennsylvania girl who died after participating in a TikTok challenge may now pursue her lawsuit against the social media company.
As Law&Crime previously reported, Nylah Anderson died in December 2021 after apparently attempting the viral “Blackout Challenge” and asphyxiating herself. As described in a lawsuit against TikTok and parent company ByteDance, the “Blackout Challenge … encourages children to choke themselves until passing out.”
Nylah’s mother, Tawainna Anderson, found her daughter hanging from a purse strap in a bedroom. The girl lingered for days but ultimately died while receiving treatment in a pediatric intensive care unit. Anderson sued the social media company, alleging that the “app and algorithm are intentionally designed to maximize user engagement and dependence and powerfully encourage children to engage in a repetitive and dopamine-driven feedback loop by watching, sharing, and attempting viral challenges and other videos.”
“TikTok is programming children for the sake of corporate profits and promoting addiction,” the complaint said.
In October 2022, a federal judge dismissed the case, concluding that Anderson's claims were barred by Section 230 of the Communications Decency Act of 1996, which shields content distributors from liability for third-party communications.
“Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it,” U.S. District Judge Paul S. Diamond, a George W. Bush appointee, wrote at the time. “In thus promoting the work of others, Defendants published that work — exactly the activity Section 230 shields from liability.”
On Tuesday, a federal appeals court reversed that decision, finding that — in light of a recent Supreme Court decision acknowledging that social media networks have a First Amendment right to engage in “expressive activity” — TikTok may ultimately face liability for Nylah Anderson’s death.
“Section 230 immunizes only information ‘provided by another[,]’ and here, because the information that forms the basis of Anderson’s lawsuit — i.e., TikTok’s recommendations via its FYP algorithm — is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims,” U.S. Circuit Judge Patty Shwartz, a Barack Obama appointee, wrote for the three-judge panel.
The ruling relied heavily on the Supreme Court’s unanimous July decision in a pair of cases over efforts by Texas and Florida to prevent what state lawmakers called “censorship” of conservative perspectives on social media. All nine justices agreed to vacate the lower court rulings in those cases and send them back for further First Amendment analysis.
Applying that framework on Tuesday, the appeals court concluded that Anderson’s case should continue.
“In Moody v. NetChoice, LLC, the Court considered whether state laws that ‘restrict the ability of social media platforms to control whether and how third-party posts are presented to other users’ run afoul of the First Amendment,” Shwartz wrote (citations omitted). “The Court held that a platform’s algorithm that reflects ‘editorial judgments’ about ‘compiling the third-party speech it wants in the way it wants’ is the platform’s own ‘expressive product’ and is therefore protected by the First Amendment. Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms, it follows that doing so amounts to first-party speech under § 230, too.”
The judges drew a distinction between content promoted by TikTok’s own algorithm and third-party content that Nylah Anderson might have sought out herself.
“We reach this conclusion specifically because TikTok’s promotion of a Blackout Challenge video on Nylah’s FYP was not contingent upon any specific user input,” the decision says. “Had Nylah viewed a Blackout Challenge video through TikTok’s search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.”
The court didn’t address the distinction between TikTok’s role as either a publisher or distributor for purposes of Section 230 because “in this case, the only distribution at issue is that which occurred via TikTok’s algorithm, which as explained herein, is not immunized by § 230 because the algorithm is TikTok’s own expressive activity.”
The judges acknowledged that any of Anderson’s claims “not premised upon TikTok’s algorithm” may, in fact, be barred by Section 230, but left that determination to the lower court.
U.S. Circuit Judge Paul Brian Matey, a Donald Trump appointee, wrote a partial concurrence and partial dissent, raging against what he said was TikTok’s “casual indifference” to Nylah Anderson’s death, and arguing that “Anderson’s estate may seek relief for TikTok’s knowing distribution and targeted recommendation of videos it knew could be harmful.”
Matey said that Section 230 jurisprudence had become “nearly-limitless” in its protections of content distribution platforms.
“It is a position that has become popular among a host of purveyors of pornography, self-mutilation, and exploitation, one that smuggles constitutional conceptions of a ‘free trade in ideas’ into a digital ‘cauldron of illicit loves’ that leap and boil with no oversight, no accountability, no remedy … But it is not found in the words Congress wrote in § 230,” he wrote.
Section 230, he wrote, “rides in to rescue corporations from virtually any claim loosely related to content posted by a third party, no matter the cause of action and whatever the provider’s actions. The result is a § 230 that immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm.”
In Matey’s view, the time for accountability has come.
“The marketplace of ideas, such as it now is, may reward TikTok’s pursuit of profit above all other values. The company may decide to curate the content it serves up to children to emphasize the lowest virtues, the basest tastes. It may decline to use a common good to advance the common good,” he wrote. “But it cannot claim immunity that Congress did not provide.”