On October 8, 13 states and the District of Columbia sued TikTok, alleging that the social media company's algorithm is designed to "promote excessive, compulsive, and addictive use" in children. While each state's complaint was filed separately in state court, the cases are coordinated around the claim that TikTok's design is deliberately addictive, exploiting kids' dopamine reward circuitry to reinforce their use of the platform.
These claims stem from a public reckoning with the effects of social media on children. New research has also led the Surgeon General to declare a mental health crisis among young people. The lawsuits, arising from the desire to hold platforms accountable for exploiting children's susceptibility to rewarding stimuli during development, present a novel theory of liability based solely on an algorithm's ability to cause addiction, rather than on the adverse mental health outcomes that follow. Holding TikTok liable could force major changes to social media algorithms, reducing mental health harm.
While the crisis among children is driving numerous cases against many social media companies, TikTok's algorithm is a uniquely attractive target for litigation. States allege that features like "beauty filters," "autoplay," and "challenges" are purposeful designs meant to cause addictive engagement in children. With beauty filters, users can dramatically alter their appearance before posting a video, fueling negative self-obsession or self-hatred. When users open the app, TikTok's autoplay feature, which cannot be disabled, automatically begins playing a video, removing even the friction of pressing play and keeping users on the platform longer. Additionally, challenges to complete dangerous or illegal activities proliferate on the platform, including the "blackout" or "choking" challenge, which has been linked to numerous deaths.
The History of Addiction Litigation: Tobacco
The current lawsuits against TikTok resemble earlier lawsuits against the tobacco industry. From the 1950s to the 1980s, cases against cigarette manufacturers failed because plaintiffs could not prove that tobacco was the direct or proximate cause of cancer. Then, in 1992, in Cipollone v. Liggett Group, Inc., plaintiffs were able to prove that cigarette manufacturers knew, but failed to warn smokers, that smoking cigarettes is addictive.
After Cipollone, a wave of cases against cigarette manufacturers was even more successful because leaked internal documents showed the companies knew of tobacco's addictive nature. Industry knowledge of wrongdoing was further corroborated by the U.S. Food and Drug Administration's 1994 investigation, which found that manufacturers were controlling nicotine levels to ensure users became addicted. Documents and testimony about nicotine's addictive qualities gave plaintiffs the evidence to prove both that smoking cigarettes was the direct and proximate cause of addiction and that companies failed to warn consumers of the risks.
The Future of Addiction Litigation: TikTok
Just as plaintiffs ultimately succeeded in their addiction claims against cigarette manufacturers, states may find success in their addiction claims against TikTok. The complaints cite numerous scientific studies, including one comparing social media addiction to substance addictions such as heroin. These studies support the argument that TikTok consciously exploited the neurotransmitter dopamine through the algorithm's design. Numerous citations to the medical literature corroborate the idea that children are especially susceptible to dopamine "rewards" that lead to habit formation, including addictive behaviors. The data show children are particularly vulnerable during the development of the prefrontal cortex, the region of the brain that governs higher reasoning, goal setting, and impulse control.
Just as cigarette manufacturers misrepresented the addictive risks of nicotine, states argue that TikTok misrepresents the addictive risks of its platform. Specifically, they urge that despite documented knowledge that its platform wreaks havoc on mental health, TikTok overstates the effectiveness of features designed to combat addictive use in children.
Just as cigarette manufacturers deliberately maximized the nicotine doses delivered to consumers, knowing the substance elicited dangerous addictive effects, states claim that TikTok knew or should have known that a safer platform design would have mitigated foreseeable addiction. Moreover, they argue that because TikTok's business model increases revenue by maximizing engagement, the platform's addictive nature is deliberate, making its design defective under products liability law.
And just as cigarette manufacturers failed to warn about the addictive nature of nicotine, states argue that TikTok has failed to provide adequate warnings informing users of the risks. They urge that TikTok is dangerous beyond what a child would expect when downloading an app because it encourages unhealthy and addictive engagement.
Will the Lawsuits Be Successful?
Since the core of these claims is addiction, the most significant challenge for the states will be proving causation. States first need to prove that TikTok's algorithm is capable of causing addiction (direct causation). Even if courts agree that the scientific studies establish this, states must still prove that TikTok's algorithm in fact caused addiction in children (that the platform was also the proximate cause).
Proving causation is more promising in the TikTok cases than in other unresolved cases against social media companies. For example, in an ongoing multi-district litigation (MDL), plaintiffs argue that platforms such as Meta and Snapchat caused addiction in children, which in turn caused self-harm, including suicide and eating disorders. Proving proximate causation in those cases is difficult, since addiction to social media might be only one of several contributing causes of self-harm. The claims against TikTok, by contrast, allege that the harm is the addiction itself rather than any physical act that results from it. This theory requires states to prove only that the platform is the direct and proximate cause of addiction, not of any harms thereafter.
Finally, Section 230, which shields online service providers from liability for third-party content, will likely not protect TikTok. Section 230 has successfully shielded social media companies from liability related to user-generated content; for example, the court in the MDL ruled that social media companies would not be liable for videos featured by a platform's algorithm. The claims here, however, are narrower. Because states urge that the platform's design is itself defective, independent of any user content, TikTok will not be able to invoke Section 230 to defend its own programming choices.
Lawsuits seeking to hold social media companies liable for the harm caused by their own algorithms are unprecedented. Ample evidence suggests that TikTok's business strategy exploits children's developing dopamine reward systems to induce addictive engagement with its platform. Holding TikTok liable for addicting a generation to its app would begin the process of remedying the mental health crisis among children.
Jessica Samuels is a third-year dual degree law and public health student (J.D./MPH 2025). Her research interests include genetics, environmental health sciences, novel biotechnologies, and the FDA regulatory process. She has previously published work on the accuracy of ultrasound in predicting malignant ovarian masses. At HLS, Jessica is co-president of the Harvard Health Law Society.