Addiction by Design: Product Liability and the Case Against Social Media Companies

When a substance makes someone lose control, the law contemplates treatment and culpability. But when an algorithm makes someone lose control, the law currently faults a lack of self-discipline. Roughly 73% of Americans use social media, including up to 83% of teens aged 13 to 17 [1]. As litigation against major technology companies unfolds, courts are beginning to confront whether social media platforms should bear legal responsibility for harms linked to their addictive design. This question is currently being tested in a bellwether case, in which plaintiffs allege that these companies intentionally designed their platforms to maximize user engagement through addictive mechanisms, particularly among young users. While litigation is still ongoing for Meta, the owner of Facebook and Instagram, and Google, the owner of YouTube, other companies, including Snapchat and TikTok, have already settled. In this paper, I argue that social media companies should indeed be held liable for social media addiction under specific principles of product liability, particularly negligent design and failure to warn. The doctrine of negligent design requires proof that a manufacturer breached its duty of reasonable care, creating unreasonable risks; failure to warn holds a manufacturer or distributor liable for failing to provide adequate instructions or warnings about its products. Much like the tobacco industry before them, social media companies have developed algorithmic features and highly tailored ads deliberately created to maximize engagement, despite the known risks of use. Evidence from internal communications and platform design strategies suggests that social media companies were aware of such risks but nevertheless prioritized user growth and retention. Under established product liability doctrines, such conduct provides a plausible basis for legal accountability.

A central claim of liability rests on the targeted, youth-centered design of social media platforms. In the recent litigation against Google and Meta, the plaintiff, a twenty-year-old woman from California referred to as K.G.M., claimed that she had become addicted to social media beginning when she was as young as six years old. K.G.M. argued that social media platforms are intentionally designed to hurt minors for the sake of profit, and that prolonged exposure to social media contributed to her “depression, anxiety, body dysmorphia, self-harm and risk of suicide” [2]. A report published through the European Society of Medicine observes that “the social media companies are paid by the brand companies for collecting and delivering all verified user ‘views’ and ‘clicks’ on their ads” [3]. Thus, to maximize profits, platforms must both hold attention and display curated content, which on platforms like TikTok commonly translates into hours of daily use [4]. This suggests that social media corporations have implemented these strategies purposefully for monetary gain. According to legal experts, plaintiffs in cases against social media companies must prevail on the failure-to-warn argument: that “social media companies had a duty to warn [users] about the pitfalls of using social media, failed in that duty, and caused harm as a result” [5]. Plaintiffs could prove causation, a common barrier in this type of litigation, through internal documents showing intentional addictive design, expert testimony on algorithmic reinforcement mechanisms, or epidemiological studies linking social media use to mental health outcomes. If companies knowingly designed systems intended to increase dependency while failing to disclose the psychological risks associated with those systems, a credible claim for negligent design or failure to warn may arise. Internal YouTube memos point to the intentional nature of this design (“the goal is not viewership, it’s viewer addiction”), strengthening plaintiffs’ claims that their injuries were not accidental but the predictable outcome of deliberate product choices [6].

Prior litigation further highlights the human consequences of social media design. In a 2022 lawsuit filed against Meta and Snapchat, a mother from Connecticut, Ms. Rodriguez, claimed that her 11-year-old daughter died by suicide as a result of social media use. According to the complaint, “both social media giants ‘knowingly and purposefully’ designed and marketed products that were harmful to a ‘significant’ number of their underage users” [7]. This could satisfy the doctrine of negligent design, as the companies allegedly designed their products without reasonable care, creating unreasonable risk. As stated in the lawsuit, “[the] defendants intentionally created an attractive nuisance to young children but failed to provide adequate safeguards from the harmful effects they knew were occurring on their wholly owned and controlled digital premises” [8]. In her lawsuit, Ms. Rodriguez cited internal documents and testimony from former employees suggesting that Meta was aware of the product design and did not make an effort to protect younger children [9]. She connected this to her case by explaining how her daughter’s social media use spurred sleep deprivation, severe depression, and poor self-esteem: “In the months leading up to Selena’s suicide, she experienced severe sleep deprivation that was caused and aggravated by her addiction to Instagram and Snapchat, and the constant 24-hour stream of notifications and alerts received.”

Allegations in both cases closely resemble earlier claims brought against the tobacco industry, which profited from the addiction of millions of Americans, highlighting a striking parallel in how both industries allegedly engineered products to maximize dependency despite known risks. Tobacco companies in the late twentieth century chose “not to mitigate harms but to instead distract from the overwhelming evidence their product was deadly,” and they “aggressively pushed their product on vulnerable audiences” [10]. Like the social media industry, the tobacco industry faced a wave of litigation challenging the addictive nature of its products, particularly under the failure-to-warn doctrine, culminating in the 1998 Master Settlement Agreement, under which the four largest U.S. tobacco companies settled thousands of lawsuits over smoking-related illnesses [11]. In 2006, Federal District Court Judge Gladys Kessler found that the largest tobacco manufacturers had violated RICO, a law originally created to dismantle organized crime [12]. Given the parallels between the two industries, forcing social media companies to settle failure-to-warn claims over social media addiction could likewise open the door to further regulation.

Despite these arguments, critics maintain that social media companies should not be liable for any form of addiction resulting from social media usage. One common defense relies on the First Amendment. Critics commonly assert that social media use can “be understood as expressive activity, and there is no obvious reason to treat this speech differently from scripts or novels or the code that makes videogames work” [13]. However, I argue that the First Amendment protects speech, not conduct that merely involves speech. This distinction is crucial, because algorithmic design is often framed as “speech” to shield it from regulation, even as the product itself is intentionally built to shape user behavior. Illustrating this distinction, in Rumsfeld v. FAIR (2006), the Supreme Court held that requiring law schools to host military recruiters regulated conduct rather than speech, even though the regulation involved expressive activity [14]. If regulation targets the mechanics of platform design rather than the content of speech itself, First Amendment protections may be less applicable. Features such as engagement-maximizing algorithms, endless scrolling, and push notifications are matters of product design rather than editorial expression; they raise issues of negligent design and failure to warn, not protected speech.

Beyond First Amendment appeals, Section 230 of the Communications Decency Act presents one of the most formidable barriers to holding social media companies liable for harms associated with their platforms. Under current law, Section 230 broadly protects online platforms from liability for content created by third-party users. As a result, social media companies have argued that they cannot be held legally responsible for content that appears on their platforms. Social media expert Megan Duncan aptly summarizes this notion, asserting that “social media companies cannot be held liable for the content of what the teens engaged with on social media created by third parties because of section 230 of the Communications Decency Act...” [15].

However, I argue that Section 230 has been misinterpreted in past lawsuits and must be updated so that it no longer shields these companies from liability for their design choices. As currently applied, it presents a barrier because it grants online platforms immunity for content generated by their users. The 1996 law is itself outdated, as it was enacted before social media existed; at the time, internet companies were startups needing support, rather than massive empires needing regulation. Moreover, Section 230 was drafted to address liability for third-party content, not product design or user experience. Addiction claims based on platform algorithms or design, I believe, do not fall directly within its scope. The act itself states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” [16]. Courts applying this language have nonetheless produced muddled and inconsistent interpretations. An article in the Harvard Law Review argues that “[o]bviously wrong interpretations of Section 230, like the Third Circuit’s in Anderson v. TikTok, Inc., only set the law back” [17]. Given how Section 230 has been treated thus far in social media addiction lawsuits, regardless of whether the entirety of it applies, it remains clear that the act must be reframed and updated.

Social media companies are intentionally designing their products against consumers’ interests, causing serious harm, including fatalities, to users, among them growing numbers of young children. The industry bears an eerie structural resemblance to the tobacco industry, against which consumers and advocates prevailed in the 1990s and early 2000s. Under the arguments described above, social media companies not only should be held accountable for their actions; they can be. The stakes are far from abstract: they can be measured in rising rates of depression, anxiety, and shortened attention spans among young users whose habits are being shaped before any form of consent is meaningfully given. Let us not forget that when engagement and addiction are engineered, immunity becomes a license for exploitation. If social media companies can be held accountable, future regulation, technology design, and public health policy may shift as a result.

Footnotes

[1] Bianes, Gail. 2025. “What Percentage of Americans Use Social Media?” SOAX. https://soax.com/research/what-percentage-of-americans-use-social-media; Social Media and Youth Mental Health: The U.S. Surgeon General’s Advisory. Accessed February 23, 2026. https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-summary.pdf.

[2] PBS News. 2026. “What Legal Experts Say about a Major ‘Bellwether Trial’ over Child Social Media Addiction.” January 28, 2026. https://www.pbs.org/newshour/nation/what-to-know-about-a-trial-that-will-test-tech-giants-liability-for-child-social-media-addiction.

[3] Mujica, Alejandro, Charles Crowell, Michael Villano, and Khutb Uddin. 2022. “Addiction by Design: Some Dimensions and Challenges of Excessive Social Media Use.” Medical Research Archives 10 (2): 1–29. https://doi.org/10.18103/mra.v10i2.2677.

[4] Gilbert, Caitlin. 2025. “Spending Hours on TikTok? Here’s How the App Keeps You Swiping.” The Washington Post. October 7, 2025. https://www.washingtonpost.com/wellness/interactive/2025/tiktok-addiction-algorithm-scrolling-mental-health/.

[5] Tamez-Robledo, Nadia. 2026. “Lawsuits Test New Legal Theories about What Causes Social Media Addiction.” EdSurge. February 17, 2026. https://www.edsurge.com/news/2026-02-17-lawsuits-test-new-legal-theories-about-what-causes-social-media-addiction.

[6] Sharp, Sonja. 2026. “Trial in Lawsuit Alleging Harms by Instagram, YouTube Begins in L.A.” Los Angeles Times. February 9, 2026. https://www.latimes.com/california/story/2026-02-09/social-media-harms-trial-instagram-youtube.

[7] BBC News. 2022. “Mother Sues Meta and Snap over Daughter’s Suicide,” January 21, 2022, sec. US & Canada. https://www.bbc.com/news/world-us-canada-60091899.

[8] BBC News. 2022. “Mother Sues Meta and Snap over Daughter’s Suicide,” January 21, 2022, sec. US & Canada. https://www.bbc.com/news/world-us-canada-60091899.

[9] “Selena Rodriguez vs. Meta Platforms, Inc. and Snap, Inc.” n.d. Social Media Victims Law Center. https://socialmediavictims.org/press-releases/rodriguez-vs-meta-platforms-snap-lawsuit/.

[10] Grimes, David Robert. 2025. “Google, X and Facebook Are Modern-Day Tobacco Companies.” Scientific American. April 4, 2025. https://www.scientificamerican.com/article/google-x-and-facebook-are-modern-day-tobacco-companies/.

[11] National Association of Attorneys General. 2024. “The Master Settlement Agreement and Attorneys General.” https://www.naag.org/our-work/naag-center-for-tobacco-and-public-health/the-master-settlement-agreement/.

[12] “25-Year History of the Racketeering Lawsuit against the Tobacco Industry: Guilty of Deceiving the American Public.” n.d. https://www.fightcancer.org/sites/default/files/history_of_doj_rico_lawsuit_fact_sheet_final_11.08.24.pdf.

[13] Chemerinsky, Erwin. 2026. “Why Social Media Addiction Lawsuits Should (and Will) Fail.” Stay Tuned with Preet Bharara. February 4, 2026. https://staytuned.substack.com/p/why-social-media-addiction-lawsuits.

[14] “Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47 (2006).” n.d. Justia Law. https://supreme.justia.com/cases/federal/us/547/47/.

[15] “Experts Discuss Ramifications of Court Cases Addressing Social Media Addiction in Children.” 2026. Virginia Tech News. https://news.vt.edu/articles/2026/02/Meta-YouTube-youth-children-social-media-addiction-trial-case-experts.html.

[16] “47 U.S. Code § 230 - Protection for Private Blocking and Screening of Offensive Material.” Legal Information Institute. Accessed March 1, 2026. https://www.law.cornell.edu/uscode/text/47/230.

[17] Calo, Ryan, Chandler Rankin, J.B. Branch, and Michael F. Duggan. “Courts Should Hold Social Media Accountable - but Not by Ignoring Federal Law.” Harvard Law Review, January 13, 2026. https://harvardlawreview.org/blog/2024/09/courts-should-hold-social-media-accountable-but-not-by-ignoring-federal-law/.
