The families of mass shooting victims in Uvalde, Texas, and Buffalo, New York, have filed lawsuits attempting to hold Meta and Google liable for violence committed by young men allegedly introduced to military-grade weaponry and hateful ideology during their many hours online. 

The success of these legal actions may turn in part on a federal law that Congress passed in the mid-1990s, Section 230 of the Communications Decency Act, which has become a crucial and controversial tool for powerful social media companies to deflect lawsuits alleging liability for harmful content posted on their platforms.

The two lawsuits are part of a broader effort to grapple with the proliferation of mass shootings in the United States. Another such attempt failed just days ago when the U.S. Supreme Court struck down a ban on bump stocks, devices that can modify semi-automatic guns to fire at faster rates. If they are successful, the suits against social media companies could have consequences beyond this context, providing a template for holding Silicon Valley titans legally responsible for a wider array of societal harms. 

The Rise of Section 230  

As internet commerce was just getting off the ground in the mid-1990s, Congress passed Section 230 to protect message boards and other nascent online businesses from being sued over material posted by third parties. Interpreted broadly by the courts, the law has conferred an important advantage on some of the technology industry’s most influential and lucrative corporations, while also promoting free-wheeling expression online.

Alan Rozenshtein, an associate professor of law at the University of Minnesota who has written extensively about Section 230, said in an interview:

In recent years, state and federal courts have begun to entertain arguments that limit Section 230’s reach by framing claims as product liability or negligence claims – something that wasn’t happening 10 years ago. This trend means that the strategy employed in the mass shooting lawsuits should be viewed as plausible, if not necessarily guaranteed to produce plaintiffs’ victories.

Heidi Li Feldman, a professor at the Georgetown University Law Center who studies personal injury law, agreed that “this isn’t a wildly crazy position to take on 230.”  

The relevant portion of Section 230 states that providers of an “interactive computer service” – today, that includes social media platforms – shall not be “treated as the publisher or speaker of any information provided by” users of those platforms. 

Section 230 does not provide absolute liability protection. The statute itself carves out exceptions for the enforcement of federal criminal law, intellectual property rules, and wiretapping laws, among others. In 2018, Congress added another exception for allegations stemming from online sex trafficking.

Moreover, courts have allowed plaintiffs to sue interactive websites for their own conduct or speech. Section 230 did not protect a short-term home-rental service against liability for violating an ordinance in Santa Monica, Calif., barring unlicensed rentals. Nor can online platforms seek refuge in Section 230 for violating anti-discrimination laws by, for example, soliciting the gender, family status, or sexual orientation of users seeking roommates.

Critics of Section 230 argue that it still stifles legitimate lawsuits. The law “is now poisoning the healthy online ecosystem it once fostered,” House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-WA) and Ranking Member Frank Pallone, Jr. (D-NJ) wrote in a recent op-ed in The Wall Street Journal. “Big Tech companies are exploiting the law to shield them from any responsibility or accountability as their platforms inflict immense harm on Americans, especially children.” 

Bemoaning lawmakers’ failed attempts over the past half-dozen years to curtail Section 230, Rodgers and Pallone have drafted a bill that would repeal the statute altogether in late 2025 unless the industry collaborates on limiting its reach. Or, as they put it, the bill would “require Big Tech and others to work with Congress over 18 months to evaluate and enact a new legal framework that will allow for free speech and innovation while also encouraging these companies to be good stewards of their platforms.”

The lawyers filing suit on behalf of families of mass shooting victims aren’t waiting for legislative changes. They argue that Section 230 permits their suits because the claims are based on platform design or misconduct, rather than the mere display of harmful content.

Uvalde Victims’ Families Sue Meta 

Last month, the families of 19 fourth-graders and two teachers who were killed in their classrooms in Uvalde in May 2022 sued Instagram. The suit seeks to hold the platform and its corporate parent, Meta, responsible for exposing the 18-year-old gunman to the military-style weapon and tactics he used in the massacre. Filed in state court in California, where Meta is based, the suit also targets Activision Blizzard, the publisher of the violent military-themed video game Call of Duty, and Activision’s parent, Microsoft.

Invoking California negligence and product liability law, the Uvalde families’ suit alleges that Instagram is designed to “addict” teenagers with sensationalistic and violent content that keeps them on the app and makes them vulnerable to the targeted advertising that generates the lion’s share of company revenue. In the months leading up to the attack at Robb Elementary School, the shooter developed “an unhealthy, likely obsessive, relationship with Instagram,” according to the suit. He allegedly created and used at least 20 different Instagram accounts, sometimes opening the app more than 100 times in a single day. Often, he was searching for information about guns – specifically AR-15-style, large-capacity semiautomatic rifles of the sort that he eventually used in the massacre.

These elements of the Uvalde families’ claims against Meta are similar to some of the suits filed by dozens of state attorneys general and school districts that have accused the company of addicting and manipulating young users of its platforms.

The Uvalde suit accuses Instagram of purposely undermining its own policy on gun advertising. Meta platforms officially ban ads that “promote the sale or use of weapons, ammunition or explosives,” including firearms. But in practice Instagram’s policy is that “only paid firearm advertisements are prohibited,” according to the suit. This allows gun manufacturers and marketers to post “organic” promotional material from their own accounts and the accounts of pro-gun influencers, as long as the material avoids the words “buy” and “sell,” omits prices, and does not link directly to purchasing opportunities. By creating this loophole in its anti-gun ad policy, Instagram effectively provides the firearm industry with a “blueprint” for promoting its wares to impressionable users like the Uvalde shooter, the suit alleges.

Instagram’s effectiveness for marketing guns is openly discussed in gun industry circles. The suit cites the marketing agency Fidelitas, which carries on its website an article entitled, “5 Ways Firearm Brands Can Advertise Online Without Google Ads and Facebook Ads.” The article notes that “there are some major loopholes in…advertising regulations for Facebook and Instagram.” Fidelitas emphasizes that gun manufacturers’ own “organic posts” and gun “reviews” by influencers are allowed, as long as they do not link to pages where guns are sold.

In the run-up to the Robb Elementary attack, the shooter became particularly interested in AR-15-style rifles manufactured by a Georgia company called Daniel Defense, according to the suit. Daniel Defense frequently used its own account to post about its weapons on Instagram. Often, the company showed its rifles in the hands of soldiers. But on May 13, 2022, a Daniel Instagram post depicted an AR-15-style rifle leaning against the refrigerator in a kitchen, with the text, “Let’s normalize kitchen Daniels. What Daniel do you use to protect your family and home?” Three days later, the Uvalde shooter purchased the exact model shown in the kitchen scene – the DDM4 V7 – and eight days after that, he used it at Robb Elementary, according to the suit. (In a separate suit filed in Texas state court, the Uvalde plaintiffs have accused Daniel Defense of illegally enticing the shooter to purchase the weapon less than an hour after he turned 18.)

The plaintiffs’ complaint implicitly concedes that at this early stage of the California litigation, they don’t yet have evidence proving that specific Instagram posts brought the DDM4 V7 to the shooter’s attention. In fact, the suit alleges that he was also exposed to that model while “obsessively” playing Modern Warfare, an installment in the popular Call of Duty video game franchise marketed by Activision. Daniel Defense boasted on Instagram that the DDM4 V7 was featured on the Modern Warfare loading page, according to the suit. Daniel, Instagram, and Activision each participated in “grooming” the teenager as a mass killer, the plaintiffs allege.

Meta and Daniel did not respond to requests for comment for this article. An Activision spokesperson told CBS News, “We express our deepest sympathies to the families,” but added that “millions of people around the world enjoy video games without turning to horrific acts.”

A Lawsuit Against YouTube Stays on Track

A ruling in March from a state court judge in New York provided at least some support for the strategy of emphasizing platform design and conduct when suing a social media company. The New York judge refused to dismiss a lawsuit against YouTube and its corporate parent, Google, stemming from the May 14, 2022, mass shooting at a Buffalo supermarket, which occurred just 10 days before the Uvalde massacre. 

The New York judge rejected, for now, the companies’ Section 230 defense and said relatives of the victims could move ahead with pre-trial discovery – the process of seeking documents and other information – that may establish that YouTube is a defectively designed “product” under New York tort law. The plaintiffs in that case allege that the shooter targeted African-American customers at Tops Friendly Markets after being indoctrinated on YouTube to believe a strain of racist ideology known as the “white replacement theory.” 

The false white replacement theory, which is also known as the great replacement theory and is prevalent in some right-wing circles, posits a sweeping plot to have immigrants and people of color eclipse whites politically and socially. The New York ruling is not a binding precedent in California, and the case is still at a relatively early stage, but it may provide a preview of pretrial skirmishing likely to occur in the Uvalde litigation.

In the end, courts may conclude that the suits in California and New York are, at root, about third-party content, triggering Section 230’s liability shield. But it’s worth noting that the lead plaintiffs’ lawyer representing the Uvalde families, Josh Koskoff, is the same attorney who has successfully represented victims’ families in litigation following the Newtown, Conn., elementary school shooting in 2012. That case, which did not involve allegations about social media, targeted Remington Arms, the manufacturer of another AR-15-style rifle. In steering the Newtown case, Koskoff demonstrated a knack for using claims of improper marketing to avoid a broad federal liability shield law designed to protect an industry – in that case, the gun industry. 

The Newtown suit initially was seen as a long shot because of the Protection of Lawful Commerce in Arms Act, which could be considered firearm manufacturers’ version of Section 230. Enacted in 2005 in response to lawsuits against gun makers, the Act protects the firearm industry from almost all civil liability for shootings. But Koskoff argued that the Newtown case should be allowed to proceed under a Connecticut consumer protection law, which he claimed barred Remington’s wrongful marketing. In 2022, after surviving attempts to have the suit dismissed, he reached a landmark $73 million settlement with Remington, creating a template for other litigation now pending against the gun industry.

Much legal maneuvering lies ahead. Whether Koskoff’s breakthrough in the Newtown case puts pressure on Daniel Defense to settle the Uvalde litigation is one question. Even more complicated is the question of whether he and his counterparts in the Buffalo case can circumvent Section 230 in their pursuit of Meta and Google. 

The ripple effects of such a breakthrough could propel other efforts to hold the social media industry accountable by means of litigation. Perhaps the most immediate effects could be seen in pending lawsuits blaming the industry for platform designs that allegedly addict teenagers and then facilitate their exposure to content that exacerbates depressive and even suicidal tendencies.

Broad-based corporate reform litigation typically gains traction when supported by changing popular attitudes, as occurred in connection with lawsuits that targeted the tobacco industry in the 1980s and 1990s and the opioid industry in the 2010s. U.S. Surgeon General Vivek Murthy alluded to attempts to curb the tobacco industry in his recent call for warning labels related to teen mental health to be applied to social media sites. Murthy’s recommendation that Congress impose such labels followed President Joe Biden’s reference to the issue in his State of the Union Address in February. “We must finally hold social media companies accountable for the experiment they are running on our children for profit,” the president said. 

In coming months and years, we will learn whether lawsuits over mass shootings can provide an avenue for achieving that accountability.

IMAGE: A memorial dedicated to the 19 children and two adults murdered on May 24, 2022, during the mass shooting at Robb Elementary School in Uvalde, Texas, is seen on May 24, 2023. (Photo by Brandon Bell via Getty Images)