Jury Verdicts Against Meta Validate Longstanding Investor Concerns on Child Safety, Trafficking, and Platform Harm

FOR IMMEDIATE RELEASE

MEDIA CONTACT: Ryon Harms, [email protected], (310) 730-9407 

EL CERRITO, CA — March 26, 2026 — Two landmark jury verdicts this week against Meta Platforms, Inc. have confirmed what shareholder advocates and impacted users have warned of for nearly a decade: the company's failure to address harmful content and platform design risks has created profound legal, financial, and societal consequences.

This week, a jury in New Mexico found that Meta’s platforms facilitated the grooming and harm of minors, while a separate jury in Los Angeles determined that Meta’s products are addictive and have caused widespread harm to children and adolescents. These decisions underscore escalating litigation risk and raise urgent questions about corporate governance, fiduciary oversight, and long-term shareholder value.

For the past decade, shareholders have repeatedly raised these exact risks through direct engagement and formal proposals filed on the company's annual proxy. Advocacy organizations including Proxy Impact and As You Sow, along with faith-based investors and pension funds, have filed shareholder resolutions addressing hate speech, child exploitation, sex trafficking, harmful algorithms, and disinformation across Meta's platforms.

Notably, in support of its shareholder resolution, Proxy Impact brought testimony to a Meta annual general meeting from a survivor of sex trafficking facilitated through Facebook, where she met a man who imprisoned and prostituted her. Her testimony directly illustrated the real-world harms tied to platform failures. Despite these warnings, Meta leadership consistently resisted the meaningful reforms sought by shareholder proposals that earned high votes.

“Shareholders identified these risks years ago—not as abstract concerns, but as material threats to users, society, and the company,” said Michael Passoff, CEO of Proxy Impact. “This week’s verdicts make clear that ignoring those warnings has now translated into legal and financial liability.”

Meta has long relied on Section 230 of the Communications Decency Act to argue that it bears no responsibility for user-generated content. However, internal evidence and external reporting have shown that the company chose not to fully implement known technical solutions to detect, remove, and prevent harmful content, declined to ban repeat offenders, and declined to end the end-to-end encryption that shields these perpetrators.

Critics argue that the company's failure to act reflects governance choices, not technological limitations. Because CEO Mark Zuckerberg's shares carry a 10:1 voting preference, he faces virtually no oversight, and the blame rests squarely on his shoulders. The trials showed that Zuckerberg prioritized platform growth and user engagement over enforcement, even in cases involving child exploitation.

The shareholder record demonstrates a consistent pattern of concern, with multiple proposals earning between 40% and 60% of independent shareholder votes, representing hundreds of billions of dollars in value:

- A 2018 proposal on content governance risks warned of Meta's systemic failures in monitoring and enforcement.

- A 2021 proposal called for a report on risks tied to content governance failures that exposed Meta to litigation and regulatory action.

- A 2024 proposal requested targets, including quantitative metrics, for assessing whether Meta has improved its global performance on child safety and reduced actual harm to children on its platforms.

- Ongoing proposals have addressed harms related to children, hate speech, and disinformation.

Despite these efforts, Meta's dual-class share structure has muted the impact of independent shareholder voices. Zuckerberg's 10:1 voting advantage gives him effective control over corporate decisions, with no oversight from his own board or his investors.

“This is exactly what happens when corporations have no real shareholder oversight,” said Andrew Behar, As You Sow’s CEO. “One person, due to a stock-class preference, decided that it was okay to harm children and a generation has suffered. This is going to be like tobacco and opioids combined even though shareholders warned of this risk for a decade. And right now, the SEC is attempting to dismantle shareholder power so more CEOs have no oversight.”

In 2022, a proposal filed by As You Sow received 63% support from independent shareholders but registered only 19% official support because of the preference-weighted voting structure. This disparity highlights systemic governance concerns and raises broader questions about whether current proxy voting frameworks accurately reflect investor sentiment.

“These votes are not fringe—they represent a majority of independent capital,” Behar added. “When governance structures obscure that reality, they distort accountability.”

The recent verdicts may mark an inflection point. Juries of ordinary citizens have called Meta to account for putting profits over safety. As litigation risk materializes and regulatory scrutiny intensifies, investors are increasingly likely to reassess exposure to companies with unresolved governance and oversight failures.

“Shareholder proposals function as an early warning system,” Passoff concluded. “In Meta’s case, that alarm was sounding for years. The cost of ignoring it is now clear.”

# # #

As You Sow is the nation’s leading shareholder representative, with a 30+ year track record promoting environmental and social corporate responsibility. As You Sow addresses a range of issues that affect shareholder value including climate change, ocean plastics, toxins in the food system, biodiversity, racial justice, and workplace diversity. See As You Sow’s shareholder resolution tracker.