Meta Platforms Inc (formerly Facebook): Seeking a report on risks related to company failures in content governance


WHEREAS: The Meta (formerly Facebook) brand has continued to be wracked by management missteps and lack of Board oversight, resulting in continued harm caused by its platforms, including:

  • Millions of high-profile users exempted from its rules,[1] permitting continued widespread incitement of violence and harassment;

  • Internal Company research demonstrating that Instagram is toxic for teen girls;[2]

  • Mental health crises among outsourced moderators[3] due to viewing child pornography and animal cruelty;

  • Lack of cooperation with authorities to prevent and detect child exploitation and abuse;[4]

  • Ignored employee red flags about the spread of election misinformation;[5]

  • Political advertisements containing deliberate lies and mistruths;[6]

  • Hate speech that continues to thrive;[7]

  • Anti-immigrant violence[8] around the world.

A whistleblower complaint filed with the SEC[9] argues that the Company has failed to adequately warn investors about the material risks of dangerous and criminal behavior, terrorist content, hate speech, and misinformation on its sites. Company failure to control these activities reflects a grave lack of oversight by management and the board. Despite establishing an internal Oversight Board, the Company’s platforms continue to harm society and create investor risk. An internal review of company practices highlighting harassment and incitement to violence states,[10] “We are not actually doing what we say we do publicly,” and deems the Company’s actions “a breach of trust.”

Management has attempted to address the material risk of dangerous user content through the creation of the “Transparency Center,”[11] which displays qualitative and quantitative reports on the elimination of posts that violate the 25 “Community Standards.” Shareholders applaud this action, yet ask why this seemingly robust technological and human screening system remains ineffective.

RESOLVED: Shareholders request the Board, at reasonable expense and excluding proprietary or legally privileged information, prepare a report analyzing why the enforcement of “Community Standards” as described in the “Transparency Center” has proven ineffective at controlling the dissemination of user content that contains or promotes hate speech, disinformation, or content that incites violence and/or harm to public health or personal safety.

SUPPORTING STATEMENT: Proponent suggests the report include, at Board and management discretion:

  • A quantitative and qualitative assessment by an external, independent panel of qualified computer scientists of the effectiveness of Meta’s algorithms to locate and eliminate content that violates the Community Standards

  • An assessment of the effectiveness of Meta’s staff and contractors in locating and eliminating content that violates the Community Standards

  • An examination of the benefits to users and the impact on revenue if the Company were to voluntarily follow existing legal frameworks established for broadcast networks (e.g., laws forbidding child pornography and rules governing political ads)

  • An analysis of the benefits of the Company continuing to conduct technology impact assessments focused on how Meta’s platforms affect society.

This report should cover each of Meta’s major products, including Facebook, Messenger, Instagram, WhatsApp, and any other app that reaches over 100 million users.

[1] https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353

[2] https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739

[3] https://www.nytimes.com/2021/08/31/technology/facebook-accenture-content-moderation.html

[4] https://www.theguardian.com/technology/2021/jan/21/facebook-admits-encryption-will-harm-efforts-to-prevent-child-exploitation

[5] https://www.nytimes.com/2021/10/22/technology/facebook-election-misinformation.html?referringSource=articleShare

[6] https://www.washingtonpost.com/technology/2019/10/10/facebook-policy-political-speech-lets-politicians-lie-ads/

[7] https://www.dailydot.com/debug/hate-speech-facebook/

[8] https://www.dw.com/en/new-study-shows-afd-facebook-posts-spur-anti-refugee-attacks/a-41972992

[9] https://www.washingtonpost.com/technology/2021/10/22/facebook-new-whistleblower-complaint/

[10] https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353

[11] https://transparency.fb.com/

Resolution Details

Company: Meta Platforms (formerly Facebook)

Lead Filers:
As You Sow

Year: 2022

Filing Date: December 2021

Initiative(s): Media Content

Status: 19.2% overall vote (63.1% of independent shareholder votes)


SEC No-Action Win
