At the latest annual meeting of Facebook (now Meta), activists placed proposals on the ballot seemingly designed to push Facebook into banning more content.
One came from As You Sow, probably the best-known left-of-center proxy activist group. The proposal concerns Facebook's "community standards," the rules the company uses to justify banning content. It calls upon the company to analyze why the current enforcement of community standards "has proven ineffective at controlling the dissemination of user content that contains or promotes hate speech, disinformation, or content that incites violence and/or harm to public health or personal safety." (Welcome to META PLATFORMS, INC.'s Virtual Annual Shareholder's Meeting, virtualshareholdermeeting.com.)
These phrases are, of course, highly interpretable, and they have been applied in biased ways. The vagueness of what does and does not constitute "hate speech" or "disinformation" allows the company enormous latitude when it comes to imposing its own political and cultural agenda. Alliance Defending Freedom has specifically documented the problems with Facebook's policies in this area in its recently released Viewpoint Diversity Score. In the "Market Score" section, the one dealing with the company's treatment of free speech issues involving users and customers, Facebook's score is only 3% out of a possible 100, and some of the specific red flags fall precisely in the areas that As You Sow implies should be further restricted.
The As You Sow proposal even calls for the company to launch “An examination of benefits to users and impact to revenue if the Company would voluntarily follow existing legal frameworks established for broadcast networks (e.g. laws forbidding child pornography and rules governing political ads).” In other words, the same sort of rigorous enforcement which is used against broadcasting child pornography is proposed to fight “hate speech” and “disinformation.”
The origin of the proposal is consistent with the clear political agenda behind it. It assumes that Meta's past policies have been insufficiently rigorous in banning alleged disinformation on the platform. Furthermore, it focuses only on the risks of too little self-censorship, and not at all on the risks of too much self-censorship. Recent events in and around Twitter have made clear that overly strict banning policies are likewise a source of reputational risk.
The management of the company opposed this proposal, and shareholders rejected it, which might indicate that both companies and investors have reached a point where they see that activists have pushed too far. Generally, proposals appear on the ballot only after negotiations with the activists have broken down, and As You Sow has a history of proposing ballot questions, negotiating with the company to get most of what it wants, and then withdrawing the question.
Remarkably, there was another proposal, put forward by an Australian union, which focused specifically on "climate denial, vaccine hesitancy, and girls' body image." The proposal called upon Facebook to prioritize social goals over profits, but it assumed that suppressing "climate denial" and "vaccine hesitancy" would be a social good. The very premise of social media platforms, including Facebook, has historically been that debating issues, not suppressing debate, is socially beneficial.
Facebook opposed this proposal, and it was defeated. There are reasons to believe that cancel culture coming from shareholder pressure groups might have reached its limit.