Meta’s Oversight Board, the independent body the social media giant established to rule on content decisions, has published a report backing Meta’s removal of COVID-19 misinformation during the pandemic. The board also recommends further changes to the company’s misinformation policies and criticizes its failure to assess its platforms’ effects on public health and human rights.
The board wants Meta to study how design features such as Facebook’s News Feed recommendation algorithms spread harmful health misinformation, and to disclose the company’s earlier internal research on the subject. In 2021, for example, the White House accused Facebook’s algorithm of promoting vaccine misinformation during the pandemic.
The board suggests that Meta keep its COVID-19 misinformation policy in place but be more transparent when it removes misinformation across its platforms. Amid fears that the pandemic has been exploited to “erode the tenets of democracy,” the report also encourages Meta to publish information about government requests to review content during public health emergencies.
Meta asked the Oversight Board for a review in July 2022 to determine whether it should be less aggressive in removing false COVID-related content, or stop removing it altogether, to “better align with its values and human rights responsibilities.” Meta’s misinformation policy prohibits content that “risks imminent physical harm,” interferes with political processes, or promotes “certain highly deceptive manipulated media.” Even joking that “only Brad Pitt’s blood can cure covid” is banned.
The Oversight Board recommends that Meta maintain its current stance for as long as the World Health Organization designates COVID-19 a public health emergency of international concern. The recommendation was shaped by “Meta’s insistence that it takes a single, global approach” to policy changes, since the company argues it cannot enforce country- or region-specific rules. The board also finds that removing COVID-related misinformation during a global health emergency, and thereby protecting users from physical harm, is consistent with Meta’s values and human rights responsibilities.
The Oversight Board also proposes that Meta support independent research on its platforms and increase transparency about its removal decisions.