Oversight Board reverses Facebook removal of post touting hydroxychloroquine as COVID-19 treatment
The board found the social media giant's misinformation and imminent harm rule is too vague and recommended the platform consolidate and clarify its standards on health misinformation in one place.
Facebook's independent Oversight Board has reversed the social media platform's decision to remove an October 2020 post promoting the drug hydroxychloroquine as a treatment for COVID-19.
"In October 2020, a user posted a video and accompanying text in French in a public Facebook group related to COVID-19," the board explained on its website. "The post alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products), which refused to authorize hydroxychloroquine combined with azithromycin for use against COVID-19, but authorized and promoted remdesivir. The user criticized the lack of a health strategy in France and stated that “[Didier] Raoult’s cure” is being used elsewhere to save lives. The user’s post also questioned what society had to lose by allowing doctors to prescribe in an emergency a “harmless drug” when the first symptoms of COVID-19 appear."
While the user's post pushed back against a government policy, it did not urge people to obtain or take medicine without a prescription, the board noted.
"[The] user was opposing a governmental policy and aimed to change that policy," the board said in explaining its ruling. "The combination of medicines that the post claims constitute a cure are not available without a prescription in France and the content does not encourage people to buy or take drugs without a prescription. Considering these and other contextual factors, the Board noted that Facebook had not demonstrated the post would rise to the level of imminent harm, as required by its own rule in the Community Standards."
Facebook also failed to show why it did not opt for a less severe remedy than removing the post from the platform, the panel found.
"Given that Facebook has a range of tools to deal with misinformation, such as providing users with additional context, the company failed to demonstrate why it did not choose a less intrusive option than removing the content," the board explained.
The board also faulted the misinformation and imminent harm rule itself as overly vague, recommending that Facebook consolidate and clarify its health misinformation standards in a single place.
"The Board also found Facebook's misinformation and imminent harm rule, which this post is said to have violated, to be inappropriately vague and inconsistent with international human rights standards," the panel said. "A patchwork of policies found on different parts of Facebook's website make it difficult for users to understand what content is prohibited. Changes to Facebook's COVID-19 policies announced in the company's Newsroom have not always been reflected in its Community Standards, while some of these changes even appear to contradict them."