Video appears to expose Facebook bias: 'If someone is wearing a MAGA Hat, I am going to delete them'
An undercover video puts on display the political activism coming out of the Facebook content moderating team
The Facts Inside Our Reporter’s Notebook
In a new video by Project Veritas, an undercover investigation inside Facebook appears to capture content moderators admitting to deleting posts and comments that support the president and other conservative causes.
A Facebook employee tells Project Veritas that 75% of the posts he sees selected by Facebook's algorithm for review are in support of President Trump and other conservative pages.
"They are not at all shy to exercise their political will," said the employee in the video says of his coworkers.
Though the employee acknowledges being unsure of how the company's proprietary algorithm selects which posts get flagged for review, he says the bias is clear and exercised by whoever designed it, given the dramatic ratio of conservative-to-liberal posts that find their way onto the review queue.
One content moderator, caught on hidden camera, is heard saying, "If someone is wearing a MAGA hat, I am going to delete them for terrorism."
The video also details internal Facebook memos instructing content moderators to make exceptions for the words of some left-leaning political pundits and for violent images of the president that would otherwise be removed.
In 2018, Facebook CEO Mark Zuckerberg testified before the Senate Judiciary and Commerce committees that he understood the concern that his company was exercising political bias through the posts it flagged and removed.
"Facebook in the tech industry are located in Silicon Valley, which is an extremely left-leaning place, and I — this is actually a concern that I have and that I try to root out in the company, is making sure that we do not have any bias in the work that we do, and I think it is a fair concern that people would at least wonder about," Zuckerberg said.
The hidden camera footage appears to call into question Zuckerberg's testimony that the platform is politically neutral and removes only content that could potentially cause harm, such as material relating to terrorism or hate speech.