As the company faces criticism for its own practices, an executive at Meta, formerly known as Facebook, said on Sunday that “individual humans” are to blame for spreading misinformation. Andrew Bosworth, who will become Meta’s CTO next year, told Axios on HBO that Facebook can only do so much to moderate speech on the platform.
“If we took every single dollar and human that we had, it wouldn’t eliminate people seeing speech that they didn’t like on the platform. It wouldn’t eliminate every opportunity that somebody had to use the platform maliciously,” he said.
Bosworth, Meta’s vice president of augmented and virtual reality, acknowledged that speech can be harmful but said that the responsibility for what is shared on its platforms ultimately falls on people and society.
“Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,” Bosworth told Axios. “I don’t feel comfortable at all saying they don’t have a voice because I don’t agree with what they said, I don’t like what they said.”
Facebook has been heavily chastised for false or harmful content shared on its platform, particularly in relation to the COVID-19 pandemic and the 2020 presidential election.
According to a September study, posts on Facebook from misinformation sources received six times the engagement of posts from reputable news sources. Another report released the same month found that troll farms spreading misinformation reached approximately 140 million Americans a month before the 2020 election.
When asked by Axios on HBO host Ina Fried how Facebook has contributed to vaccine hesitancy, Bosworth cited Facebook’s campaign to disseminate accurate vaccine information. He said that if people want to follow less reliable information they see shared by others on Facebook, “that’s their choice.”
“They are allowed to do that. You have an issue with those people. You don’t have an issue with Facebook. You can’t put that on me,” he said.
Source: Business Insider