As Hao wrote, a study from New York University of partisan publishers’ Facebook pages found “those that regularly posted political misinformation received the most engagement in the lead-up to the 2020 US presidential election and the Capitol riots.”
After saying that members of Congress had shared "a bunch of inaccurate things" at the hearing about Facebook's incentives for allowing and amplifying misinformation and polarizing content, Zuckerberg added:
“People don’t want to see misinformation or divisive content on our services. People don’t want to see clickbait and things like that. While it may be true that people may be more likely to click on it in the short term, it’s not good for our business or our product or our community for it to be there.”
His answer is a common Facebook talking point and skirts the fact that the company has not undertaken a centralized, coordinated effort to examine and reduce the way its recommendation systems amplify misinformation. To learn more, read Hao’s reporting.
Zuckerberg’s comments came during the House Committee on Energy and Commerce hearing on disinformation, where members of Congress asked Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey about the spread of misinformation about the US election in November, the January 6 attack on the Capitol building, and covid vaccines, among other things.
As has become common in these hearings, conservative legislators also questioned the CEOs about perceived anti-conservative bias on their platforms, a longtime right-wing claim that data doesn’t support.