Facebook reportedly killed a feature meant to expose its users to differing perspectives, fearing the company would be seen as having a liberal bias, according to a Wall Street Journal report published Sunday.
Joel Kaplan, Facebook’s global policy head and its most prominent conservative executive, pushed for the feature to be quashed, according to the Journal.
Facebook has long faced criticism that it has a liberal slant, criticism that has ramped up as the company’s top executives have been called before lawmakers to testify about its use of user data. Kaplan in particular has emerged as a controversial figure as the company has entered the limelight, most notably after he attended Justice Brett Kavanaugh’s hearing over allegations of sexual misconduct. The Journal reported that Kaplan, a former aide to President George W. Bush, had key input on the types of information shown to Facebook’s users.
Kaplan reportedly raised concerns over an internal Facebook analysis of how its users were exposed to a range of information. The analysis found that Facebook’s right-leaning users were less likely to encounter differing viewpoints and were therefore more polarized, according to the Journal. Facebook’s so-called Common Ground initiative, which proposed in part to boost articles in the News Feed that were widely liked and commented on by users across the political spectrum, was meant to bring a greater range of perspectives to users. Kaplan argued that the initiative would disproportionately quiet conservative voices, the Journal reported.
In a statement, a Facebook spokesperson said, “Understanding a wide variety of perspectives is an important part of our product development process and is essential for building products and services that serve everyone. The public policy team, led by Joel Kaplan, is tasked with understanding the perspectives of groups, regulators, governments, NGOs and other stakeholders from around the world and using that knowledge to inform product discussions and decisions. The team plays an essential role in ensuring that we adopt objective standards and that our policies are applied fairly and consistently.”
Facebook has tried other ideas to stop the spread of misinformation on its site and help users distinguish quality information from less well-sourced material. It has enlisted a number of third-party fact-checkers to help it rank information in the News Feed and give context about the sources of various articles. That program has faced criticism of its own, including that its fact-checkers have the capacity to review only a limited amount of content.