A collection of research papers published Thursday suggests that the algorithms driving Facebook and Instagram are not solely responsible for political polarization on those platforms, contrary to previous beliefs.
Viewed together, the studies indicate that Facebook users purposely seek out content that aligns with their own views. This creates “echo chambers” in which different political groups rely on, engage with, and consume divergent sources of information and misinformation, according to the research.
For these papers, published in Science and Nature, researchers were given unprecedented access to Facebook and Instagram data surrounding the 2020 election. They modified different parts of the platforms’ algorithms to examine the impact on people’s political beliefs and polarization.
Although the algorithms have been heavily criticized for their effect on politics and have fueled calls for social media regulation, the research found that the algorithms themselves do not significantly contribute to polarization.
“We have discovered that algorithms have an incredibly influential role in shaping users’ experiences on these platforms, and there is evident ideological segregation when it comes to exposure to political news,” explained Talia Jomini Stroud, a lead researcher from the University of Texas, in an interview with The Associated Press. “We also found that popular proposals to change social media algorithms did not have an impact on political attitudes.”
Social media algorithms typically recommend content to users based on their preferences and expected interactions. This has raised concerns that algorithms perpetuate cycles of disinformation, feeding users more of the misleading and false content that reinforces their existing political beliefs.
Disputes over the regulation of Facebook’s algorithms and over the company’s compensation of content creators, including news outlets, have already prompted Meta to remove news from its feeds in Canada and to threaten similar action in California.
However, the research found that replacing the algorithm’s recommendations with a simple chronological feed did not reduce polarization. Similarly, when resharing was disabled on users’ feeds, polarization remained unchanged, although the distribution of misinformation decreased significantly.
One study reduced the amount of content users saw from sources aligned with their own political ideology, but this too had little impact on polarization or political opinion. Notably, all of the studies saw users’ overall use of the platforms decrease.
David Lazer, a professor at Northeastern University who worked on all four papers, told The Associated Press that algorithms primarily serve users content they already want to see, “facilitating their existing inclinations.”
Meta, the company that owns Facebook, praised the studies in a company memo, asserting that they prove the platforms’ algorithms are not malicious.
“Despite common claims that social media is ‘destroying democracy,’ the evidence from these and many other studies suggests something different,” Meta stated.
Critics of Meta dismissed the studies as “limited,” noting that researchers were given access only to data selected by the company.
“Meta executives are using this limited research as a means to deflect blame for increasing political polarization and violence,” stated Nora Benavidez, senior counsel to the nonprofit Free Press. “Meta-endorsed studies that examine narrow time periods should not be used as an excuse to allow the spread of falsehoods.”
Together, these studies also shed light on the behavioral tendencies of users with different political beliefs. For instance, conservatives are more likely to read and share misinformation and have access to a wider range of sources catering to their views.
The research found that about 97% of sites spreading misinformation were more popular among conservatives than liberals.
Lazer deemed Meta’s restrictions on data access reasonable, primarily for user privacy reasons, and mentioned that more findings are forthcoming.
“There is no study like this one,” he said. “There has been much rhetoric surrounding this topic, but the research has been quite limited in many ways.”