A new study found that Facebook’s Pages and Groups shape its ideological echo chambers | TechCrunch


New research published Thursday offers an unprecedented dive into political behavior across Facebook and Instagram — two major online hubs where people express and engage with their political beliefs. The research, conducted by an interdisciplinary set of academics working in tandem with internal teams at Meta, encompasses four papers published in Science and Nature examining behavior on both platforms around the time of the 2020 U.S. election.

The papers — only the first wave of many to be published in the coming months — grew out of what’s known as the 2020 Facebook and Instagram Election Study (FIES), an unusual collaboration between Meta and the scientific research community. On the academic side, the project was spearheaded by University of Texas Professor Talia Jomini Stroud of the school’s Center for Media Engagement, and NYU’s Professor Joshua A. Tucker, who serves as co-director of its Center for Social Media and Politics.

The findings are myriad and complex.

In one study on Facebook’s ideological echo chambers, researchers sought insight about the extent to which the platform’s users were exposed only to content that they were politically aligned with. “Our analyses highlight that Facebook, as a social and informational setting, is substantially segregated ideologically—far more than previous research on internet news consumption based on browsing behavior has found,” the researchers wrote.

At least two very interesting specific findings emerged out of the data. First, the researchers found that content posted in Facebook Groups and Pages displayed much more “ideological segregation” compared to content posted by users’ friends. “Pages and Groups contribute much more to segregation and audience polarization than users,” the researchers wrote.

That might be intuitive, but both Groups and Pages have historically played a massive role in distributing misinformation and helping like-minded users rally around dangerous shared interests, including QAnon, anti-government militias (like the Proud Boys, who relied on Facebook for recruitment) and potentially life-threatening health conspiracies. Misinformation and extremism experts have long raised concerns about the role of the two Facebook products in political polarization and sowing conspiracies.

“Our results uncover the influence that two key affordances of Facebook—Pages and Groups—have in shaping the online information environment,” the researchers wrote. “Pages and Groups benefit from the easy reuse of content from established producers of political news and provide a curation mechanism by which ideologically consistent content from a wide variety of sources can be redistributed.”

That study also found a major asymmetry between liberal and conservative political content on Facebook. A “far larger” share of conservative Facebook news content was determined to be false by Meta’s third-party fact-checking system — a result suggesting that conservative Facebook users are exposed to far more online political misinformation than their left-leaning counterparts.

“… Misinformation shared by Pages and Groups has audiences that are more homogeneous and completely concentrated on the right,” the researchers wrote.

In a different experiment conducted with Meta’s cooperation, participants on Facebook and Instagram saw their algorithmic feeds replaced with a reverse chronological feed — often the rallying cry of those fed up with social media’s endless scrolling and addictive designs. The change didn’t actually move the needle on how the users felt about politics, how politically engaged they were offline or how much political knowledge they wound up having.

In that experiment, there was one major change for users who were given the reverse chronological feed. “We found that users in the Chronological Feed group spent dramatically less time on Facebook and Instagram,” the authors wrote, a result that underlines how Meta juices engagement — and encourages addictive behavioral tendencies — by mixing content in an algorithmic jumble.

These findings are just a sample of the current results, and a fraction of what’s to come in future papers. Meta has been spinning the results across the new studies as a win — a view that flattens complex findings into what is essentially a publicity stunt. Regardless of Meta’s interpretation of the results and the admittedly odd arrangement between the researchers and the company, this data forms an essential foundation for future social media research.
