In the lead-up to the 2020 presidential election, Meta set out to conduct a series of ambitious studies on the effects its platforms—Facebook and Instagram—have on the political beliefs of US-based users. Independent researchers from several universities were given unprecedented access to Meta’s data, and the power to change the feeds of tens of thousands of people in order to observe their behavior.
The researchers weren’t paid by Meta, but the company seemed pleased with the results, which were released today in four papers in Nature and Science. Nick Clegg, Meta’s president of global affairs, said in a statement that “the experimental findings add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization” or have “meaningful effects on” political views and behavior.
It’s a sweeping conclusion. But the studies are actually much narrower. Even though researchers were given more insight into Meta’s platforms than ever before—for many years, Meta considered such data too sensitive to make public—the studies released today leave open as many questions as they answer.
The studies focused on a narrow window: the three months leading up to the 2020 presidential election. And while Andrew Guess, assistant professor of politics and public affairs at Princeton and one of the researchers whose findings appear in Science, noted that this is longer than most researchers get, it’s not long enough to be entirely representative of a user’s experience on the platform.
“We don’t know what would have happened had we been able to do these studies over a period of a year or two years,” Guess said at a press briefing earlier this week. More importantly, he said, there is no accounting for the fact that many users have had Facebook and Instagram accounts for upwards of a decade now. “This finding cannot tell us what the world would have been like if we hadn’t had social media around for the last 15 or 20 years.”
There’s also the issue of the specific time frame the researchers were able to study—the run-up to an election in an atmosphere of intense political polarization.
“I think there are unanswered questions about whether these effects would hold outside of the election environment, whether they would hold in an election where Donald Trump wasn’t one of the candidates,” says Michael Wagner, a professor of journalism and communication at the University of Wisconsin-Madison, who helped oversee Meta’s 2020 election project.
Meta’s Clegg also said that the research challenges “the now commonplace assertion that the ability to reshare content on social media drives polarization.”
The researchers weren’t quite so unequivocal. One of the studies published in Science found that resharing elevates “content from untrustworthy sources.” The same study, an analysis of about 208 million users, showed that most of the misinformation flagged by the platform’s third-party fact-checkers was concentrated among, and consumed almost exclusively by, conservative users, a pattern with no equivalent on the other side of the political aisle.