I have often wondered about the mechanism for cultism in our era, whether for Trump, QAnon or anything else. I know that Fox News and talk radio are huge purveyors, but they mostly reach retired people and others who are able to listen to the radio all day. When you look at the rallies, it's not just older people. They are mostly middle-aged, with a few young ones sprinkled in among the elderly. These people are getting their disinformation from other sources, and it's pretty clear that it starts with Facebook.
So, following up on Tm's piece below, I'll just add my two cents:
In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.
Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.
Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.
Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.
That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”
The body of research consistently found Facebook pushed some users into “rabbit holes,” increasingly narrow echo chambers where violent conspiracy theories thrived. People radicalized through these rabbit holes make up a small slice of total users, but at Facebook’s scale, that can mean millions of individuals.
The findings, communicated in a report titled “Carol’s Journey to QAnon,” were among thousands of pages of documents included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by legal counsel for Frances Haugen, who worked as a Facebook product manager until May. Haugen is now asserting whistleblower status and has filed several specific complaints that Facebook puts profit over public safety. Earlier this month, she testified about her claims before a Senate subcommittee.
Versions of the disclosures — which redacted the names of researchers, including the author of “Carol’s Journey to QAnon” — were shared digitally and reviewed by a consortium of news organizations, including NBC News. The Wall Street Journal published a series of reports based on many of the documents last month.
“While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform,” a Facebook spokesperson said in a response to emailed questions.
Facebook CEO Mark Zuckerberg has broadly denied Haugen’s claims, defending his company’s “industry-leading research program” and its commitment to “identify important issues and work on them.” The documents released by Haugen partly support those claims, but they also highlight the frustrations of some of the employees engaged in that research.
Among Haugen’s disclosures are research, reports and internal posts that suggest Facebook has long known its algorithms and recommendation systems push some users to extremes. And while some managers and executives ignored the internal warnings, anti-vaccine groups, conspiracy theory movements and disinformation agents took advantage of their permissiveness, threatening public health, personal safety and democracy at large.
“These documents effectively confirm what outside researchers were saying for years prior, which was often dismissed by Facebook,” said Renée DiResta, technical research manager at the Stanford Internet Observatory and one of the earliest researchers to warn of the risks of Facebook’s recommendation algorithms.
Facebook’s own research shows how easily a relatively small group of users has been able to hijack the platform, and for DiResta, it settles any remaining question about Facebook’s role in the growth of conspiracy networks.
“Facebook literally helped facilitate a cult,” she said.
Forget Q, although that's a fascinating case. Look at the general right-wing disinformation pipeline on that platform and it's obvious that a whole lot of people were drawn into the Trump cult by those same algorithms.
Sure, wingnuttia is a pre-existing condition for the most part. It’s a tribe people are born into or marry into. But Facebook fed them Trumpism by the truckload and they just ate it up.
It's hard to know where the cycle begins and ends with right-wing media: does it run from talk radio to Fox to Facebook, from Facebook to Fox to talk radio, or some other combination? I don't think it matters. But Facebook exposes it to people who are not necessarily all that interested in politics, as you have to be to tune in to Fox and hate radio. And that's how new people get recruited into the cult. They don't choose it. It gets spoon-fed to them, and they are indoctrinated without realizing that what they're reading is lies and propaganda.
Update: The stuff about Facebook and January 6 is going to hit the fan this week. It's bad:
Measures of online mayhem surged alarmingly on Facebook, with user reports of “false news” hitting nearly 40,000 per hour, an internal report that day showed. On Facebook-owned Instagram, the account reported most often for inciting violence was @realdonaldtrump — the president’s official account, the report showed.
Facebook has never publicly disclosed what it knows about how its platforms, including Instagram and WhatsApp, helped fuel that day’s mayhem. The company rejected its own Oversight Board’s recommendation that it study how its policies contributed to the violence and has yet to fully comply with requests for data from the congressional commission investigating the events.
But thousands of pages of internal company documents disclosed to the Securities and Exchange Commission by the whistleblower Frances Haugen offer important new evidence of Facebook's role in the events. This story is based on those documents, as well as on others independently obtained by The Washington Post, and on interviews with current and former Facebook employees. The documents include outraged posts on Workplace, an internal message system.
“This is not a new problem,” one unnamed employee fumed on Workplace on Jan. 6. “We have been watching this behavior from politicians like Trump, and the — at best — wishy washy actions of company leadership, for years now. We have been reading the [farewell] posts from trusted, experienced and loved colleagues who write that they simply cannot conscience working for a company that does not do more to mitigate the negative effects on its platform.”