"Carol's Journey' What Facebook could have known about the way it influenced users

Internal documents indicate Facebook has known for a long time that its algorithms and recommendation systems can push users to extremes.

source: https://ibb.co/wRkN4dz

In the summer of 2019, a brand-new Facebook user named Carol Smith signed up for the platform. She described herself as a politically conservative mother from Wilmington, North Carolina. Smith's profile indicated an interest in politics, parenting and Christianity, and she followed a few of her favorite brands, including Fox News and then-President Donald Trump.

Although Smith had never expressed interest in conspiracy theories, within just two days Facebook was recommending that she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.

Smith did not follow the suggested QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within a week, Smith's feed was full of groups and pages that violated Facebook's own rules, including those against hate speech and disinformation.

Smith wasn't real. A researcher working for Facebook created the account, along with those of other fictitious "test users," in 2019 and 2020 as part of an experiment investigating the platform's role in polarizing and misinforming users through its recommendation systems.

Smith's experience on Facebook, the researcher wrote, was "a barrage of extreme, conspiratorial, and graphic content."

The research repeatedly found that Facebook pushed some users into "rabbit holes," increasingly narrow echo chambers where violent conspiracy theories thrived. People radicalized through these rabbit holes make up only a small portion of total users, but at Facebook's scale, that can translate to millions of people.

The findings, written up in a report titled "Carol's Journey to QAnon," are among the thousands of documents included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by lawyers for Frances Haugen, who worked as a Facebook product manager until May. Haugen is now asserting whistleblower status and has filed several specific complaints alleging that Facebook puts profit before public safety. Last month, she testified about her allegations before a Senate subcommittee.

Versions of the disclosures, with the names of researchers omitted, including the author of "Carol's Journey to QAnon," were shared digitally with and reviewed by a consortium of news organizations, including NBC News. The Wall Street Journal published a series of reports based on many of the documents last month.

"While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform," an official from Facebook Facebook spokesperson stated in response to email inquiries.

Facebook CEO Mark Zuckerberg has broadly denied Haugen's assertions, defending the company's "industry-leading research program" and its commitment to "identify important issues and work on them." The documents released by Haugen partly support those claims, but they also highlight the frustrations of some of the employees involved in that research.

[Video: "Facebook whistleblower: The company put profit before people," Oct. 6, 2021]

Haugen's disclosures include research, reports and internal posts suggesting Facebook has long known that its algorithms and recommendation systems push some users to extremes. And while managers and executives ignored those internal warnings, conspiracy theory movements, anti-vaccine groups and disinformation agents took advantage of that permissiveness, threatening public health, personal safety and democracy at large.

"These documents effectively confirm what outside researchers were saying for years prior, which was often dismissed by Facebook," said Renee DiResta, technical research manager at the Stanford Internet Observatory and one early warnings of the dangers of Facebook's recommendation algorithm.

Facebook's own research shows how easily a relatively small segment of users was able to hijack the platform. And for DiResta, it settles any remaining questions about Facebook's role in the growth of conspiracy networks.

"Facebook literally helped facilitate a cult," she claimed.

"A pattern at Facebook'

For years, company researchers had been running experiments like the Carol Smith test to gauge the platform's effect on users, according to documents reviewed by NBC News.

The internal research repeatedly found that the recommendation tools could push users into extremist groups, findings that helped inform policy changes and adjustments to recommendations and news feed rankings. Those rankings are a sprawling, ever-evolving system, widely referred to as "the algorithm," that pushes content to users. But the research at the time stopped well short of prompting any change to the groups and pages themselves.

That reluctance was indicative of "a pattern at Facebook," Haugen told reporters this month. "They want the shortest path between their current policies and any action."

"There is great hesitancy to proactively solve problems," Haugen said.

A Facebook spokesperson disputed the suggestion that the research did not prompt the company to act, pointing to changes to groups announced in March.

As QAnon followers committed real-world violence in 2019 and 2020, Facebook groups and pages related to the conspiracy theory exploded, according to internal documents. The documents also detail how teams at Facebook took specific steps to understand and address those issues, some of which employees viewed as too little, too late.

By the summer of 2020, Facebook was hosting thousands of private QAnon groups and pages with millions of members and followers, according to an unpublished internal investigation.

A year after the FBI designated QAnon a potential domestic terrorist threat, following a series of planned kidnappings, standoffs, harassment campaigns and shootings, Facebook labeled QAnon a "Violence Inciting Conspiracy Network" and removed it from the platform, along with militias and other militant social movements. A small team working across several of Facebook's departments found that its platforms had hosted hundreds of ads on Facebook and Instagram, worth millions of dollars and reaching millions of users, "praising, supporting, or representing" the conspiracy theory.

The Facebook spokesperson said in an email that the company has "taken a more aggressive approach in how we reduce content that is likely to violate our policies, in addition to not recommending Groups, Pages or people that regularly post content that is likely to violate our policies."

For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company's private message board.

"We've known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups," one integrity researcher who's name was removed in a blog post in which she announced her departure from the company. "This small group has risen into national prominence, including the QAnon congressional candidate and hashtags of QAnon, as well as groups that are being discussed within the mainstream. We were prepared to take action just after the situation have deteriorated to a desperate situation."

"We should be concerned"

Although Facebook's ban initially appeared effective, a problem remained: the removal of pages and groups didn't wipe out QAnon's most fervent followers, who continued to organize on the platform.

"There was enough evidence to raise red flags in the expert community that Facebook and other platforms failed to address QAnon's violent extremist dimension," said Marc-Andre Argentino, a research Fellow at King's College London's International Centre for the Study of Radicalisation who has conducted extensive research on QAnon.

Believers simply rebranded as anti-child-trafficking groups or migrated to other communities, including those around the anti-vaccine movement.

It was a natural fit. Researchers inside Facebook studying the platform's niche communities found that violent conspiratorial beliefs were connected to Covid-19 vaccine hesitancy. One study found that QAnon members were highly concentrated in anti-vaccine communities. Anti-vaccine influencers likewise seized on the pandemic, using Facebook features such as groups and livestreaming to grow their followings.


QAnon believers also jumped to groups promoting the false claim that the 2020 election was rigged. These groups trafficked in a hodgepodge of baseless conspiracy theories alleging that voters, Democrats and election officials were somehow cheating Trump out of a second term. This new coalition, largely organized on Facebook, eventually stormed the U.S. Capitol on Jan. 6, according to a report included in the document trove and first reported by BuzzFeed News in April.

These became some of the most popular groups on Facebook, according to the report, but Facebook couldn't control their "meteoric growth," the researchers said, "because we were looking at each entity individually, rather than as a cohesive movement." A Facebook spokesperson told BuzzFeed News that the company took many steps to limit election misinformation but that it was unable to catch everything.

Facebook's enforcement of its policies was "piecemeal," the team of researchers wrote, adding, "we're building tools and protocols and having policy discussions to help us do this better next time."