Since opening to the general public in 2006, Facebook has dominated the social media sphere, but this success has not come without controversy and unintended consequences.
It has long been known that Facebook and the other social media platforms owned by its parent company, Meta, hold strong influence over politics and society. But it was not until whistleblower Frances Haugen leaked thousands of internal company documents that the public saw the magnitude of Facebook’s harms.
News of Facebook’s internal documents first began circulating in September, when The Wall Street Journal published a series of articles based on the leaks. The reporting came after an anonymous former Facebook employee filed complaints against the company, stating that “Facebook’s own research shows that it amplifies hate, misinformation, and political unrest—but the company hides what it knows.” Some of the biggest headlines from the Journal’s so-called “Facebook Files” claimed that Facebook knows its platforms are toxic for teenage users, that its services spread religious hatred in India, and that its algorithm encourages hate and the spread of extremist content.
The origin of these documents remained unknown until early October, when a 60 Minutes interview revealed Frances Haugen, a former Facebook product manager, as the whistleblower. In the interview, Haugen claimed that Facebook repeatedly faced conflicts of interest between what was good for the public and what was good for the company, and that it “over and over again chose to optimize for its own interests.”
According to the leaked documents, Facebook has “evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world.” This revelation came after Mark Zuckerberg had downplayed the severity of Facebook’s political impact in numerous interviews and public hearings.
According to Haugen, these problems took root in 2018, when Facebook changed its algorithm to prioritize content that evoked the strongest engagement or reaction from users. The algorithm began showing users content similar to posts they had previously engaged with or reacted to; the stronger the reaction, the more similar content they saw. Facebook’s own research, however, showed that it is “easier to inspire people to anger than it is to other emotions.” By continuously feeding users emotionally charged content, the platform keeps them engaged longer. Facebook has nonetheless refused to fix its algorithm, because “if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, [and Facebook] will make less money,” according to Haugen.
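The ranking behavior described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not Facebook's actual code: the reaction weights, topic labels, and function names are all invented for illustration. The point is only to show how weighting strong reactions more heavily makes a feed drift toward the content a user reacts to most intensely.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# Stronger emotional reactions (e.g. "angry") carry more weight
# than mild ones (e.g. "like"), so anger-inducing topics rise.
REACTION_WEIGHTS = {"like": 1.0, "love": 2.0, "angry": 5.0}

def score_post(post_topic, user_reactions):
    """Score a post by summing the weights of the user's past
    reactions to the same topic; higher scores surface sooner."""
    return sum(
        REACTION_WEIGHTS[reaction]
        for topic, reaction in user_reactions
        if topic == post_topic
    )

def rank_feed(candidate_topics, user_reactions):
    """Order candidate posts by descending engagement score."""
    return sorted(
        candidate_topics,
        key=lambda topic: score_post(topic, user_reactions),
        reverse=True,
    )

# A user who reacted angrily to political posts sees them first,
# even though they reacted to sports content just as often.
history = [("politics", "angry"), ("sports", "like"), ("politics", "angry")]
feed = rank_feed(["sports", "politics", "pets"], history)
```

In this toy model, two angry reactions outweigh a single mild one, so "politics" ranks above "sports" in the resulting feed, mirroring the feedback loop Haugen described.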
Soon after the interview, Haugen testified before Congress, using the documents as evidence to argue that the federal government should impose stricter regulations on Facebook and its connected apps. Her central argument was that Facebook’s choices are “disastrous for our children, for our public safety, for our privacy, and for our democracy.” As of this writing, no new regulations have been put in place.