Is Facebook Prioritizing Profits over User Safety?

Fernanda Matias, Editor of Passion

Social media platforms are notorious for their addictive user interfaces and mind-numbing scrolling spirals. Toxic posts and advertisements plague these platforms, catalyzing mental health issues in users, especially teenage girls. Amid the prevalence of these issues, Facebook has been accused of prioritizing profits over user safety.

Facebook whistleblower Frances Haugen is a former Facebook employee who anonymously filed complaints with federal law enforcement. The complaints state that Facebook’s own research shows the platform amplifies hate, misinformation, and political turmoil, but that the company conceals the data demonstrating this. Among these complaints, Haugen alleged that Instagram harms teenage girls.

“The thing I saw with Facebook over and over again,” Haugen says, “was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money.”

Haugen, 37, is a data scientist from Iowa with a degree in computer engineering and a Harvard master’s degree in business. Over her 15-year career, Haugen has worked for companies such as Google and Pinterest, and she says Facebook’s efforts to improve user safety are significantly weaker than anything she has seen elsewhere.

To prove her findings and allow people on the “outside” to understand the extent of Facebook’s neglect of user safety, Haugen copied tens of thousands of pages of internal Facebook research. The documents show that Facebook misleads the public about its progress in eliminating the hate, violence, and misinformation that plague the platform.

One internal study conducted this year states, “…we [Facebook] estimate that we may action as little as 3-5% of hate and ~0.6% of V&I [Violence and Incitement] on Facebook despite being the best in the world at it.” Even with this data in hand and a clear understanding of the repercussions of neglecting user safety, Facebook has failed to rectify its platform’s problems.

Another document states, “We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world.” The hate that plagues Facebook is not only degrading society but also perpetuating ethnic violence. In 2018, United Nations investigators found that Myanmar’s military had used Facebook to incite genocide against the Rohingya, and Facebook has also been accused of fueling violence in Ethiopia.

Misinformation is also prevalent on Facebook, and conspiracy theories have surged in the wake of the Covid-19 pandemic. Facebook recruited Haugen in 2019 to work on its Civic Integrity team, which dealt with misinformation threatening elections. After the 2020 election passed without riots, Facebook dissolved Civic Integrity. A few months later, on January 6, 2021, the insurrection at the U.S. Capitol occurred, and there was evidence that the rioters had organized on Facebook.

Many of Facebook’s safety problems stem from a 2018 change to the platform’s algorithm, which determines what users see in their news feeds. Essentially, Facebook prioritizes content that generates engagement, filling users’ feeds with the kinds of posts they have interacted with most in the past. Political parties have complained that they “…feel strongly that the change in the algorithm has forced them to skew negative in their communications on Facebook… leading them into more extreme policy positions.”

The reason is simple: misinformation and “angry content” captivate people, holding their attention and deepening their addiction to social media. Although presenting users with safer content would be possible, it would be less enticing, leading to less engagement and, ultimately, lower profits.

Moreover, the repercussions of Facebook’s neglect of user safety extend beyond the Facebook app to Instagram as well.

One internal Facebook study found that 13.5% of teen girls say Instagram exacerbates thoughts of suicide, and 17% say Instagram worsens their eating disorders. The data also suggests that consuming this content makes young women more depressed, which in turn drives them to spend more time on Instagram. This self-reinforcing feedback loop perpetuates cycles of depression and disordered eating.

Ultimately, the effects of Facebook’s toxic content are widespread, and the company’s neglect of user safety has only deepened the problem. Harmful and dangerous repercussions will continue as long as users are fed misinformation unchecked.