Facebook papers reveal deep conflict between profit and people

A consortium of 17 US news organisations released a series of reports on Monday detailing how Facebook and its platforms exacerbate the spread of misinformation and harmful content, and have failed to police abusive content globally.

Thousands of pages of internal documents provided to the US Congress by a former employee depict an internally conflicted company where data on the harms it causes are abundant, but solutions, much less the will to act on them, are halting at best.

In blistering testimony before the US Congress this month, Facebook whistle-blower Frances Haugen accused the social media company of sowing division and fuelling ethnic violence in pursuit of enormous profits.


The documents released on Monday pointed to numerous instances in which researchers and rank-and-file workers uncovered deep-seated problems that the company then overlooked or ignored.

Ms Haugen told members of the UK Parliament on Monday that Facebook has a “huge weak spot” in reporting issues up the chain of command. She also said Facebook is “making hate worse”.

Final responsibility rests with chief executive Mark Zuckerberg, who holds what one former employee described as dictatorial power over a corporation that collects data on and provides free services to about three billion people around the world.

But Mr Zuckerberg has failed to address declining engagement with Facebook in the US and in Western Europe — particularly among teenagers and young people, who now see it as an “outdated network”.

In its quest to expand its reach and power, Facebook has pushed for higher user growth outside these regions.

But as it expanded into less familiar parts of the world, the company failed to address or anticipate the unintended consequences of signing up millions of new users without also providing staff and systems to identify and limit the spread of hate speech, misinformation and calls to violence.

In Afghanistan and Myanmar, for instance, extremist language has flourished due to a systemic lack of language support for content moderation. In Myanmar, it has been linked to atrocities committed against the country’s minority Rohingya Muslim population.

But Facebook appears unable to acknowledge, much less prevent, the real-world collateral damage accompanying its unfettered growth.

Those harms include opaque algorithms that radicalise users through pervasive misinformation, fuelling extremism and enabling human trafficking, teen suicide and more.

Internal efforts to mitigate such problems have often been pushed aside or abandoned when solutions conflict with growth — and, by extension, profit.

Ms Haugen told the UK Parliament that Facebook views safety as a “cost centre” and not an investment for growth.

Facebook, in a prepared statement released on Friday, denied claims it puts profit over user safety.

“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” the company said.

“The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”

Ms Haugen — who told the US Senate this month that Facebook’s products “harm children, stoke division and weaken our democracy” — said the company should declare “moral bankruptcy” if it is to move forward from all this.

At this stage, that seems unlikely.

Source: https://www.thenationalnews.com
