Facebook whistleblower warns there will always be more violent events triggered by social media.

Frances Haugen was speaking in London before the British Parliament as part of an investigation into an online safety bill.

Image: British Parliament 2021 / Annabel Moeller

According to Facebook whistleblower Frances Haugen, social media platforms' use of opaque algorithms that spread harmful content should be reined in.

According to Haugen, events such as the Capitol riot and other social media-fueled conflicts are harbingers of what is to come.

“There is no doubt that events we are seeing around the world, such as Myanmar and Ethiopia, are the opening chapters,” Haugen said. “Engagement-based ranking does two things. One is to prioritize and amplify divisive, polarizing, extreme content. The other is to concentrate it.”

Haugen was speaking in London before the British Parliament as part of an inquiry into an online safety bill introduced by the government earlier this year. The bill proposes to force companies to protect users from harmful content, from revenge pornography and disinformation to hate speech and racist abuse.

Parliamentarians were taking evidence from Haugen, who recently came to prominence as the whistleblower behind a bombshell leak of internal documents obtained from Facebook while she worked as lead product manager on the company’s civic misinformation team, including internal files, draft presentations, research, and staff communications.

The leak, now known as the Facebook Files, allowed The Wall Street Journal to investigate a variety of topics, including the use of different content moderation policies for high-profile users, the dissemination of false information, and the impact of Instagram on the mental health of teenagers. The disclosures also triggered a Senate investigation into Facebook’s operations.

On this point, Haugen said governments needed to introduce and enforce stricter regulation. “It’s time to act, and that’s why I came forward now,” she said, arguing that Facebook’s failures make action necessary.

Haugen argued that the social media giant “definitely” makes hate worse, in large part because it uses engagement-based ranking algorithms.

Because extreme content tends to be more viral, this ranking can create an echo chamber effect: users are pushed down a rabbit hole, consuming increasingly polarized and divisive content.

For example, someone looking for healthy recipes may end up being shown content promoting anorexia, and anyone reading right-wing content could be pushed towards far-right posts. The issue is not limited to Facebook: similar claims were previously made by former Google software engineer Guillaume Chaslot about YouTube’s recommendation algorithm.

“The danger of Facebook isn’t about individuals saying bad things; it’s about systems of amplification that disproportionately hand the biggest megaphone in the room to extreme, polarizing content,” Haugen said.

According to the whistleblower, the problem extends to paid advertising. Haugen said the current system effectively subsidizes hate on social media platforms: because divisive ads are more likely to generate engagement, it is much cheaper to run an “angry” advertising campaign.

Facebook disputes this. The week before Haugen appeared before the British Parliament, the social media giant published a report claiming that the prevalence of hate speech on the platform had fallen by almost 50% over the last three quarters and now accounts for only 0.05% of all content viewed.

A Facebook spokesperson said: “Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites: advertisers don’t want their ads next to it. That’s why we’ve invested $13 billion and hired 40,000 people to do one job, keeping people safe on our apps.”

However, Haugen argues that Facebook will not self-regulate to protect its users, because lower engagement is at odds with the company’s business model. “Facebook is reluctant to sacrifice even the smallest sliver of profit for safety,” she said.

Government action required

Government action is therefore required to prevent such events from escalating into more violent unrest. Haugen praised Britain’s efforts in drafting the online safety bill, calling the proposed legislation on social platform regulation “world-leading,” and emphasized the need to mandate that companies such as Facebook protect their users.

Asked whether the bill would keep Facebook CEO Mark Zuckerberg awake at night, Haugen replied: “I can’t imagine Mark isn’t paying attention to what you’re doing.”

Haugen recommended that the online safety bill include mandatory risk assessments for engagement-based ranking systems, overseen by an external regulator rather than boards within Facebook itself, and that concerns about paid advertising be addressed in the bill.

She also called for Facebook to make its data available to outside researchers, allowing them to investigate potential issues independently, and recommended that moderation be made mandatory for Facebook groups exceeding a certain number of users.

Finally, Haugen addressed the end-to-end encryption issue, which has been controversial since the bill was published. Free speech groups have argued that the bill would, in effect, abolish end-to-end encryption, allowing social media platforms to scan private messages for harmful content at the expense of user privacy.

“I support access to end-to-end encryption, and I use open-source end-to-end encryption every day,” Haugen said. “My social support network currently uses an open-source end-to-end encryption service.”

Facebook’s end-to-end encryption plans, however, are problematic, she argued, because the product is not open source, making it impossible to verify how well users are actually protected. Users could share sensitive information online believing their data is encrypted when, she claims, it could in fact be read by third parties.

Facebook, for its part, welcomed the UK’s attempts to regulate social media platforms. A Facebook spokesperson said: “While we have rules against harmful content and publish regular transparency reports, we agree the industry as a whole needs regulation so that businesses like ours aren’t making these decisions on our own. The UK is one of the countries leading the way, and we’re pleased the online safety bill is moving forward.”
