Meta told to fix moderation system for high-profile users such as Donald Trump

Meta has been told that its preferential treatment of high-profile users, such as former US President Donald Trump, has left dangerous content online, serving commercial interests at the expense of its human rights obligations.

A damning report published Tuesday by the company’s oversight board — a “Supreme Court”-style body set up by the parent company of Facebook, Instagram and WhatsApp to rule on sensitive moderation issues — urged the social media giant to make “significant” changes to its internal system for reviewing content from politicians, celebrities and its business partners.

The board, which began evaluating cases last year, works with the tech giant’s head of policy, former UK deputy prime minister Sir Nick Clegg, and makes independent judgments on salient moderation issues as well as recommendations on specific policies.

The board was asked to look into the system after the Wall Street Journal and whistleblower Frances Haugen exposed its existence last year, sparking concerns that Meta was giving preferential treatment to elite figures.

Clegg also has until Jan. 7 to decide whether to allow Trump to return to the platform, following a separate board recommendation.

After a lengthy investigation spanning more than a year, the board demanded that Meta scrutinize more closely those on the so-called “cross-check” list and be more transparent about its review procedures.

The report is one of the most in-depth investigations yet into Meta’s moderation practices, as the independent body — which includes some 20 journalists, academics and politicians — grapples with concerns that it has too little power to hold the company to account.


It piles pressure on CEO Mark Zuckerberg — who last month announced plans to cut 11,000 employees amid declining revenue and growth — to ensure that content on Meta’s platforms is fairly moderated.

Meta has already started revamping the system. In a blog post on Tuesday, Clegg said it was originally developed to double-check cases where there might be a higher risk of a mistake, or where the potential impact of a mistake might be particularly severe. He added that the company has since developed a more standardized system, with more controls and annual reviews.

It remains unclear how many people are on the secret list. The Wall Street Journal, which first revealed the list, estimated that as of 2020 there were 5.8 million users on it. Meta had previously said there were 666,000 as of October 2021.

The system meant that content posted by well-known figures such as Trump and Elizabeth Warren would remain on the platforms until reviewed by human moderators, even when the same posts would have been automatically removed had they come from a regular user.

The report found that this human review took an average of five days, with content left on the platform in the meantime — and, in one case, for up to seven months.

The board said Meta’s “understanding of the practical implications of the program was lacking,” adding that the company had failed to assess whether the system was working as intended.

The board also accused the company of providing “inadequate” responses to the investigation, with answers sometimes taking months to arrive.


The board cited a Wall Street Journal report detailing how Brazilian soccer player Neymar posted non-consensual intimate images of another person to his Facebook and Instagram accounts, which were viewed more than 50 million times before being removed. According to Meta, this was due to “a delay in reviewing the content due to a backlog at the time.”

Thomas Hughes, director of the Oversight Board, said the Neymar incident was one example of how business partnerships affected moderation processes.

“It opens up concerns . . . about the relationships between people in the company and whether that will influence decision-making,” he said.

“There may have been a confusion of different interests in this comprehensive review process,” he added.

The report follows previous public tensions between the board and Meta, after the board in September 2021 accused the social media company of withholding information about the system. Many see the board as an attempt to create distance between company executives and difficult decisions about freedom of speech.

Meta now has 90 days to respond to the recommendations.
