Tuesday, November 30, 2021

Facebook’s ‘tier’ system decides who gets content moderation



Facebook in 2019 began classifying countries in a “tier” system that dictated the amount of resources used to moderate content posted on the platform in those nations, according to a new report.

The prioritizing system placed the United States, Brazil and India in “tier zero” — meaning more resources were dedicated to moderating content and enforcing Facebook’s regulations in those countries, The Verge reported Monday.

Israel, Germany, Indonesia and Iran were categorized in “tier one,” so Facebook committed slightly less to enforcement of its rules and election integrity safeguards, according to the technology news outlet. Facebook reportedly formed “war rooms” for content moderation in those countries, including on elections.

Another 22 countries were also placed in “tier one” but without dedicated “war rooms.” Facebook put the rest of the globe in “tier three,” which meant the platform took action on rule-violating election-related content only if moderators flagged it.

Facebook targets the US, Brazil and India as “tier zero” for moderating the most content on its platforms, according to The Verge.
AP

The tiered system is detailed in disclosures made to the Securities and Exchange Commission; a redacted version was provided to Congress by lawyers for whistleblower Frances Haugen.

The documents show huge differences in the guardrails Facebook employs to monitor content from country to country. Artificial intelligence that detects hate speech and misinformation in the United States, for example, is not available in Ethiopia, The Verge noted.

Facebook whistleblower Frances Haugen has accused the Big Tech giant of letting online hate and extremism grow.
Getty Images

Facebook also does not have misinformation detection systems in Pakistan or in Myanmar, where the military seized power in a coup earlier this year.

Facebook Chairman and CEO Mark Zuckerberg has been criticized for meddling in the 2020 presidential election to prevent former President Donald Trump from being re-elected.
REUTERS

The revelation of Facebook’s opaque content moderation practices comes after a new whistleblower told the SEC that top Facebook staffers undermined efforts to fight misinformation and hate speech during President Donald Trump’s tenure because, the complaint alleges, they predicted those efforts would hamper the company’s growth and feared Trump and his allies.

