Illustration by James Bareham / The Verge
Facebook on Monday released a new report detailing how it uses a combination of artificial intelligence and human fact-checkers and moderators to enforce its community standards. The report, called the Community Standards Enforcement Report, usually covers data and findings from the prior three months; this edition focuses heavily on AI.
That’s because Facebook is relying more on the technology to help moderate its platform during the COVID-19 pandemic, which is preventing the company from using its usual third-party moderation firms because those firms’ employees are not allowed to access sensitive Facebook data from home computers. Given the state of the world, Facebook’s report also contains new information about…