
Facebook Removes Millions of Posts and Billions of Fake Accounts


FILE - An iPhone displays the apps for Facebook and Messenger in New Orleans, Aug. 11, 2019.

Facebook released its Community Standards Enforcement Report on Wednesday, detailing its efforts to police content on its main app and Instagram, from terrorist propaganda to child exploitation.

The company said it removed more than 3.2 billion fake accounts between April and September, compared with more than 1.5 billion during the same period last year. It also said it removed 11.4 million pieces of hate speech, compared with 5.4 million in the same six-month period in 2018.

Instagram

For the first time, Facebook included Instagram in the report. The company said it made progress in detecting child nudity and sexual exploitation, removing more than 1.2 million pieces of content between April and September.

Instagram spokesperson Stephanie Otway told VOA that Instagram previously had different ways of measuring enforcement of its community standards policies.

"We brought our methodology in line with Facebook and that alignment meant we were able to share metrics for the first time today," Otway said.

Facebook said it had proactively deleted up to 98% of posts that it identified as terrorist propaganda in the past two quarters. This included content from major organizations such as Islamic State and al-Qaida, as well as from smaller, regional terrorist groups.

Messaging services

Law enforcement officials are concerned that Facebook's plans to provide greater privacy to users by encrypting the company's messaging services (including Facebook Messenger and WhatsApp) will obstruct efforts to fight child abuse.

Last month, FBI Director Christopher Wray said the changes would turn the platforms into a "dream come true for predators and child pornographers."

Facebook said its official policy on child pornography is to remove the content "regardless of the context or the person's motivation for sharing it."

Facebook said most posts that violated its policies were deleted before many people could view them. The company estimated that for every 10,000 views on Facebook and Instagram, only four were of content that violated its policies.

Proactive detection of violating content was lower across all categories on Instagram than on Facebook's main app.

Billions of people around the world use Facebook's apps at least once a day.
