Facebook removed 2.2 billion fake accounts from its platform during the first quarter of the year — nearly double the number it took action on in the prior quarter. The company says the increase is due to an uptick in automated attacks that create many accounts at once.
The company released the latest iteration of its Community Standards Enforcement Report, which is meant to help the public understand how it handles content moderation.
It’s one of a number of efforts Facebook has made to increase transparency and improve its public image after a cascade of scandals, including Cambridge Analytica, Russian interference in the 2016 election, and its platform’s role in spreading misinformation.
Its practices have come under increased scrutiny around the globe, including from politicians, some of whom have called for Facebook’s breakup and questioned how it decides what is and isn’t allowed on its platform.
In the new report, Facebook revealed that it took down nearly as many fake Facebook accounts as there are real ones in the first three months of 2019.
It’s a huge jump. In the fourth quarter of 2018, Facebook took down 1.2 billion fake accounts, and in the quarter before that, roughly 750 million. During the first quarter of 2018, Facebook took down fewer than 600 million fake accounts.