Facebook says it has removed 583 million fake accounts

Last month, Facebook published its content moderation guidelines for the first time. Now the company has released its first report on the actions it takes against content that violates those rules. The report is expected to be published quarterly.

In the report, Facebook discusses the content and accounts it has taken action against across six categories: graphic violence, adult nudity and sexual activity, terrorist propaganda, hate speech, spam, and fake accounts.

The report explains how often such content is viewed by users, how much of it has been removed, and how much was taken down before any user reported it.

This quarter, spam and fake accounts were the biggest issues facing Facebook: the company removed 837 million pieces of spam and 583 million fake accounts.

It also took action against 21 million pieces of content involving nudity or sexual activity, 3.5 million pieces of violent content, 2.5 million pieces of hate speech, and 1.9 million pieces of terrorist content.

In many cases, Facebook's automated systems can find and flag problematic content before users report it.

These systems flagged nearly 100 percent of the spam and terrorist propaganda, almost 99 percent of the fake accounts, and about 96 percent of the posts containing nudity or sexual activity before users reported them.

For graphic violence, the detection rate reached 86 percent. Hate speech, however, remains difficult for Facebook to identify automatically: its systems flagged only about 38 percent of the hate-speech posts the company ultimately acted on.

"As Mark Zuckerberg said at F8, we still have a lot of work to do to stop abusive content," said Guy Rosen, VP of Product Management, in a blog post.

"One reason is that even though technology such as artificial intelligence is promising, it still needs further development before it can reliably recognize violating content, because context plays an important role."
