That is what the social platform announced today in a report describing how it enforces its policies. This is the second time this year that Facebook has disclosed these figures; the first report appeared in May.
The report contains a range of statistics, from removed posts that incite violence to the number of fake accounts that have been deleted. For example, between March and the end of October, Facebook took action against 65.6 million posts containing nude images and against 9 million posts involving sexual exploitation or nude images of children.
In addition, Facebook acted against 2 million bullying posts, removed 2.1 billion spam messages, and took down 12 million posts containing terrorist content.
The timing is no coincidence, because Facebook is once again under fire. This time because, according to reports in The New York Times, the company used a PR agency to attack critics of the social network.
The New York Times article describes how people at the top of the company tried to downplay the bad news. It also reveals how the PR agency tried to encourage journalists to investigate the conduct of competitors such as Google and Apple, in order to divert attention from Facebook's problems.
That is said to have happened when the company made headlines two years ago, after the elections, over fake news and suspected Russian interference via Facebook.
After the statistics were published, CEO Mark Zuckerberg put out a blog post in which he describes what he considers the most important challenges facing Facebook and the direction the company intends to take.
In it he calls sensational and provocative posts the biggest challenge for social networks, including Facebook. According to him, they can cause the quality of Facebook's services to suffer.
He also wants the company to focus on clickbait and fake news. Facebook users reportedly complain that both make the social network less enjoyable.
To limit their negative influence, artificial intelligence will play an increasingly important role, Zuckerberg says in his explanation of the fight against misleading content. It is not meant to replace the people who currently review content, but to make their work easier. For example, artificial intelligence should screen posts at scale for straightforward violations of the terms of use.
More complex cases will still be presented to content moderators, who make the final judgment. That is necessary, Zuckerberg says, because even the most advanced systems cannot handle this on their own.
The Facebook founder also announced an independent appeals body for users who disagree with a moderator's decision, for example when they feel a post has wrongly been taken offline.