Facebook has commissioned outsourcing companies to hire 15,000 people worldwide to review and delete platform user content.

Editor’s note: This article is from Tencent Technology.

US social networking giant Facebook has faced public criticism over illegal and harmful content on its platform. As a measure to address this external pressure, it regularly publishes reports on platform content review. According to the latest reports from foreign media, on November 13 Facebook released its newest content review report, which states that between April and September of this year the company deleted 3.2 billion fake accounts, along with millions of user posts depicting child abuse and suicide.

Facebook Releases Transparency Report: 3.2 Billion Fake Accounts Deleted in the Past Six Months

According to foreign media reports, this is more than double the number of fake accounts deleted in the same period last year, when 1.55 billion accounts were removed.

The world’s largest social network also revealed for the first time how many posts it has deleted from Instagram, its popular photo-sharing application, which misinformation researchers have identified as an increasingly worrying source of false news. Instagram has more than one billion users worldwide.

In its fourth content review report, the company said it detects violating content less frequently on Instagram than on the Facebook app. On the Facebook platform, the company has deployed many detection tools, including a number of artificial-intelligence-based scanning and monitoring technologies.

For example, the company said that it proactively detects about 98.5% of terrorism-related content on the Facebook platform, compared with about 92.2% on Instagram.

In the third quarter, the company removed more than 11.6 million pieces of content depicting child nudity and child sexual abuse from the Facebook platform, and deleted 754,000 such posts on Instagram.

However, Facebook is now shifting its social products from the publicly shared Facebook platform model toward the private mobile chat model represented by Messenger and WhatsApp, and it is considering expanding message encryption so that outside organizations or bad actors cannot intercept chats between users.

It is reported that US law enforcement agencies are concerned that Facebook’s plans to provide greater privacy protection through expanded chat encryption will hinder efforts to combat child abuse.

Last month, FBI Director Christopher Wray said the changes would turn Facebook’s social tools into a dream platform for sexual predators and child pornographers.

Facebook also included data on actions taken against self-harm content for the first time in this transparency report. The company said it removed about 2.5 million user posts depicting or encouraging suicide or self-harm in the third quarter.

The company said in an official blog post that about 4.4 million pieces of user content related to drug sales were deleted in the third quarter.

On Wednesday night, Facebook shares fell.

In the past two years, Facebook’s public image has plummeted. In addition to countless revelations of consumer privacy violations (most notably the Cambridge Analytica scandal), another important reason is the lack of effective supervision of violating content, which has allowed negative content to spread, exacerbated social divisions, and harmed the physical and mental health of young users.

For example, at the beginning of this year, a mass shooting at a religious site in New Zealand killed dozens of people. The gunman livestreamed the killings through Facebook’s live video service, and after the incident millions of copies of the shooting video spread on the Facebook platform. Critics accused Facebook and Google’s YouTube of failing to delete the videos and related user posts in time, exacerbating the victims’ suffering.

It has been reported that Facebook has commissioned outsourcing companies to employ 15,000 people worldwide to review and delete platform user content, with about 10% of those staff based in India.