
One reviewer said: this is a form of torture for everyone who has to watch it.

02 Who is doing this job?

Every minute, 500 hours of video are uploaded to YouTube, 2.5 million posts are published on Facebook, and 450,000 tweets are sent on Twitter.

In this ocean of information, terrorism, murder, self-harm, pornography, extremist speech, and political propaganda are mixed in among the cat photos and food photos. Who is responsible for cleaning up this garbage?

1) A labor-intensive industry

According to incomplete statistics, most content reviewers working for major technology companies come from third-world countries or underdeveloped regions. In the Philippines alone, the number of outsourced content reviewers exceeds 100,000.

Tech companies such as YouTube, Google, and Facebook use a two-tier review system: a first pass by artificial intelligence and algorithms, followed by manual filtering by outsourcing companies based in Southeast Asia.

It is a large, labor-intensive industry. Most of its workers are bound by confidentiality agreements, and their job is to face the darkest corners of the Internet, filtering out violent, pornographic, and horrifying content. They are called "network scavengers."

Before becoming reviewers, these people receive three to five days of induction training, such as memorizing 37 terrorist organizations: their flags, their uniforms, their slogans.


After such training, the reviewers go to work as guardians of morality; through mechanical, grueling labor, they keep intact people's beautiful imagination of society.

During training, the outsourcing company tells the reviewers what they will see, what tools they will need, and what the consequences will be.

Many reviewers want to resign, but team leaders persuade them to stay, reminding them that they have signed a confidentiality agreement.


In the confidentiality agreement