This article comes from the WeChat public account Quanmeipai (ID: quanmeipai); Ai Faner is authorized to publish it.

A new study by the Pew Research Center found that men appear twice as often as women in news photos on Facebook, and most of the images depict men.

Considering that 43% of US adults currently get news primarily through Facebook, the Pew Research Center used machine vision to examine the gender ratio in news photos posted on Facebook by 17 national news outlets between April and June 2018. The algorithm ultimately identified 53,067 people, of whom 33% were women and 67% were men. That is a wide gap, even though the sex ratio of the US population is roughly balanced.

So, who is “distorting” the two sexes?

In this issue, Quanmeipai (ID: quanmeipai) draws on research from the Pew Research Center and MIT Media Lab scientist Joy Buolamwini to examine the gender imbalance in face recognition and explore: why, in the eyes of an algorithm, your gender can sometimes fall into a blurry zone; what biases exist beyond gender; and what we can do about it.

Gender imbalance under face recognition

Pew’s report points out that across different types of news posts on Facebook, women have a consistently lower “presence” in images than men. In economics-related posts, only 9% of images show exclusively women, compared with 69% showing exclusively men. Women appear more often in entertainment news images, but still less often than men overall.


You may wonder about the scarcity of women in these images; to some extent it reflects a larger social reality. For example, in news about professional football teams, most of the faces identified are male; in reports about the US Senate and House of Representatives (where women hold about 25% of seats), far fewer female faces are identified than male ones.

Setting aside these finer-grained details, the study still reveals a striking pattern: in Facebook news images, men are featured more prominently than women; in pictures of two or more people, men tend to outnumber women; and men occupy a larger share of the visual space.


The researchers also measured the size of female and male faces in the images (the current technique captures only the face itself, excluding hair, jewelry, and headwear). On average, male faces were about 10% larger than female faces, which means that in Facebook news images, male figures deliver a greater visual impact to readers.

Specifically, in economics-related posts the average female face is 19% smaller than the average male face, but in entertainment-related content it is 7% larger.


Machine vision tools like facial recognition are being used ever more widely in law enforcement, advertising, and other fields, and gender recognition is one of their basic functions.

In real life, recognizing the gender of the people around you is simple, but for a computer, what steps does it take to work?

How does the computer “see” your gender?

“After being fed thousands of example images, a ‘mature’ facial recognition algorithm can learn on its own how to distinguish men from women.” This answer addresses the question above, but for those of us outside the “black box”, the learning process is not easy to understand.
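What “learning from labeled examples” means can be made concrete with a minimal sketch. The code below uses a generic classifier (scikit-learn’s `LogisticRegression`) fit to entirely synthetic “images” with an invented labeling rule; it is an illustration of the training idea, not the actual system Pew tested.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-in for a labeled face dataset: each "image" is a
# flat feature vector, each label is the gender tag supplied with it.
n_samples, n_features = 200, 64
X = rng.normal(size=(n_samples, n_features))
hidden_rule = rng.normal(size=n_features)
y = (X @ hidden_rule > 0).astype(int)  # 0/1 stand in for the two labels

# The classifier is never given an explicit rule; it simply fits
# whatever patterns separate the labeled examples it was fed.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(round(clf.score(X, y), 2))  # high accuracy on the patterns it saw
```

The point of the sketch is that the model’s “rule” is whatever happened to separate its training examples, which is exactly why the composition of that training data matters so much later in this article.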

To better understand the rules at work, the Pew Research Center ran an interesting experiment: they uploaded photos of their own staff to a machine vision system and partially occluded parts of each image, hoping to discover which facial areas cause the algorithm to make, or change, its decision.

In this interactive human-versus-machine challenge, you can also hazard a guess: which parts affect the system’s judgment?

First, a clear photo is fed into the machine vision system. At this point, both the algorithm and you can easily determine the gender of the person in it.


Next, a number of boxes appear over the photo, with the prompt: “Selecting a box hides that part of the image; your choice may affect the gender judgment.”


Finally, once you complete your selection, the image displays all the areas that, when hidden, change the gender classification.


Interested readers can complete this short experiment on the Pew Research Center website:
https://www.pewresearch.org/interactives/how-does-a-computer-see-gender/

The following set of images shows part of the results of the interactive experiment. Selecting the purple or yellow areas of a picture causes the recognition system to change its decision. In an age of growing gender diversity, determining gender is not easy even in real life, and Pew’s experiment makes it clear that getting a machine to state a subject’s gender firmly and reliably is harder still.


Looking at these images, what else stands out? Sometimes the regions that flip the model’s decision are quite different from what we would expect. In the fourth picture, covering the person’s face changes the system’s classification, but more often the “interference zones” that reverse the algorithm’s judgment are areas like the edge of the face, the hairline, or the corners of the mouth.

From these cases you may also notice that no single, stable rule explains the phenomenon. Hiding the middle of one subject’s face can flip the gender classification, yet covering another face in the same way does not necessarily produce the same result.
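The occlusion probe Pew describes can be sketched in a few lines: train any classifier, mask one region at a time, and record where the decision flips. Everything below is synthetic and illustrative (toy 8×8 “images”, an invented hidden rule, scikit-learn’s `LogisticRegression`), not Pew’s actual system.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Toy 8x8 "face images": the label secretly depends on just a few
# pixels, standing in for whatever cues a real model latched onto.
X = rng.normal(size=(300, 64))
w = np.zeros(64)
w[[5, 27, 50]] = [2.0, -1.5, 1.0]  # the hidden decisive pixels
y = (X @ w > 0).astype(int)
clf = LogisticRegression(max_iter=1000).fit(X, y)

def occlusion_flips(image, model, patch=2, side=8):
    """Mask each patch-sized square in turn and record the squares
    whose occlusion flips the model's decision."""
    base = model.predict(image.reshape(1, -1))[0]
    flips = []
    img = image.reshape(side, side)
    for r in range(0, side, patch):
        for c in range(0, side, patch):
            masked = img.copy()
            masked[r:r + patch, c:c + patch] = 0.0  # hide this region
            if model.predict(masked.reshape(1, -1))[0] != base:
                flips.append((r, c))
    return flips

# Different images flip in different places: no single stable rule.
print(occlusion_flips(X[0], clf))
print(occlusion_flips(X[1], clf))
```

Running this on different inputs shows the same instability the article describes: which masked region flips the decision varies from image to image.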

Machine learning can greatly improve the efficiency of data processing, but unlike traditional programs that follow a precise series of steps, machine-learning systems make decisions in ways that are largely opaque and highly dependent on the data used to train them. These characteristics mean machine-learning tools can produce systematic biases that are harder to understand and predict in advance.


From this perspective, the Pew Research Center used a simplified experiment to show how the data used to train the algorithm introduces hidden biases and unexpected errors into the system results. Researchers say that as algorithms are playing an increasingly important role in decision-making in human society, it is important to understand their limitations and biases.

What does “prejudice” bring?

Recently, 26 top AI researchers, including Turing Award winner Yoshua Bengio, publicly called on Amazon in a blog post to stop selling its artificial intelligence service Amazon Rekognition to police. Anima Anandkumar, former principal scientist of Amazon’s cloud computing division, and others also joined the appeal.

Earlier, Deborah Raji, a researcher at the University of Toronto, and Joy Buolamwini of the MIT Media Lab published a study showing that Amazon Rekognition’s error rate when identifying the gender of darker-skinned women in images is far higher than when identifying lighter-skinned men. Other scholars have supported the findings, but Amazon has objected to the pair’s report and research methods.


▲The accuracy of Amazon’s facial recognition system across different skin colors and genders

Joy Buolamwini led an AI research project called Gender Shades. After examining the facial recognition systems of leading technology companies, she found that every system performed better on male faces than on female faces, and every system was more accurate on lighter-skinned faces. The average recognition error rate was as high as 35% for dark-skinned women, 12% for dark-skinned men, and 7% for light-skinned women, while for light-skinned men it was less than 1%.

What might the “prejudice” of the facial recognition system bring?


▲Google recognizes this user’s friend as a “gorilla”

“Face recognition technology can be abused whether or not it is accurate,” Joy said. Using facial recognition, accurately or not, to analyze someone’s identity, face, or gender can infringe on their freedom. Inaccurate identification, for example, can leave innocent people wrongly flagged and subjected to unwarranted scrutiny by law enforcement. This is not hypothetical: the British nonprofit Big Brother Watch UK published a report stressing that the facial recognition technology used by London’s police has a misidentification rate of over 90%. Last summer, British media reported that a young Black man was mistaken for a suspect by facial recognition and searched by police in full public view.

A leaked report also showed that IBM provided law enforcement agencies with technology that searches for people in video footage by hair color, skin tone, and facial features. The news raised concerns that police would use the technology to target specific ethnic groups.


▲Ida B. Wells, the renowned African-American journalist and civil rights activist, is recognized as male.

To reduce the time required to search for faces, law enforcement relies heavily on gender classification: if the gender of the face being matched is known, a simple binary split can greatly reduce the number of potential matches to process. Gender classification is thus being widely applied in policing.
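The binary split described above amounts to a simple filter over a candidate gallery. This sketch uses an invented gallery and labels; it shows both the speed-up and the failure mode: a misclassified probe excludes the true match entirely.

```python
# Hypothetical candidate gallery: (name, recorded_gender) pairs.
gallery = [
    ("A", "male"), ("B", "female"), ("C", "female"),
    ("D", "male"), ("E", "female"), ("F", "male"),
]

def filter_by_gender(gallery, predicted_gender):
    """Keep only candidates whose recorded gender matches the
    classifier's guess for the probe face."""
    return [name for name, gender in gallery if gender == predicted_gender]

# If the probe face is classified "male", half the gallery is pruned away...
print(filter_by_gender(gallery, "male"))    # ['A', 'D', 'F']
# ...but if that classification is wrong, the true match is excluded
# from the search entirely.
print(filter_by_gender(gallery, "female"))  # ['B', 'C', 'E']
```

This is why a biased gender classifier is not a cosmetic problem in policing: an error at the filtering step silently removes the correct person from consideration, or leaves the wrong half of the gallery under scrutiny.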

When these biased identification systems are widely used in social life, they can lead to even worse consequences.


▲Joy Buolamwini presents a speech entitled “How I’m fighting bias in algorithms” on TED

In the TED talk, Joy shared a little story with everyone:

Under the same lighting conditions, the facial recognition system could detect only the light-skinned participants; the dark-skinned participant was detected only after putting on a white mask. “The most basic premise before an AI tool can determine a face’s identity or read its expression is that the face is detected at all. Yet facial recognition systems fail repeatedly at detecting individuals with dark skin. I could only console myself: the algorithm isn’t racist, my face is just too dark,” Joy said.

Where does the deviation come from?

If you compare the accuracy claimed by developers with researchers’ findings, you notice something interesting: the figures released by companies and those measured by independent third parties never quite match. So what causes the difference?

Joy reminds us to focus on the bias of the benchmark dataset. “When we discuss the accuracy of facial analysis techniques, it is done through a series of image or video tests. These image data form a benchmark, but not all benchmarks are equal.”

An Amazon spokesperson said the company used more than one million faces as a benchmark to test its product’s accuracy. But don’t be misled by the seemingly large sample. “We don’t know the detailed demographics of the benchmark data, and without that information we can’t judge whether the choice of benchmark harbors biases of race, gender, or skin color.”


▲Different systems give different identification results for dark-skinned actors

Facebook has announced that its face recognition system achieved 97% accuracy on a dataset called Labeled Faces in the Wild (LFW), one of the world’s best-known face recognition benchmarks. But when researchers examined this so-called gold-standard dataset, they found that nearly 77% of its subjects are men and more than 80% are white.
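Auditing a benchmark’s composition is straightforward whenever demographic metadata is available. This sketch uses invented labels chosen to mirror the LFW proportions reported above; a real audit would read the dataset’s actual annotations.

```python
from collections import Counter

# Invented metadata mirroring the reported LFW skew (~77% male,
# ~80% white). A real audit would load the dataset's own labels.
benchmark = (
    [("male", "white")] * 62 + [("male", "nonwhite")] * 15
    + [("female", "white")] * 19 + [("female", "nonwhite")] * 4
)

genders = Counter(g for g, _ in benchmark)
skins = Counter(s for _, s in benchmark)
print(genders["male"] / len(benchmark))  # 0.77
print(skins["white"] / len(benchmark))   # 0.81
```

A 97% headline accuracy on such a skewed benchmark says little about performance on the underrepresented groups, which is exactly the gap the Gender Shades results expose.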

To eliminate bias at the data layer as far as possible, Joy suggests building a more inclusive benchmark dataset. To balance the gender data, she drew on the world’s top 10 countries by share of women in parliament, led by Rwanda with more than 60% women. Given the representative skin tones of the Nordic countries and several African countries, Joy selected three African and three Nordic countries and balanced the skin types in the dataset by sampling light- and dark-skinned individuals from them.

Using this more balanced dataset, they re-evaluated the facial recognition systems of Amazon, Kairos, IBM, Face++, and others. In the August 2018 study, they found that Amazon and Kairos performed well on white men, but Amazon’s accuracy on darker-skinned women was very low, at 68.6%.


▲Amazon’s facial recognition system labels an image of Oprah Winfrey as male, along with its confidence score

Joy said that face recognition in the real world is more complex and difficult than laboratory testing, and even the benchmark dataset they built is not a complete test. “But it’s like a race: doing well on the benchmark at least guarantees you won’t fall at the starting line.”

Even on the same benchmark, a facial recognition system’s accuracy figures can vary. Artificial intelligence is not perfect, and in this situation, supplying a confidence score alongside each result is a useful practice that gives users more concrete information on which to judge it.
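Surfacing a confidence score might look like the following sketch. The data is synthetic, the classifier is a generic scikit-learn model, and the 0.9 review threshold is an arbitrary assumption for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic two-class data standing in for face features and labels.
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Report a confidence next to each label so low-confidence calls can
# be flagged for human review instead of being trusted blindly.
for label, proba in zip(clf.predict(X[:3]), clf.predict_proba(X[:3])):
    confidence = proba.max()
    flag = "needs review" if confidence < 0.9 else "ok"
    print(label, round(float(confidence), 2), flag)
```

The design choice here is that the system never emits a bare label: every decision carries the number a human needs to decide whether to trust it.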


Face recognition technology is already widely used in mass surveillance, the weaponization of AI, and many law enforcement settings. Yet this powerful technology is evolving rapidly without adequate oversight.

To reduce abuse of facial recognition technology, the Algorithmic Justice League and the Center on Privacy & Technology launched the Safe Face Pledge. So far, many technology companies, Amazon included, have not joined. “Based on our research, selling facial recognition systems to law enforcement or government agencies would be irresponsible.” Joy, one of the founders of the Algorithmic Justice League, hopes more organizations will sign the Safe Face Pledge and commit to developing facial analysis technology responsibly and ethically.

After all, behind the algorithmic bias is actually our own human prejudice.

Reference link:

1.https://www.journalism.org/2019/05/23/men-appear-twice-as-often-as-women-in-news-photos-on-facebook/

2.https://www.pewresearch.org/interactives/how-does-a-computer-see-gender/

3.https://medium.com/@Joy.Buolamwini/response-racial-and-gender-bias-in-amazon-rekognition-commercial-ai-system-for-analyzing-faces-a289222eeced