This article is from the WeChat public account AIPharos 月光社 (AI-Pharos). Author: Zhaojia Peng. Photo by Josh Kahen on Unsplash.

On Christmas Eve in 1816, the poet Byron sent a letter to a close friend.

Enclosed with the letter was a poem he had written, "Song for the Luddites," which later became one of Byron's best-known works. One of its lines reads: let the shuttle be exchanged for the sword.

Byron was a sympathizer of the Luddite movement. After the Industrial Revolution in Britain, machines widely replaced human labor, and unemployed textile workers mounted violent resistance against them, which became known as the Luddite movement. Byron regarded the Luddites as heroes resisting power and praised their efforts to destroy the textile machines.

In the same year the poem was written, trouble struck Byron's own household. The poet had longed for a son, but his wife gave birth to a daughter. The disappointed Byron separated from his wife, and the baby girl, barely a month old, was parted from her father.

When she grew up, that baby girl took up the shuttle her father's heroes had smashed: she was the first to connect the idea of the Analytical Engine with the Jacquard loom. Her name was Ada Lovelace, and many consider her the first computer programmer in history. Starting with her and her contemporaries, a wave of new technology would sweep the world.

The earliest computers were conceived and designed in Britain. Beginning in the 1820s, the British mathematician Charles Babbage designed mechanical calculating machines, and he went on to conceive the first fully programmable computer, the Analytical Engine. Ada Lovelace wrote the first algorithm for Babbage's machine.

Why was it Ada Lovelace, a woman, who completed the earliest programming work?

Before computers became electronic, computing and programming were considered women's work. In the view of the arrogant gentlemen of the time, such work required not physical labor but rote memorization; even when it demanded advanced mathematics, it was dismissed as "non-intellectual labor." The British philosopher Sadie Plant has argued that early computational work bore a certain resemblance to knitting: "It is the weaving of many threads into one piece of cloth through a complex process." In other words, if women could operate a loom, why not a computer?

This stereotype supplied the earliest workforce for computational science in Britain: a community of female programmers. They operated, programmed, debugged, and assembled the new machines. During World War II, servicewomen of the Women's Royal Naval Service even assembled the world's first electronic, digital, programmable computer: Colossus. The machine was used to break Nazi German ciphers and helped determine the exact timing of the Normandy landings.

Ada Lovelace is without doubt a female pioneer of computing science. To commemorate her, since the 1980s many countries and organizations have established awards and institutions in her name. A British journalist also founded "Ada Lovelace Day" to promote the image of women in science, technology, engineering, and mathematics (STEM).

Every second Tuesday of October, Ada Lovelace Day events are held as scheduled. One recurring theme of the commemoration is how to improve the standing of women in computing science.

This is a fact we have to face today.

A discovery at MIT

Over the past three decades, computing science has boomed with the rise of the internet and artificial intelligence. Yet for the women who remain in the industry, the situation has grown more difficult.

A Canadian-born researcher, Joy Buolamwini, reminds us of this.

She likes to wear colorful thick-rimmed glasses and keeps her black hair in fluffy curls. The former is her fashion signature; the latter points to another part of her identity: she is a dark-skinned woman of Ghanaian descent.

Joy was a gifted child. As a teenager she showed a rare talent for computer science, and she went on to study at several prestigious schools. At the MIT Media Lab, Joy worked on an interactive art installation: a device that recognizes head movements and facial expressions, generates a digital mask, and projects it onto a screen.

Then Joy hit a snag: in her experiments, her own face went unrecognized. She and her colleagues found that the device, which relied on general-purpose facial recognition software, worked well only on lighter-skinned faces. To the faces of people of color, especially women of color, it seemed indifferent.

Was the problem in the program?

Joy and her collaborators dug further. They tested photos of Oprah Winfrey, Michelle Obama, and other prominent Black women against multiple facial analysis programs and found that all of them made recognition errors to varying degrees: these distinguished women were repeatedly classified as male.

In 2018, Joy's research team published a paper titled "Gender Shades," questioning and examining the gender and skin-color bias of commercial facial recognition technology. The paper evaluated three popular commercial facial analysis programs and found that when determining gender, their error rate for light-skinned men was no more than 0.8%; for dark-skinned women, however, two of the programs erred more than 34% of the time, and the third more than 20%. They could hardly classify these faces correctly.

The problem is not just deviation in a program or an algorithm. Joy argues that algorithms reflect the biases of those who have the power to shape technology. Women, and dark-skinned women in particular, are living under the "algorithmic gaze."

Even white women can hardly escape a variant of the "algorithmic gaze."

In one algorithm test at Microsoft, researchers had a model infer Melinda Gates's occupation from her LinkedIn profile. It turned out that pronouns of different genders produced completely different results: when Melinda Gates was "she," her predicted profession was teacher; when the pronoun was swapped to "he," the wife of Microsoft founder Bill Gates was judged to be a lawyer.

Commercial algorithms are probably among the most globalized commodities, and they will form the information infrastructure of the future. Will they perpetuate the sexism of the real world?

We have seen plenty of precedents. Although women scientists are numerous in reality, only 18% of the biography pages for scientists on Wikipedia are about women. The film industry is similar: since 1946, female characters have made up only 17% of crowd scenes in family films.

This year, UNESCO published a report titled "I'd Blush if I Could." Researchers found that the popular smart voice assistants on the market, by modeling a "docile and eager-to-help" assistant persona, risk passing gender bias on to a new generation. The report also projects that by 2021 there will be 1.8 billion smart voice assistants in use worldwide; if the situation does not improve, the impact will be enormous.

Why do algorithms produce gender discrimination?

Bias in training datasets is one reason. Algorithms require large labeled datasets for training, and during collection and labeling, the data samples are the first to carry in tendencies and prejudices. Take ImageNet, the dataset of more than 14 million labeled images used in computer vision research: about 45% of its data comes from the United States, while China and India together account for only about 3%. This kind of skew underlies the "gender shades" problem Joy found: with fewer dark-skinned women represented in the training data, their faces are harder to recognize.
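As a minimal sketch of this kind of dataset audit (the field names and sample figures below are invented for illustration, not drawn from ImageNet or any real dataset), one can tally a labeled dataset's subgroup composition before training, to see which groups are underrepresented:

```python
from collections import Counter

# Hypothetical metadata for a labeled face dataset; the fields
# "skin_type" and "gender" are illustrative, not from a real corpus.
samples = [
    {"skin_type": "lighter", "gender": "male"},
    {"skin_type": "lighter", "gender": "male"},
    {"skin_type": "lighter", "gender": "female"},
    {"skin_type": "darker", "gender": "female"},
]

def composition(samples):
    """Return the share of each (skin_type, gender) subgroup."""
    counts = Counter((s["skin_type"], s["gender"]) for s in samples)
    total = len(samples)
    return {group: n / total for group, n in counts.items()}

print(composition(samples))
# In this toy set, lighter-skinned men are half the data,
# darker-skinned women only a quarter.
```

A real audit would run the same tally over millions of records, but the principle is the same: imbalances visible here become accuracy gaps later.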

Bias inherited from the surrounding culture also misleads training datasets. Harry Shum (Shen Xiangyang), who led Microsoft's artificial intelligence research and development, has noted that the internet's open news and web-page data form the benchmark datasets for training algorithms, and these data already contain the internet's existing gender biases.

Another source of gender bias is the algorithm itself. In general, if one type of population dominates the training data, the program will optimize for that type in order to raise overall prediction accuracy. This means that, left uncorrected, the algorithm will reinforce whatever bias the dataset brings in.
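A toy illustration of this effect (purely synthetic numbers, not any real system): when a single decision threshold is tuned for overall accuracy on data dominated by one group, it can end up perfect for the majority group and no better than chance for the minority.

```python
# Each sample: (score, true_label, group). Group A is 90% of the data and
# is cleanly separated; group B's scores run the other way in the same range.
data = (
    [(0.3, 0, "A")] * 45 + [(0.7, 1, "A")] * 45 +  # majority group
    [(0.6, 0, "B")] * 5 + [(0.4, 1, "B")] * 5      # minority group
)

def accuracy(threshold, samples):
    """Share of samples whose thresholded score matches the true label."""
    return sum((score >= threshold) == label
               for score, label, _ in samples) / len(samples)

# Tune the threshold for best overall accuracy, as a generic learner would.
best = max((t / 100 for t in range(101)), key=lambda t: accuracy(t, data))

group_a = [s for s in data if s[2] == "A"]
group_b = [s for s in data if s[2] == "B"]
print(accuracy(best, group_a))  # 1.0 -- perfect on the majority
print(accuracy(best, group_b))  # 0.5 -- a coin flip on the minority
```

No individual step here is malicious; the disparity falls out of optimizing a single aggregate objective over imbalanced data, which is exactly why uncorrected training reinforces the dataset's bias.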

In 2018, two Stanford professors pointed out that when translating from Spanish to English, Google Translate defaulted to male pronouns; Google later fixed the defect. This is a case of algorithmic bias. In today's English corpora, male pronouns outnumber female pronouns by about 2:1.

In the 1960s the ratio was as high as 4:1; social movements such as the push for gender equality brought it down to the current level. Some worry that algorithms left to themselves will undermine this hard-won progress.
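Measuring such a pronoun ratio is straightforward in principle. Here is a minimal sketch over a made-up snippet (a real study would use a large corpus and a proper tokenizer):

```python
import re
from collections import Counter

# Tiny invented snippet standing in for a corpus.
text = "He said he would call her. She waited. He never did, and he forgot."

# Lowercase word counts via a simple alphabetic tokenizer.
words = Counter(w.lower() for w in re.findall(r"[a-zA-Z]+", text))

male = words["he"] + words["him"] + words["his"]
female = words["she"] + words["her"] + words["hers"]
print(f"male={male}, female={female}, ratio={male / female:.1f}")
# male=4, female=2, ratio=2.0
```

On this contrived snippet the ratio happens to come out 2:1; tracking the same statistic across decades of text is how shifts like the 4:1-to-2:1 decline are observed.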

All-time lows in diversity

Donna Haraway once had high hopes for new technologies.

In 1985 she published "A Cyborg Manifesto," calling on people to use technology to shape new gender identities. That optimistic text envisioned a future dominated by artificial intelligence, in which new technologies would let women escape the constraints of traditional gender roles.

Reality has disappointed the philosopher of technology. Technological progress has not made old prejudices disappear; it has even had a perverse effect. In computing science, for example, gender bias has spread widely, and the proportion of women in the industry has shrunk.

The ILO estimates that women made up about 46.9% of the US labor force in 2018. Yet according to figures released by Apple, Google, Facebook, and Amazon, women account for less than 30% of these companies' technical employees. The gap between the two is stark.

At the frontier of computing science, the underrepresentation is even more severe. A research report this year found that female artificial intelligence researchers account for only 15% at Facebook and 10% at Google, and that in academic AI only 18% of authors are women.

Other statistics show that in 2018 only 22% of artificial intelligence professionals worldwide were women, and that they worked mostly in application roles, with less involvement in core technology development. Different positions mean a pay gap, and even in the same technical positions women tend to earn less than men: one survey of women in the field found they were paid 28% less than men in the same roles.

What is the impact of women's disappearance from the computing workplace?

Microsoft interviewed 1,500 girls and young women across Europe. The results showed that from about age fifteen, girls' interest in STEM declines, and a major reason is the lack of female role models in these fields.

Waning interest means fewer women enter these fields. A report released by Stanford University showed that in 2017 the school's introductory artificial intelligence course was 74% male, and its introductory machine learning course 76% male. For comparison, the corresponding figures at UC Berkeley were 73% and 79%.

This means that only about one in five students in AI-related courses is a woman.

And the decline is ongoing.

In 1985, the year "A Cyborg Manifesto" was published, women earned 37% of computer science degrees in the United States. Three decades later, the share has fallen to 18%. Since most practitioners in the field need that academic background, the shrinking pool of female students has produced a "loss of diversity." Some observers believe we are now at a historic low point for diversity.

Melinda Gates finds this frustrating. The philanthropist, who herself ran into the "algorithmic gaze," found that only about 5% of technology companies' past philanthropic and social-responsibility spending went to benefit women in the tech industry, and only 0.1% to women of color.

The implicit accusation behind these numbers is that the tech giants do not seem to have done much about it.

So how can women's opportunities in the field of computing science be increased?

In 2015, Fei-Fei Li, director of the Stanford Artificial Intelligence Lab, and others launched a project called AI4ALL: an artificial intelligence outreach and education program for students, aimed mainly at girls, students of color, and students from low-income families.

A nightmare we haven't woken from

The historian of computing Mar Hicks argues that technology is a tool through which people exercise power. As a rule, those who hold that power are the ones who understand least the flaws of our current systems, and they are the ones deciding the future of our technology.

Discrimination and prejudice against women is one of the many abuses of technological power. It has not improved with technological progress, nor has it disappeared simply because power has changed hands.

After the mid-1980s, the rise of the personal computer popularized computing science. More young people entered the field, including women. At the same time, however, women's share of computer science majors fell year after year. In the United States, it dropped from 37% in 1984 to about 18% today.

Why is this happening?

Some studies have shown that early personal computers were marketed mainly to men and boys. The marketing not only ignored women but, through television and other media, kept conveying the idea that women were unsuited to this novel machine. The result was a decline in women's interest in computing science in the consumer era. It was also from then on that the image of the programmer or hacker hardened into the "male nerd."

With the global expansion of the entertainment industry, these gender stereotypes have even been reinforced and carried into the era of artificial intelligence.

In Hollywood scripts, programmers are mostly technology-obsessed men, and male robots usually take on the grander missions: exploring space, helping humanity, conducting research. Women mostly play supporting roles, and female robots are mostly the sexy, alluring, obedient figures of "Blade Runner." And once a female robot begins to think for herself, she becomes the nightmarish killer of "Westworld."

A study from Nanyang Technological University in Singapore echoes Hollywood. Researchers measured people's preferences about which robots should perform which jobs and found that people prefer female robots for housekeeping tasks, while male robots are expected to handle security work.

"Men are good at conquering; women are good at guarding. Everyone is afraid of machines. No one wants to be conquered by machine intelligence, but everyone is willing to be guarded by a gentle machine intelligence." So says Jack Ma.