
Applying for a job at Amazon is remarkably easy, but the experience confronted me, for the first time in my life, with the reality of automation taking over the world.

A few days later, I passed the drug test and filled out the background check form. Within two weeks, I started working at Amazon. When I reported to the Amazon warehouse with my identification documents, I finally met two living people (at last, someone to talk to). I was in the recruiting office for less than five minutes before the paperwork was done.

“That’s it?” I asked.

“That’s it,” the man said.

Although I got the job quickly, my feelings about the automated application workflow are complicated. Finding a job this fast, without speaking to a single person along the way, made me uneasy. Does the company really not care about an applicant’s human qualities? Do they care about the quality of my work and my actual abilities? Could they learn any of that without ever meeting me?

New hires have to watch a one-hour onboarding training video, which covers everything from how to dress properly to safety precautions, plus a few quizzes. As I write this, I have just finished my first 40-hour week. I like the job of warehouse picker very much: I take each ordered item off the shelf and stage it outside the warehouse for delivery. The biggest lesson I have learned from working at Amazon so far is this: if you buy clothes from Amazon or any other online retailer, wash them before you wear them, because they pass through many hands before they reach you.

Automating recruitment, as Amazon does for its hourly warehouse workers, can save companies a lot of time, especially large ones (as of April 30, Amazon employed more than 935,000 people). Some also believe an automated process can help eliminate bias. Writing in the Harvard Business Review, Frida Polli argued that without automated recruitment many applicants would never even get an interview, whereas artificial intelligence can evaluate candidates comprehensively throughout the hiring process, and that proper auditing can also counter people’s unconscious biases.

However, according to Cornell University’s Manish Raghavan and the Brookings Institution’s Solon Barocas, using automation in hiring as an anti-bias intervention can sometimes make bias worse. In a paper posted to the Social Science Research Network (SSRN), law professor Ifeoma Ajunwa points out that, compared with the traditional route of filling out an application and handing it to a hiring manager, the tools candidates are required to use in an automated hiring process can be more restrictive, especially for low-paying jobs and hourly work.

In fact, Amazon has found bias in its own automated recruiting tools. In 2018, the company’s machine learning specialists discovered that their recruiting engine favored male candidates. Amazon had built the model to review applicants by learning from resumes submitted to the company over the previous 10 years, and most of those resumes came from men. As a result, the tool learned to penalize resumes containing the word “women’s,” putting applicants who had attended women’s colleges at a disadvantage. The engineers tried to correct the problem but failed.

Amazon’s recruiting tool was biased because it was trained on a database of resume samples that came mostly from men.
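To see how this happens mechanically, here is a minimal sketch, using invented toy resumes and scikit-learn rather than anything resembling Amazon’s actual system, of how a classifier trained on a mostly male hiring history can learn to penalize the token “women’s” without anyone programming that rule:

```python
# A toy illustration, not Amazon's system: train a resume screener on
# historical outcomes where most past hires were men and see what it learns.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented historical data: phrases that appear mainly on women's resumes
# happen to co-occur with "rejected" outcomes in the training set.
resumes = [
    "captain of chess club, python developer",          # hired
    "varsity football, java developer",                 # hired
    "python developer, machine learning projects",      # hired
    "captain of women's chess club, python developer",  # rejected
    "women's college graduate, java developer",         # rejected
    "java developer, machine learning projects",        # hired
]
hired = [1, 1, 1, 0, 0, 1]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative: its mere
# presence lowers the predicted chance of being marked "hire".
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])
```

Nothing in this code mentions gender as a rule; the skew comes entirely from the training data, which is also why simply deleting one offending keyword rarely fixes the problem.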

Ajunwa has also noted that online hiring algorithms can constrain applicants for white-collar jobs. In 2016, for example, Goldman Sachs introduced automated interviews as a way to hire a more diverse workforce. But in a 2019 opinion piece in the New York Times, Ajunwa argued that too much automation creates a closed-loop system with no accountability or transparency.

Algorithmically targeted ads encourage certain people to submit resumes; those resumes are automatically screened; a lucky few are hired and then automatically evaluated; and the evaluation results feed back into the criteria for future job ads and future screening.
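The numbers below are invented and the “screener” is deliberately crude, but the sketch shows the dynamic Ajunwa describes: once hiring outcomes feed back into ad targeting, a small initial skew compounds round after round with no outside check.

```python
# Toy closed loop: targeted ads -> automated screening -> hires ->
# retargeting based on who was hired. All numbers are invented.

ad_share = {"group_a": 0.55, "group_b": 0.45}  # who initially sees the job ad

def pass_rate(group, shares):
    # Crude screener: applicants from the group that is better represented
    # in the current pool score slightly higher ("looks like past hires").
    return 0.12 if shares[group] >= 0.5 else 0.08

for round_num in range(1, 6):
    applicants = {g: 1000 * share for g, share in ad_share.items()}
    hires = {g: n * pass_rate(g, ad_share) for g, n in applicants.items()}
    total = sum(hires.values())
    # Feedback step: next round's ads target groups in proportion to hires.
    ad_share = {g: hires[g] / total for g in hires}
    print(round_num, {g: round(s, 3) for g, s in ad_share.items()})

# group_a's share of ad impressions ratchets up every round (roughly 0.65,
# 0.73, 0.80, 0.86, 0.90) even though no one told the system to prefer it.
```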

She cites two examples of automated hiring discrimination. One is a lawsuit initiated in 2017 by Lisa Madigan, then the Illinois Attorney General, over automated hiring platforms that discriminated against older applicants: the drop-down menus used to select the year an applicant attended college did not go back far enough to accommodate workers of all ages, no matter how many years of experience they had.

The other is a 2016 class-action lawsuit against Facebook, under whose pressure Facebook eventually imposed restrictions on its paid advertising platform to comply with anti-discrimination law. The platform’s Lookalike Audiences feature let employers show job ads only to Facebook users who “look like” their existing employees. So if a company employed only white people, only white Facebook users would see its ads; and if it employed only women, Facebook would target the ads only at female users.
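Here is a minimal sketch of why that happens. The data and similarity rule are invented and are not Facebook’s actual algorithm, but they show how an audience “seeded” on a homogeneous workforce reproduces that homogeneity even when protected attributes are never consulted directly.

```python
# Toy lookalike audience: rank platform users by similarity to a seed list
# of current employees. All profiles and the scoring rule are invented.

# Seed list: the employer's current employees, who all happen to be men.
seed_employees = [
    {"gender": "m", "age_band": "25-34", "interests": {"golf", "finance"}},
    {"gender": "m", "age_band": "25-34", "interests": {"finance", "fantasy football"}},
    {"gender": "m", "age_band": "35-44", "interests": {"golf", "investing"}},
]

# Platform users who might be shown the job ad.
platform_users = [
    {"id": 1, "gender": "m", "age_band": "25-34", "interests": {"finance", "golf"}},
    {"id": 2, "gender": "f", "age_band": "25-34", "interests": {"marketing", "running"}},
    {"id": 3, "gender": "f", "age_band": "35-44", "interests": {"investing", "running"}},
    {"id": 4, "gender": "m", "age_band": "25-34", "interests": {"fantasy football", "finance"}},
]

def similarity(user, seed):
    # Gender is never consulted, only age band and shared interests -- but
    # those attributes are correlated with who is already on the seed list.
    return (user["age_band"] == seed["age_band"]) + len(user["interests"] & seed["interests"])

def lookalike_audience(users, seeds, size):
    ranked = sorted(users, key=lambda u: max(similarity(u, s) for s in seeds), reverse=True)
    return ranked[:size]

# The top "lookalikes" are both men, so the ad is shown mostly to users who
# mirror the existing all-male workforce.
for user in lookalike_audience(platform_users, seed_employees, size=2):
    print(user["id"], user["gender"])
```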

Ajunwa believes that leaving employers free to select for “cultural fit” opens the door to further discrimination. Pre-employment personality assessments are one example: in 2018, the Equal Employment Opportunity Commission (EEOC) found that such tests are likely to create a pattern of discrimination against racial or ethnic minorities.

The EEOC held that Best Buy’s personality test violated Title VII of the Civil Rights Act, and Best Buy was eventually forced to stop using it. The test had been part of Best Buy’s automated hiring process, used to predict worker performance. Similarly, prioritizing candidates with no employment gaps can hurt women who have taken time off to care for their children.

In the short term, automation is certainly not going to slow down, least of all in hiring, so we need safeguards to keep it from increasing discrimination in the workplace.

According to lawyers Mark Girouard and Maciej Ceglowski, EEOC regulations hold companies responsible for their hiring decisions and for the hiring tools they use, and even require them to retain data in case of a discrimination claim. A company can therefore be held liable for workplace discrimination even if it does not know why its algorithm favors one group over another.

In the SSRN paper, Ajunwa also argues that when a plaintiff brings a discrimination lawsuit, more of the burden of proof should fall on the employer. If the automated hiring platform the plaintiff used has a problematic design feature, such as a personality test, the employer would have to produce audited statistics in court proving that the platform does not illegally discriminate against particular groups of job applicants.
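The paper itself does not prescribe a particular statistic, but one common benchmark in U.S. employment-discrimination analysis is the EEOC’s “four-fifths rule,” under which a group’s selection rate below 80 percent of the highest group’s rate is treated as evidence of adverse impact. A minimal sketch of that calculation, with invented audit numbers:

```python
# Adverse impact check in the spirit of the EEOC "four-fifths rule".
# The audit counts below are invented for illustration.

def selection_rates(outcomes):
    """outcomes maps group -> (number of applicants, number selected)."""
    return {g: selected / applicants for g, (applicants, selected) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times the
    highest group's rate (the four-fifths rule of thumb)."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (rate / best, rate / best < threshold) for g, rate in rates.items()}

# Hypothetical audit data from an automated screening stage.
audit = {
    "group_a": (1000, 220),   # 22% pass the personality test
    "group_b": (800, 120),    # 15% pass
}

for group, (ratio, flagged) in adverse_impact(audit).items():
    print(f"{group}: impact ratio {ratio:.2f}" + ("  <- below 4/5 threshold" if flagged else ""))
```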

In another SSRN paper, Ajunwa suggests requiring employers to conduct both internal and external audits, which would help ensure that no applicants are unfairly excluded.

The Occupational Safety and Health Administration (OSHA) already recommends audits to ensure safe working conditions, along with a safety certification system that scores employers who meet the audit standards. Ajunwa believes a similar audit and certification system should be established for automated hiring platforms, and that collective bargaining with employers could be used to determine which audit standards a given position actually requires and what measures should protect applicant data.

The COVID-19 pandemic has greatly accelerated the use of artificial intelligence in hiring, as more recruiters move interviews and other hiring interactions online. To limit face-to-face contact during the pandemic, more and more companies are relying on artificial intelligence and on video conferencing platforms such as Zoom.

But we still need humans to ensure fairness and justice in the recruitment process.

I can only assume that I look like a good candidate to an automated, AI-driven hiring platform: a 23-year-old man with a bachelor’s degree, no criminal record, and no disability. The automated process got me a job at Amazon in less than 20 minutes; the application experience was smooth, onboarding was fast, and I could start working in the warehouse almost as soon as I had filled out the form. But how do we make sure everyone else gets an equal opportunity?

Translator: Zhang Momo