Beware of Automated Hiring

Oct 08, 2019 · 204 comments
Renee (Arizona)
I am a 61-year-old RN. As recently as 2005, I walked into a hospital nursing office, filled out an application, was interviewed by the Director of Nursing, had a tour of the unit, and was offered a job, all in the same morning. Now that is impossible. I'd like to do simpler work at the end of my working life, but I can't: my resume, which is populated by multiple degrees, years of experience, and significant accomplishments, doesn't fit the computer models. I was turned down for a cashier's job at the supermarket across the street, without even an interview. I've finally given up after almost two years out of work, and I'll collect my first Social Security check on Christmas Eve.
Ellen F. Dobson (West Orange, N.J.)
@Renee Know how you feel. I have 43 years of experience as an occupational therapist. I moved to another state and boom, somehow became retired. Must be the resume. Who wants an old cow. We're not as snappy as the young on the computer/laptop/iPad. Companies would prefer to hire the young at lower pay. They can be molded to meet productivity targets. Of course their knowledge and experience are sorely lacking. But who cares when the profit margin is more important than the quality of care for the patient.
Gary Madine (Bethlehem, PA)
@Renee Exact same scene. I sensed at the beginning of my search that I "had to know someone on the inside". The online process does net qualified candidates for the managers who are looking for new employees. But for the people on the outside who are looking to get in, the process is little better than buying a lottery ticket. The manager on the inside gets a qualified employee, but will miss getting the *most* qualified candidate.
J. K. Collins (Torrance, CA)
@Renee Age discrimination is the last form of discrimination completely allowed (and even praised and encouraged) by every genre of business in America. Doesn't matter whether it is tech, media, retail or education. Doesn't matter that it's against the law (hahaha, try to find a lawyer who would take on a case based on age). Unless you know someone on the inside who is willing to bend the rules, there might as well be a sign outside the door like in the old days...only this one would read "55+ need not apply". The biggest joke and irony: if you look at the "we're hiring" section of the AARP, all you see are young adults in their 20s and 30s.
Marie (Chicago)
I wonder how this system is biased against persons with disabilities. What words could possibly convey the superhuman strength and determination that these individuals need to accomplish practically every single thing that people without disabilities do effortlessly? What words could convey the conviction needed to overcome first their own physical limitations, and then the narrow-mindedness of people who judge someone incapable simply because of a disability? I hope there is a natural positive bias towards persons with disabilities in this system.
Anonymously (California)
@Marie Believe me, if your disability cannot be hidden for the interview, you will not be hired.
Anonymously (California)
@Marie Job descriptions now include “must be able to lift 25 lbs” to get around this. For secretarial positions!
Jack Klompus (Del Boca Vista, FL)
I have two degrees, but I found myself in a situation where I had to apply for minimum wage retail jobs. Up the street was a small, cruddy little Admiral gas station/convenience store which also happened to be part of a chain I'd never heard of. We're talking beer and lottery tickets. There were no Mensa candidates working there, trust me on this. Based on what I saw, tattoos and a smoking habit were major hiring criteria. Turns out I had to apply to work there online and take a long, long personality test. I was rejected. I just wasn't Admiral gas station material.
Jim (Chicago)
@Jack Klompus Two degrees should be disqualifying for quite a few positions, especially if they are liberal arts degrees. And if you're judging people for having tattoos, no wonder you did poorly on their personality test.
Lil_Cicero (Rome, NY)
@Jim Your cognitive bias is showing in your dangerously backward thinking. In reality, employers seek candidates with liberal arts degrees. Those educational programs teach a variety of skills useful in the marketplace: argumentation and rhetoric, holistic and nuanced thinking, communication skills, analytical and critical thinking, as well as research and information literacy skills. Kindly take your head out of the sand.
Smilodon (Missouri)
I had a similar experience. Yet one of my degrees is a CS degree.
Hugh G (OH)
A college professor once told me that computers do five things: add, subtract, multiply, divide, and look at two things to tell whether they are the same or different. Hiring is the most important function of a manager; why would you turn it over to algorithms that have to reduce everything to scanning resumes for keywords and scores on a subjective personality test? The only advantage I see is that it takes the accountability away from the hiring manager: he can always blame someone else when they have to fire someone after 2 weeks.
JustJeff (Maryland)
@Hugh G Actually, just three: add, subtract, and compare. All the rest are just processes built from those: multiplication is just iterated addition, and division is just iterated subtraction. But otherwise, you're completely right.
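A minimal sketch of that reduction in Python (illustrative only; real hardware uses much faster multiplier and divider circuits):

```python
def multiply(a: int, b: int) -> int:
    """Multiplication as iterated addition (assumes b >= 0)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a: int, b: int) -> int:
    """Integer division as iterated subtraction (assumes a >= 0, b > 0)."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient

print(multiply(14, 39))   # 546
print(divide(546, 14))    # 39
```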
Hugh G (OH)
@JustJeff Yes, I guess I remember all of those things from calculus a long time ago, when they showed us how all of the magical math in computers was derived. The most amazing thing is that the people who invented all of this technology did so without the benefit of computers, and the companies that hired them did so without hiring algorithms. We all have brains; let's not replace them with computers.
Patrick Henry (USA)
Here's what I've noticed about hiring teams: they have no clue what they're looking for in a person to fit an open position. Companies/HR use algorithms and online hiring because they have no idea how to recruit or interview in person. I suppose they don't want to invest in the process of building a team. Buzzwords on a resume do not predict how an employee will thrive. Interaction with the candidates, clear communication, solid training, and common goals do. Most companies should do away with HR departments: the random team that sits around, that nobody wants to visit with, and that moves paper this way and that. Wait, there's no paper; the applicant handles most of that online... AN ENTIRE COMPANY should be invested in its culture. Invested employees thrive when they're involved in hiring, company expectations, goal setting, etc. Cut the HR department: the savings can go to higher wages, equipment upgrades, training, team-building activities, etc.
Erin (Louisiana)
@Patrick Henry Google still has HR, but they have an intense multi-round interview process that allows the applicant to be interviewed by the team they'd be working with. I think that's a step in the right direction.
Cathy (Hopewell Jct NY)
I worked for a gigantic company in the 80s and 90s, thought then to be among the best run companies in the world. There was no end in sight to the incredible success, and the directors and executives came to believe that the reason was them. Enter the executive development screening program, which looked for the exact qualities that made the current leadership great, in an effort to create a new generation just as marvelous as themselves from among all the young up-and-comers. And they succeeded! More new directors were just like the old directors. The problem? The company's success was related, not to incredible management, but to huge growth in an untapped market for which there was not a lot of competition. However, over a decade or so, all of that became commoditized, and the old model no longer worked. But the old model of management was there. And today? All those exec search grads are struggling to respond to a market that changed so radically in 3 decades that the industry is not identifiable as the same industry. A little new blood, and a little less cloning, would have served the shareholders better. Beware of self-propagating algorithms. They are frequently based on a misunderstanding of reality, and just move that unreality forward to the detriment of all.
Jim (Northern MI)
HR departments are the problem, as noted by others. I have noted that among staff reporting directly to me, the number one predictor of job success is the ability to correctly perform division and multiplication in one's head. While certainly computers can do it for us so we don't have to, the cognitive ABILITY to do it translates well into so many other areas. Naturally, HR won't approve of my seeing how quickly a candidate can respond correctly to "What is fourteen times thirty-nine?" That's after they forbid me to even consider a brilliant, high-school-only-educated person from the mail room who has for years demonstrated superior problem-solving skills, so that I might interview three semi-literate people who happen to have gone to college for five years. People who aren't accountants, engineers or nurses veto the choices of the accountants, engineers or nurses who hire for their departments. That's because high-paying employment has become America's new private country club, where only those "like us" are invited in. The C-suiters don't care if you're good at the job; they just want to know how much you're willing to be dumped on and how likely you are to sue.
nerdrage (SF)
@Jim That's an interesting test. My favorite is, would you rather be polite or honest? And the right answer depends on the role the person is applying for. Programmer? Honest. Client services? Polite.
Chris Tsakis (NYC-Adjacent)
I’m 57 and fairly certain automated hiring is the key reason I’ve had no luck finding a job through standard channels. I think I’m being filtered out on the basis of age. All my years of experience are actually working against me. The only luck I’ve had since losing my full-time job has come through personal connections.
A (On This Crazy Planet)
@Chris Tsakis Bingo. What you are experiencing is normal in today's job market. Resumes that reflect more than, say, 15 years of experience do not get consideration. If the graduation year is excluded, it's assumed you're too old. If it's included and is more than 15 years ago, you likely won't get consideration. But, if you're 57, you should have a history of working with others. Keep contacting your personal connections. And be as proactive as you can about contacting former colleagues, no matter whether you reported to them, they reported to you, or you were a peer.
misterarthur (Detroit)
@Chris Tsakis Me, too. I've made it to the interview stage (amazing that I got through the automated CV algorithm) but one look at me (I'm 64) and basically the interview is over.
In the know (New York, NY)
@A As a rule it's a good idea to limit your resume to 10 prior years of experience. If you've had more relevant experience you can mention that in the cover letter.
Anonymously (California)
As someone who worked in medical insurance I know that it costs $1,500+ per month to insure someone in their 50s and $350 per month to insure someone in their late 20s. While I personally prefer repairing and adapting the ACA rather than 'Medicare for all', I do want to stress that removing insurance cost from the hiring equation would do so much to level the hiring field for older applicants. A hiring manager recently told me that in a hiring of 8 people they deliberately chose to hire more older people that time because, "even though they do not learn as well," they were having trouble with the 'work ethic' of the younger group chosen previously (tardiness, excessive absenteeism, always on their phones). The belief was that older workers have a better work ethic. Older applicants would have far more opportunities if you completely removed insurance cost from the hiring decision, because their experience actually is valued. But insurance cost negates it. I believe that is one of the reasons that algorithms screen out people over 40 (yup, 40, not 50).
Nikki (Islandia)
@Anonymously I wish I could recommend this comment 1000 times. As long as the burden of health insurance rests with the employer, age discrimination will be a reality. Employers who self-insure (fund the claims themselves and just use the insurance company as a processor) are especially vulnerable to an employee with an expensive illness.
Matthew (Midwest)
@Anonymously So that hiring manager discriminated on the basis of age, except against young people, based on ridiculous stereotypes. Progress!! /s
Hugh G (OH)
@Anonymously Depending on the job, $15K less per year in employee costs can easily be wiped out by incompetence or lost productivity. If you take a $50K/year salary and add Social Security and other overhead, the total cost of an employee can easily increase by $30K per year or more, and at that point the extra medical costs become much less significant. I guess this is being penny-wise and pound-foolish, as they say.
EWood (Atlanta)
I'm so glad to see this topic being addressed. Years ago I got my start in my former profession because I sent a resume directly to a human; I had no direct experience for the position (technical writer) but had just finished a teacher education program and was having trouble finding a full-time position after the school year had started. The hiring manager, to whom I would be reporting, ultimately hired me, explaining "I like hiring teachers because they know how to explain things, which is essentially what this job involves." I doubt such a situation would happen today: my resume would have been filtered out by an algorithm and I wouldn't have been hired. (I ended up being very successful in that field, FWIW.) I've taken years off to raise my children; applying for jobs now is a hugely time-consuming task, with so many organizations requiring you to enter your resume into their job systems. I understand that it may seem faster and more efficient for companies, but I wonder how many applicants rejected by automated systems would have been successful in the roles they apply for. When I hear, for example, about thousands of unfilled technology jobs, I am curious how many would be filled if applicants hadn't been screened out by AI. Technology's a TOOL, like a hammer, that is meant to make a job simpler; however, we have come to treat it as an end in itself, and I'm very concerned we are allowing ourselves to become enslaved to it, much to our detriment.
Brother Shuyun (Vermont)
Given the lack of hope that young people have based on climate change and other devastating realities... And the lack of hope that people in their 40s and 50s have with no health insurance, no jobs and possibly (probably) no Social Security... I declare the time beginning in about 2001 the "Dark Ages" for America. I think the reaction for many people is going to be to just give up. In fact Republicans are counting on it.
Gary (Chicago)
There are ways to check for disparate impact and even infer whether rejected candidates would have done well. Financial institutions do it. Why not companies using hiring software?
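For the curious, a standard disparate-impact check is the EEOC's four-fifths rule: compare each group's selection rate to the most-selected group's rate and flag ratios below 0.8. A minimal sketch in Python (the group labels and counts below are invented for illustration):

```python
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

def adverse_impact_ratios(rates: dict) -> dict:
    """Ratio of each group's selection rate to the highest group's rate.
    Values below 0.8 flag potential disparate impact (four-fifths rule)."""
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# Hypothetical outcomes from an automated screen
rates = {
    "under_40": selection_rate(selected=120, applicants=400),  # 0.30
    "over_40":  selection_rate(selected=30,  applicants=250),  # 0.12
}
print(adverse_impact_ratios(rates))
# {'under_40': 1.0, 'over_40': 0.4}  -> 0.4 < 0.8, so the screen would be flagged
```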
JEB (Austin TX)
This is not just misguided. It is ridiculous.
Suburban Cowboy (Dallas)
Junk process. I have quit feeding this beast. Online applications are infuriating. I feel sorry for those who must use them. Imagine: by using AI and robo-interviewers, the hiring company never has a warm body at whom an accusation of bias or discrimination could be leveled.
Angieps (New York, NY)
It's not the program, it's the options chosen by the implementer, which I believe is the point of this article. It just goes to show that old fogies and Luddites are right. Software and equipment may be more efficient, but no machine can truly replace the human element.
Lauren Nolls (USA)
My company recently told us to think about “culture add” rather than “culture fit”, which is quite the opposite of what the AI would do. We’ve added some excellent new hires. We’re a tech company.
pinewood (alexandria, va)
Professor Ajunwa has made an important, long overdue contribution by exposing the insidious use of automated job interviews. Many of the firms peddling these algorithms claim to be using artificial intelligence methods, as if the users should be bowled over by the mere mention of AI as the new standard for excellence. In fact, the "algorithms" being touted are a corruption of AI, which begins with a blank slate and a set of objectives, and proceeds to use the activity, in this case an interview, to determine whether the applicant is suitable for the job. This is preposterous. What the "algorithms" actually do is use a database of prior interviews vs. job performance to score the new applicant, without a whit of consideration as to how or why the applicant cannot be judged by the database. The most recent misuse of AI for job interviewing is the case of IBM using its AI program, Watson, to predict which employees will quit. This is another example of the HR quackery that Professor Ajunwa has kindly exposed.
Ted (California)
Anyone who has looked for a job in the past decade is all too familiar with applicant tracking systems, of which Taleo is the most reviled. They convert an uploaded resume into succotash, demand the same information multiple times, and challenge the applicant to figure out what error caused it to reject and erase entire pages. After spending hours entering and re-entering all the required information, two minutes later comes an e-mail: "After thoroughly reviewing your application, we have determined that it does not exactly match our requirements." Or far more often, the application disappears into a black hole. Taleo came into vogue during the Great Recession, when employers felt besieged by hordes of applicants storming their fortress gates. Taleo's "artificial intelligence" was a defensive moat that handily drowned most of the applicants. A downsized HR staff would only see those few applications with keywords that sufficiently matched the lengthy list of "requirements," often an impossible melange of wish lists and two or more disparate jobs. As it was an "employer's market," the user-hostility was a feature rather than a bug, selecting persistent, compliant applicants willing to endure being treated like trash. Taleo is a symptom of a completely dysfunctional employment market. It shortchanges employers and applicants alike by immediately turning away the best applicants, who refuse to tolerate the dehumanization and disrespect "Taleo" has come to signify.
Josie (San Francisco)
@Ted Completely agree with this. With 20 years of experience with the same employer, I decided, several years ago, to make a change. Dealing with Taleo (which, as far as I can tell, is used by *everyone*) was a nightmare. I researched how best to apply using the tool and made sure my resume contained all the "buzz words" for the position so that it wouldn't get rejected by the software, but watched, again and again, as I received automated messages saying that I was not qualified (or, bafflingly, in some instances, that I had withdrawn my application, when I hadn't). This, at companies that I knew were desperate for staffing. Whether or not I was a good fit for the company is certainly a valid question, but based on my skillset, I should at least have gotten an initial interview so that determination could have been made. But I couldn't get past the darn machines. Even at my current employer, I applied once for a similar position in the same department and it went nowhere. What was ultimately successful was being able to get my resume into the hands of an actual person at the company. Once I did, I got an interview and was hired in a flash (and promoted less than a year later). And when I told them that only a month or so earlier I had been rejected by their software, everyone was shocked. Technology is not always better.
Tamza (California)
IF applicants had the guts to boycott companies using these techniques -- we might see a change. Tech workers [the ones doing the AI etc work] MUST form unions to get better treatment. AND universities should control the numbers admitted to these programs, just like med schools.
beth (princeton)
I was "invited" to a "video interview" with Unitedhealthcare...by computer, using AI technology, not a human being. In researching this, I learned they record and analyze every observable thing...eye blink count, micro-inflections in voice, and of course facial expressions...and I am sure that is all analyzed to determine age, ethnicity, and who knows what else. United makes its money by packaging and selling data. There is a good possibility that all the data collected by this insane and utterly dehumanizing approach will be used in ways I can't even dream of. Though I have been unemployed for more than a year, I declined, passively. I could not even reply to the invitation from "Recruitment"; my email was returned undeliverable. Are there words for this? Are there words? For this??
Eric (Texas)
Besides worrying about machines becoming smarter than humans, we should also worry about humans becoming dumber than machines. This is an example of the second case.
AIM (Charlotte, NC)
The paid employees of big corporations on Capitol Hill and in D.C. will bend over backwards to help the "business". Don't expect any federal laws that will help job seekers. You can thank the intellectually challenged voters for it.
A. jubatus (New York City)
An automated hiring algorithm is a tool, like a hammer or a car. The utility derived from them relies solely on the people using them. If you're a lousy driver, owning a Ferrari is not going to help you. We tend to get caught up in the idea that a tool can save us from the hard work of making, in this case, good, unbiased hiring decisions. Having worked in that field for a long time, I think I know why: it's because many recruiters and hiring managers are not good at making hiring decisions. They think the new tool will save them. But they'll just crash it into a tree and then blame it for their shortcomings.
No big deal (New Orleans)
Lol, the writer of this piece doesn't give a hoot about the employer who is looking for the best applicant for the job. She just seems interested insofar as it helps the employee. This is backward thinking, as if employers are just engaged in "make work" jobs for people to fill instead of actual jobs that require the best employee. Or do they? Perhaps that's what the writer is arguing for, "the good enough employee"?
Photogirl (Norristown, PA)
@No big deal Hmmm...I don't see evidence that the author is proposing things just to keep people employed at the expense of employers getting the best workers. The author is pointing out that algorithms can perpetuate existing discrimination by excluding even excellent, qualified potential hires based on race, gender, age, and other factors. (Your argument suggests that these are the factors that pinpoint employees who are not "the best.") The author is also suggesting that these algorithms don't account for individual issues that can't be quantified by an algorithm--like, say, the mom who has gaps in her employment due to child-rearing, eldercare, etc.--yet is still a totally qualified applicant. I myself could have easily fallen into that eldercare gap, even though I have excellent, 100% up-to-date skills in my field. If you got sick, you too could fall into that gap. The point of the article has nothing to do with hiring unqualified people.
Rick Tornello (Chantilly VA)
As a technical recruiter I dislike the use of AI tools to the exclusion of old-fashioned recruiting skills. Too many people rely on them alone to make the interview decisions. I could go on with examples of, and reasons why, poorly written job descriptions placed in an AI-type database search system fail. My partners and I built and sold one of the first national resume database systems in the US. It still required the use of the human brain to review and consider the request versus the verbiage in the position description.
bronxbee (bronx, ny)
there are many types of jobs that are basically "people" skills jobs -- executive assistants, sales people, medical techs -- online automated job applications have no way of conveying those skills to a prospective employer. the supposed "neutral" questions on an online application do not measure the ability of a person to think on their feet, stay calm in a crisis, work well with clients. the only things these programs do are (1) discriminate by age, (2) discriminate by previous or desired salaries, (3) discriminate by online media presence (online too much or not enough). for example, most people applying for a job at, say, a firm with heavy document correction tasks have a firm grounding in basic tasks... so what can they possibly find in an algorithm that can help them decide? can the person proofread as they go along? recognize editing symbols? adapt to the person who does a million inserts and writes on the back of the document? the questions asked do not allow for individualized responses as to strengths and weaknesses -- only what a computer can deal with. although in my experience, interviews with people for jobs were similar in the nonsense they made decisions on. how many words a minute can you type? what does that prove? after the tests, no one ever asked you that again. how much money do you want? too much, you're out. it's always stacked against the employee, no matter the method.
Alexia (RI)
Good comments against automated hiring. But please remember that companies only exist because you buy their products, and that therein lies a solution.
Stephen Merritt (Gainesville)
Garbage in, garbage out.
Nancy Robertson (Alabama)
Automated systems should play absolutely no role in hiring decisions.
Eric (Texas)
The dangers of AI are usually described as the 'singularity', when people are no longer smarter than machines. There is also a slow erosion of our individuality, independence, and humanity when we allow machines to make decisions affecting people's lives.
CR (Seattle)
Generally speaking, job seekers have better luck with public agencies and small employers who do not use this pernicious and humiliating technology. However, if you must apply with employment agencies or large companies, never, repeat never, show a "gap" in employment. If you have one, cover it by creating your own consulting or other self-employment entity. For the most part AI can't tell the difference.
michaelscody (Niagara Falls NY)
"this new doctrine would allow for the burden of proof to be shifted to the employer." In other words, guilty until proven innocent? I think not!
Kevin Brock (Waynesville, NC)
@michaelscody We need to always give the benefit of the doubt to the employer instead of to a prospective employee? We should choose the power of the corporation over the rights of individuals?
Rick Tornello (Chantilly VA)
@michaelscody Burden of proof on the employer? That's not going to work, especially with some of the databases. And some of the positions are so unique that there may only be a few candidates who fit the minimum requirements; for example, the candidate must have a current full-scope polygraph from a specific agency and be able to code blue unicorn and AWS. They don't care if you wrote the book; no tickets = no interview.
michaelscody (Niagara Falls NY)
@Kevin Brock If I accuse someone of something, be it employment discrimination or any other offense, it is now and always should be up to me to prove he is guilty; not for the accused to prove he is innocent.
Nat (NYC)
Is automated hiring really that common? I don't think so, but the author doesn't say.
Louis (Denver, CO)
@Nat, For small businesses automated hiring may not be that common. However, for large companies it is fairly common, if not standard, and is becoming more prevalent at medium-sized companies as well.
Karen (CA)
@Nat In the large corporation at which I worked, applications were entered online and screened by a computer program. Someone in HR would go through them manually to flag some for rejection (most frequently those rejected had qualifications well below those required of the position) or for further review. As a hiring manager, I would then review the relevant applications. People were subsequently screened by phone, interviewed in person and references were called. The field is moving quickly though, so it wouldn't surprise me if more and more of the steps were given over to the software. Even the first screening step has the potential for discrimination.
Rick Tornello (Chantilly VA)
@Nat yes it is. It also allows for espionage. The data people put on some of these AI databases is astounding.
Nick Corcodilos (NJ)
Automated hiring is one of the biggest rackets going in America today. Database jockeys selling phony "solutions" to a very gullible HR profession. Yet there is nothing in the research literature to suggest any of these AI systems are valid, much less reliable. Ajunwa is not the first to rattle this cage. See https://www.asktheheadhunter.com/13461/ai-robo-interviewer
Rea Tarr (Malone, NY)
What laws would prevent unlawful employment discrimination? When a job applicant is passed over, who makes the decision that he or she was a better candidate than the person hired? How do we force someone to hire a guy who makes him feel comfortable? How do we demand that a woman employ someone who she senses is a jerk? What law will ever keep us from choosing whomever we feel like choosing?
Ed (Small-town Ontario)
Age discrimination is rampant, and automated screening simply enforces it. The underlying problem is health care. When I was attending company-sponsored outplacement classes held in Michigan in 2014, we were specifically told not to put more than 15 years of experience on our resume, and not mention any graduation date that could put us in the dreaded "over 50 job seeker" category. From a career including a decade of Corporate Finance experience, I believe that the ageism is driven by a desire to avoid the health-care costs associated with insurance for older workers, which can go north of $20K per employee over age 50. The lack of universal health-care coverage and the outrageous costs of the US health care and insurance system(s) are the root of the problem.
Carl (KS)
"According to one lawsuit, a college student with a near-perfect SAT score and a diagnosis of bipolar disorder found himself rejected over and over for minimum-wage jobs at supermarkets and retail stores that were using a personality test modeled after a test used to diagnose mental illness." Is it possible the SAT score was the disqualifying factor? Most businesses would consider the likelihood (or lack thereof) of the applicant being satisfied, and staying, with the job after being trained.
Viv (.)
@Carl No, it's not possible that the SAT score was the disqualifying factor. Supermarket job applications don't ask for SAT scores. They do administer personality tests, allegedly to determine if you're going to steal their products - as if they don't have in place video cameras and security to notice when somebody is stealing.
michaelscody (Niagara Falls NY)
@Viv Security cameras only cover the areas they scan and only really work after the theft has occurred. Prevention is always better than punishment.
bronxbee (bronx, ny)
@michaelscody are you assuming that bipolar people (many of whom take medication that helps control the condition) are more likely to be thieves than those who aren't bipolar? or that those with high SAT scores are more likely to be thieves or bipolar?
Voldemort (Just Outside Hogwarts)
Dr Ajunwa wants a magic bullet, something to guarantee that no one is ever discriminated against, ever. One might ask whether Dr Ajunwa has ever run a full-time business, with more than 100 employees, for several decades, to have experienced the world of hiring and firing over a span of time during which Dr Ajunwa could learn why businesses operate the way they do, instead of clinging to the conspiracy of racism, sexism, and ageism. Here's another approach that seems to be beyond Dr Ajunwa's ability or imagination to consider. What if there were 10 times as many employers, employing 10 times as many people, as there are now? In such a world, employers would need to hire nearly everyone walking by their doors; what effect would this have on the hiring process? Perhaps Dr Ajunwa should study that possibility, including the things today that prevent it from happening. Then maybe instead of demanding government intervention, there would be an actual long-term solution.
Kevin Brock (Waynesville, NC)
@Voldemort No. Dr. Ajunwa does not want to see automated hiring/screening systems whose algorithms violate Federal and state anti-discrimination laws.
Rick Tornello (Chantilly VA)
@Kevin Brock and in my world there are more openings than there are people who meet the bare-minimum, walk-in-the-door-tomorrow qualifications.
Alfred E Newman (New Jersey)
It's no wonder we have such a large and growing homeless population in America. Barriers to employment entry are stacked so high against today's job candidate, not to mention ongoing systemic layoffs.
FR (USA)
Algorithmic hiring already does make it worse. Many of the input forms for these algorithmic platforms insist on age or demand indications of age, allowing employers to discriminate with impunity--not that they don't already. Some outdated forms still ask for salary history, even in states that have outlawed such inquiries. Although disability may cause gaps in employment history, most online forms don't take that into account. An algorithm may downgrade an applicant for that gap that disability caused. Deploying a discriminatory algorithm should be as impermissible as allowing an employee to do so by bias.
exBCer (Burlington, MA)
I once applied for a job where the hiring manager knew they wanted to hire me for the position. They actually filled out all the questions from HR on my behalf. I was rejected as being not suited for the position.
Latisha (Newark)
I think automated hiring helps an employer weed out thousands of applicants based on many factors that lead to bias. "Black": automated no. "Chinese": automated yes. Without face-to-face interaction, it's difficult to prove there is bias; however, it's always a way to let the company do whatever it wants, even if it's discrimination.
Howard (Los Angeles)
Back when I was in school, I and a friend wrote to a southern White Citizens' Council asking for support for our "invention," a cigarette machine that used a photocell to determine the skin color of the person inserting the money. We said it could dispense the cigarette in one of two trays: one labelled "white" and one "colored." We called ourselves the Jacques Corbeau Manufacturing Company. The Citizens' Council replied thanking us and praising our ingenuity, though regretting that they couldn't fund us. But that was a students' hoax; what Professor Ajunwa is describing in this article is all too real. Humans who discriminate can be called to account in court. But uncovering the intention of an algorithm is much harder, and beyond the resources of any individual.
Viv (.)
@Howard Video interviews are automatically put through facial recognition software, where they can immediately determine the race and age of the person. Facial expressions are analyzed as well, so that only the most photogenic of individuals make it past the screening process. The irony of these video interviews is that it just screens for people who are very good at video production instead of being good at their real jobs - like financial analysis at Goldman Sachs. Unless you're hiring somebody to be on tv and read a script, they're useless at determining who's actually qualified for a job. The accounts of Instagram "influencers" are routinely debunked as fake by people who inevitably catch their unedited photos/videos.
Susan (Boston)
Any job seeker, however well qualified, over 55 will tell you that he or she rarely or never gets beyond the online application stage. One college graduation date and you're done for. And that is so basic it doesn't seem to warrant the label "algorithm." So the more nuanced stuff is far more ominous. I don't even think all these algorithms are serving employers well, just absolving them of responsibility.
FR (USA)
@Susan You're right, but it's more insidious. An algorithm can be tweaked to discriminate without leaving much evidence behind, save the end result. Companies now get away with non-algorithmic discriminatory hiring proxies, e.g., by national origin (only hiring or interviewing from certain regions), by economic class (only hiring from "elite" schools), or age (as you mention). Imagine how much easier that will be when hidden behind an algorithmic door.
Snowball (Manor Farm)
Let companies hire people however they want. The companies whose hiring practices bring them the best people will thrive. Those who rely on computer algorithms will surely pick less qualified people and will not thrive. In time, everything will shift to the old-fashioned way of hiring, because it is obviously and demonstrably better.
Chorizo Picante (Juarez, NM)
There is not a single concern about "automated hiring" that does not apply identically to non-automated hiring. "Algorithms" are nothing special. They are just rules set by people, and they function according to the "garbage in, garbage out" principle. What the author is complaining about is called a "disparate impact" theory of discrimination, in which using neutral criteria can have a bigger effect on protected groups. Nothing new about that either. And there already are federal laws and EEOC regs requiring employers to keep demographic and other data on applicants.
FR (USA)
@Chorizo Picante Actually, new AI algorithms do implicate problems that don't arise in non-automated hiring. New AI algorithms that "learn" from data may arrive at discriminatory results independent of algorithm design. Indeed, an algorithm trained on U.S. hiring data would probably be trained accidentally to discriminate at inception, especially in Silicon Valley, where discriminatory hiring practices are the rule. The Federal laws and EEOC regs with respect to illegal human discrimination are largely ineffective, especially against smaller employers. Those laws and regulations will be even less effective against obscure algorithmic choices that derive from the datasets of a largely discriminatory society.
Jonahh (San Mateo)
Basically, the world has come down to networking to get a good job, and that ensures (in general) that white people will always hire white people unless forced to meet a quota. What's worse is that if an applicant makes it through the biased automated system the hiring manager/teams will scan social media accounts and other online presence. This is an easy and truly undetectable way for a candidate to be disqualified based on race, sexual orientation, politics or any other criteria the company (unofficially) considers 'undesirable'. Very, very sad state of affairs.
KM (Pittsburgh)
@Jonahh If white people will always hire white people, then how do you explain the number of asians in successful companies, especially in Silicon Valley?
Laume (Chicago)
Great way to discriminate against older and/or more private people too.
Eben Spinoza (San Francisco)
Milton Friedman long pushed the idea that the sole obligation of the corporation is, in machine-like fashion, to maximize the "value" delivered to its owners. So it makes sense that cheaply identifying new parts for the machine (and disposing of those that become worn out or expensive to maintain) is now being automated. The problem isn't just that there's bias in the automated selection systems; it's the premise that they are built on. Alas, there's no escape from the machine world of Triangle Shirtwaist 3.0.
RAR (Los Angeles, CA)
I am no fan of ATSs (applicant tracking systems), because they are deeply flawed and have been shown to screen out very qualified candidates. However, human beings can be just as biased or worse. There was a study done back when human beings reviewed resumes which found that resumes with male names received more calls for interviews than those with female names with the exact same experience. Even if you pass through the system, the human process of interviewing is biased, and many people are often rejected due to lack of "cultural fit," which may be conscious or unconscious bias. Many companies are not deliberately trying to be biased; I don't think most understand exactly how these systems (which they buy from third-party vendors such as Oracle) work. These systems should be audited, and if bias is found, the organization should be required to remedy the problem.
bbwhitebook (Paris)
This is not good: "We should change the law to allow for a third method for plaintiffs to bring suit under the 'discrimination per se' doctrine. As I describe in a paper, this new doctrine would allow for the burden of proof to be shifted to the employer." The last sentence contradicts, in fact assaults, basic principles of law. The plaintiff in a civil suit, like the state in a criminal action, must prove that the defendant has done wrong or harm. This is not a solution: it turns an accusation into a presumption of fault or liability. And in many cases it would require the defendant to prove a negative. Lots of luck.
GerardM (New Jersey)
Automated hiring involves both the specification of jobs with highly specific tasks and the automated hiring algorithms that depend on that specificity to cull the worker with the best fit. That's one reason why a person applying for a cashier's job at Wal-Mart who has a Masters in Accounting would be bounced, not as unqualified, but as a bad fit. The deeper issue here is that the nature of many jobs is being designed to fit the hiring algorithm, not the other way around. That's one reason that, while there are many jobs available, there are progressively fewer good ones.
Eric T (Richmond, VA)
Seems that a good method to test these automated systems would be to feed the company's existing employees' resumes into them and see just how many of them would be hired now. As the HR department has a record of employee accomplishments from performance reviews, it should be easy to see just where the system fails to find the best candidates.
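Sketched out, Eric T's back-test is simple: run the people you already know are good through the screener and count how many it would have rejected. A minimal illustration in Python (the screener rule and the employee records are hypothetical stand-ins for whatever a vendor's system actually does):

```python
def screener_passes(resume: str) -> bool:
    # Placeholder for the vendor's black box; here, a crude keyword rule.
    return "python" in resume.lower()

# Hypothetical current employees with their latest performance ratings (1-5)
employees = [
    {"name": "A", "resume": "10 yrs accounting, built Python reporting tools", "rating": 4.8},
    {"name": "B", "resume": "RN, 20 yrs clinical and charge-nurse experience", "rating": 4.9},
    {"name": "C", "resume": "Recent grad, Python bootcamp certificate",        "rating": 2.1},
]

# High performers the automated screen would never have let in the door
false_negatives = [
    e["name"] for e in employees
    if e["rating"] >= 4.0 and not screener_passes(e["resume"])
]
print("Proven high performers the screener would reject:", false_negatives)  # ['B']
```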
In the know (New York, NY)
HR resume systems have been flawed for years - even the people working on them agree. Hiring managers complain that HR doesn't present impressive candidates, whose resumes they pluck from these same systems. I once received an automated rejection one day after applying for a job online - this was after my contact instructed me to "get my resume in the system" as he passed my credentials along to the hiring manager (who called me a day later to schedule an interview). Oh, and don't get me started on online application portals with glitches. Best way to find a job is through personal contacts. Was true then, is true now.
TLC (Omaha)
I am self-employed and have been working out of my home for the past 11 years. By happy accident, last winter I discovered contract work opportunities posted on several job sites. I thought getting contract work would stabilize my income and open a new world of work opportunities for me that aren't available where I live. First step: revise my resume. I hadn't done a job search since I started my business, so I had a lot to learn. Revising my resume to get past the scanners has proven to be the biggest challenge in this search. I have had six corporate recruiters review my resume, and they suggested only minor changes. The pay rates offered in these ads are within the range that I'm seeking. I have 36 years of experience. I am not seeking benefits. I customize my resume and cover letter for each application. However, the only successes I've had have come through personal contacts and being contacted by recruiters. I cannot seem to make it past the scan-the-resume phase. I have never had this problem in my career, until now. It seems that the most important part of the hiring process is trying to guess the right keywords and loading up your resume with them to get through the scanners. How does copying and pasting a bunch of phrases from the job description (a highly recommended practice, BTW) present candidates in their best light and show their talents/skills? Anyone can do that.
J Shanner (New England)
@TLC I've had a very similar experience. In one case where I was told that I was by far the most qualified applicant, I was unable to land an interview. Algorithms are not unbiased. They reflect the biases (conscious or not) of their creators. There is a definite bias against anyone over the age of 40, and against women. Maturity, accomplishments and sterling credentials count for nothing. I've come to suspect that these platforms are intended to suppress wages.
Charlie (CT)
@TLC Consider also the waste of the candidate's time. I wonder how many tens of thousands of hours have been spent revising the same resume for each idiosyncratic cluster of key words in similar job specs. Add to this the rising tendency of companies to post jobs that are not really open to outside competition, and all the embedded biases mentioned in the article and it increasingly appears that responding to these "jobs" is a waste of time. The only exception might be for post entry level slots. If companies really do want to recruit a broad spectrum of skills then human screeners need to be the gate keepers, assuming they can discern which job experiences are transferable to the new position. Unfortunately, most of the humans involved in the process are too junior to make such a determination. Expecting applicants to adjust CVs for every job they might apply for is a huge, useless time sink, and a theft of a candidate's time.
A (Detroit)
@TLC Be wary of pasting job descriptions right into your resume--sometimes the technology alerts the recruiter who knows you are "gaming" the system. I'd suggest getting your resume formatted for ATS specifically.
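The "scan-the-resume phase" TLC and others describe is often little more than keyword counting, which is why pasting phrases from the job description works and why it says little about actual ability. A minimal sketch of that kind of filter (the keyword list and threshold are invented for illustration):

```python
import re

def keyword_score(resume_text: str, keywords: list) -> int:
    """Count how many required keywords appear in the resume text."""
    text = resume_text.lower()
    return sum(1 for kw in keywords
               if re.search(r"\b" + re.escape(kw.lower()) + r"\b", text))

job_keywords = ["stakeholder management", "agile", "sql", "kpi"]  # hypothetical posting
threshold = 3

resume = "Led cross-functional teams; deep experience with databases and reporting."
tailored = resume + " Stakeholder management, Agile, SQL, KPI."   # same person, pasted phrases

for r in (resume, tailored):
    score = keyword_score(r, job_keywords)
    print(score, "-> interview" if score >= threshold else "-> auto-reject")
# 0 -> auto-reject
# 4 -> interview
```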
James (WA)
I'm confused. We create this new innovative system of automated hiring. We praise how much more efficient and effective it is compared to human beings... but it still has bias and problems. Automated hiring clearly will lock some people, including very hard working and competent people, out of opportunities. Having a computer determine the worth of prospective employees is dehumanizing. It's an excessive and stupid obsession with efficiency. So to correct for all that, we are going to keep automated hiring and add safeguards on top of it. Presumably to stop age and racial discrimination, but probably little else. Why not just have human beings hire other human beings? Why does everything need to be turned into an online or artificial intelligence platform? Robots are a poor substitute for people. Often the old fashioned analog way is much better.
ChicagoMaroon (Chicago, IL)
@James I empathize with your sentiments. There are definitely areas where automation proves beneficial. But areas like hiring a workforce for a company (a 'company' is a business enterprise composed of a company of people; this is something everyone forgets) should be handled by humans. The other problem is the prevalence of 'efficiency addicts.' These people grade every activity or event in life as efficient or not efficient. If an event is not efficient, make it efficient; if an event is efficient, make it more efficient. On and on we go.
Jim (Northern MI)
@James Because, James, we're no longer people. We're "human resources", no more valuable than a chair or a coffee pot.
James (WA)
@Jim I think the coffee pot may be more valuable.
Kiki Gavilan (Oakland)
Totally agree: “We should change the law to allow for a third method for plaintiffs to bring suit under the “discrimination per se” doctrine. As I describe in a paper, this new doctrine would allow for the burden of proof to be shifted to the employer.” In almost all ways, employment laws have not kept up with technology or evolving norms and the “rights” of women and people of color are meaningless without access to data, ability to enforce or seek remedies. This must change.
KM (Pittsburgh)
@Kiki Gavilan So guilty until proven innocent? Sounds great, if you want to undermine the fundamental principle of justice that our legal system is based upon.
XXX (Philadelphia)
So many qualified candidates are being overlooked for many positions, leaving a pool of applicants that is less qualified than it should be. I've seen, first hand, this occur at a high tech firm many times over. There is too much reliance on AI/ML when dealing with large populations. I build neural networks, and the training data needs to be perfect to get reasonable results. I suspect there is such a rush to implement AI solutions that the training data ends up flawed and shallow.
Rick Tornello (Chantilly VA)
@XXX the search engines in most of the AI online systems are low cost and almost useless. The good ones are way too expensive for the database companies to invest in and license out.
Eric (Texas)
Automated platforms for automated people. We should just make it illegal to use 'automated platforms' as the basis of hiring decisions. This is regimentation similar to that practiced in China.
Eben Spinoza (San Francisco)
Predictive analytics, which is really what we're talking about with the so-called "hiring platforms," has the nasty feature of reinforcing the status quo. The first gusher of detailed, individualized, and plentiful consumer data was discovered when Diners Club invented the credit card in the early 50s. Not long after, Fair, Isaac and Company developed its credit rating analytics product (now known as FICO). At first loan officers used the rating to augment their judgement. In those days, mortgages were often issued by small local Savings & Loans (remember them?), so the officers were frequently acquainted with their customers. Eventually, of course, a good FICO score became a requirement for a mortgage. Funny thing is that the score packed within it all sorts of social phenomena, and helped to make redlining easier and "more objective." That funny little score turned out to shape our cities and neighborhoods for the past 70 years. Automated hiring is just the next step in the development of our great Panopticon: sneeze in the wrong direction and you'll be out of work. We're being bent, spindled and mutilated.
Chorizo Picante (Juarez, NM)
@Eben Spinoza The problem is that reality is not politically correct. We want to believe every person and group is equally qualified and creditworthy. The objective data say it aint so.
FR (USA)
@Chorizo Picante Not quite. The "objective data" that you say indicates that every group is not equally qualified and creditworthy derives from bigoted social practices that gave rise to the data in the first place.
Phytoist (USA)
Actuarial, mathematical, statistical, and algorithmically derived probabilities churned out by computers have no capacity to understand true biological facts and their effects on living organisms in any real sense. Instead of limiting their control to the indicative models their data feeds produce, we as humans are subjecting our own lives and destiny to them. Let the tools we use to make our lifestyles a little easier be tools only, not our masters in command. Phytoist.
François I (Fontainebleau, France)
I wouldn't be surprised if the companies that create automated hiring software also start to create software for candidates to beat the very system that they created. In fact, the beat-the-system software sold to applicants may be worth far more, given how many people would purchase it. Make money on both sides of the equation. Capitalism at its very best?
Robert M (Mountain View, CA)
If there were a genuine shortage of labor, companies would not be using arbitrary criteria like high school graduation dates and employment gaps to automatically screen out job applicants; nor would they reverse auction jobs by asking "Minimum salary accepted?" in online forms. These methods flourish because of a surplus of labor in most occupational specialties at most experience levels. The broadcast media like to repeat the labor shortage myth, quoting company press releases and government unemployment statistics that ignore the low labor participation rates that have been occasioned by these automated screening practices.
JustJeff (Maryland)
AI can introduce accidental biases which can be very difficult to identify, let alone remove. For example, it was puzzling for years why face recognition software could identify white faces better than black ones, until it was finally determined that when the prototypes for the software were developed, the companies building them used the faces of their own teams to 'train' the software to do its job. Unfortunately, those teams were mostly (and in one case entirely) white, thus introducing an unintended bias into the systems. Those same systems are now being retrained using faces of people of color and they're getting better overall, but it took over a decade to realize even where the problem was.
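The mechanism JustJeff describes can be shown with a toy model rather than faces: if one group dominates the training data, the model fits that group's patterns and is less accurate for everyone else. A hedged sketch in Python using synthetic data (all numbers invented for illustration; it assumes scikit-learn is available):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic 2-feature data; the true decision rule differs by `shift` per group."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
    return X, y

# Training set dominated by group A (95%); group B underrepresented (5%)
Xa, ya = make_group(1900, shift=0.0)
Xb, yb = make_group(100, shift=1.5)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on balanced held-out sets for each group
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
# Group A's held-out accuracy comes out well above group B's.
```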
Mark (Wyoming)
Creating a third path to pursue litigation for biases, real or imagined, seems highly inefficient and costly. Instead, why not have a certification program where the hiring program is reviewed by an outside party and, if deemed fair, is shielded from litigation as to its fairness? If the conclusion is that no testing service could determine whether the program is fair, then litigating something that no one can get right seems unjust as well.
Mike (Vancouver, Canada)
Several comments here cite the old "garbage in, garbage out" aphorism as if it applies to these flawed algorithms for screening job applicants. But in the GIGO aphorism the "garbage in" is the data, not the algorithm or method of processing the input. This is ironic because the GIGO point of view assumes that the algorithm or method would produce a great outcome if only it was not fed useless data. In the case of automated hiring platforms, the data are just fine, but the method is broken.
SDC (Princeton, NJ)
@Mike part of the data passed to the algorithm are the parameters for the search. This could also be part of the G in GIGO.
Mike (Vancouver, Canada)
@SDC Nope. "Parameters" = "part of the model"; "data" = "what the model is compared to". I teach this elementary distinction to my students. In this case, "parameters" are set by the company running the algorithm, and the "data" are provided by the applicants. I agree that only one of them is garbage in this case, but that's not what GIGO was meant to encapsulate.
Shend (TheShire)
In the early 1990s I was involved in something similar in the insurance industry: automated underwriting, which removed human underwriters from risk selection. It worked. In terms of profitability and selecting the best risks, we found that automated underwriting significantly outperformed human underwriting. However, just like automated hiring, automated underwriting "automated" a certain amount of the traditional risk discrimination that was there under human underwriting, but gave the insurer more cover with regulators by saying that humans weren't making the calls, it was the underwriting model/software, which the insurer would state was discrimination-blind with respect to such things as race and gender. But models can be developed to discriminate against anything, including race and gender, without ever being provided that specific input, and that's the issue.
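Shend's last point, that a model never given the protected attribute can still reproduce the disparity, is easy to demonstrate with a correlated proxy such as zip code. A minimal synthetic sketch in Python (every number here is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Protected attribute (never shown to the "model") and a correlated proxy, e.g. zip code
group = rng.integers(0, 2, n)                               # 0 or 1, balanced
proxy = np.where(rng.random(n) < 0.95, group, 1 - group)    # agrees with group 95% of the time

# Historical decisions were biased against group 1
historical_approval = np.where(group == 0, rng.random(n) < 0.7, rng.random(n) < 0.3)

# "Model": approve at the historical approval rate observed for each proxy value
rate_by_proxy = {p: historical_approval[proxy == p].mean() for p in (0, 1)}
approved = np.array([rng.random() < rate_by_proxy[p] for p in proxy])

for g in (0, 1):
    print(f"group {g} approval rate: {approved[group == g].mean():.2f}")
# Despite never seeing `group`, the model approves group 0 roughly twice as often as group 1.
```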
rb (Boston, MA)
The hiring algorithms and personality tests companies use to screen potential employees are the ideal way to homogenize the workforce. Algorithms don't assess passion, energy, drive, creativity, talent, determination, resilience, or life experience. If anything, they'll cull those who march to a different drummer, the rebels and dreamers most likely to see the world in unique and innovative ways.
jrk (new york)
Algorithms are already discriminating against older workers. They have been for years. Try getting an interview with a graduation date earlier than the mid 80s. It is allowing a generation of hiring managers to be as biased as any racially discriminatory employer ever was. And they do it guilt free because how can a machine be biased? The younger generation has decided that computer skills are more important than having a moral compass or a soul.
SDC (Princeton, NJ)
@jrk Don't aim your ire too hard at the younger generation. For the most part they are just trying to survive in the world they were handed, the one where profitability is more important than morals or soul. Younger workers are cheaper. Period. End of consideration.
Louis (Denver, CO)
@jrk Younger workers have their own challenges on the application side. Try getting an interview when you have just graduated from college and don't have 2-3 years of experience doing the exact job you are applying for.
Laume (Chicago)
They can get an internship. (Try finding an internship that doesn't specifically say it's for “current students or recent grads.”)
rbkorbet66b (elvislives)
I'm 56, female, have a BA, MA and PhD, and 30 years of experience in a number of industries in high-profile roles. It took nearly a year for me to find the temp job I now have, and that only happened after getting a foot in the door at a hiring agency; there's no way I fit the criteria these job bots apply. This practice also deprives companies of experience and temperament that only come over time.
Shend (TheShire)
@rbkorbet66b "...experience and temperament that only comes over time." I'm in my 60s, and I used to think that too, that experience and temperament only come over time. But, ever since the election of our current 70 year old President, and seeing how many of his even septuagenarian and octogenarians party associates have behaved I no longer see such things as experience, temperament, tone and even maturity as a function of age and years of experience. Seriously, I wonder whether the current political situation has reduced in the eyes of employers and their automated hiring models the value that an older worker can bring to the table, if they no longer believe that age of applicant is a predictor of temperament and maturity as it once was.
Ivy (CA)
@rbkorbet66b Me too. I cannot tell if age or education or caring for my Mother is the kiss of death, all three I guess. We should start a Soon to Be Homeless Ph.D. club.
Jack Connolly (Shamokin, PA)
As we turn over more and more of our data management to computer programs, we accept more control from them. I am reminded of a quote from an old "Star Trek" episode. Mr. Spock remarks, "Computers make excellent and efficient servants, but I have no wish to serve UNDER them." HR departments use "scanning" software to cull "bad" applicants from the hiring process because they simply don't want to do the heavy lifting of reading resumes and interviewing dozens (if not hundreds) of applicants. But I believe there is a deeper, more cynical reason. Businesses are not looking for reasons to HIRE people. They are looking for excuses NOT to hire people. Payroll is an expense. The fewer people you hire, the more profit you make. Check out your basic job description. Employers want the mythical 22-year-old with 20 years experience who's willing to work for minimum wage. They hire no one, and then they whine about how they can't find "the right people." I made the mistake once of going to a "career coach" and letting him re-write my resume. The result was a document loaded with "keywords" that did NOT accurately represent me. It wasn't even grammatical. As an English teacher, I was deeply offended. I have also learned through bitter experience that "not a good fit" is HR legalese for "You're too OLD!"
Al (Seattle)
Automated hiring is a big problem for the federal work force. Prospective employees must fill out a questionnaire and rank their experience level at different tasks. If you do not rank yourself highly in all areas you will not make it to the next stage which is having your resume scanned for keywords related to the questionnaire. This encourages applicants to lie about experience on the test and in their resume so that they will make it to the next hiring stage.
Liz (Montana)
Applying for jobs takes a huge amount of time and energy. Oftentimes employers aren't just asking for a resume and cover letter; they want an additional application filled out that repeats all the information in your resume. Or there are short essay questions to answer, personality tests, and productivity/skills tests. To then have an automated hiring platform potentially screen out your resume before it ever reaches a human at the company: what a waste. I'm sure it can be a lot of work for companies to sort through applications, but that's what HR departments are for. Give applicants 30 seconds of your human attention before rejecting them; we at least owe each other that.
Bobby from Jersey (North Jersey)
I'll bet the ranch that even with automated hiring, a good share of slackers and other hires the boss can't stand will get through. Then the boss will fight with the big boss who implemented the automated hiring. And the big boss will be reluctant to call it a failure, because the bigger boss will tell him the company's stock price will go south.
Jay (Pa)
The book Machine, Platform, Crowd, by Andrew McAfee and Erik Brynjolfsson, is an example of the assertion that AI is virtually flawless in the hiring process. This article should do some good in telling such authors, and the HR managers who use these tools, that their creature is seriously flawed and incapable of compassion, which is just one element of common sense.
Viv (.)
@Jay The flaw is by design, and very often exists to cover the illegal reason they want to disqualify somebody. That's why there's a plethora of “personality assessment” companies selling their wares to businesses under the banner of finding the “best” employees. Instead of just saying they won't hire somebody of a certain age, gender or socioeconomic status, the “personality assessment” is built to screen those people out using variables that correlate almost perfectly with those characteristics. Sorry, you didn't score a blue-yellow, so you're not right for this position. Sorry, ENFJ (from Myers-Briggs) isn't right for a promotion, according to our values right now. It's psychobabble designed to trick people into believing there are legitimate reasons they're not getting a job or a promotion, when in fact it's just plain old office politics and discrimination at work.
Jean Sims (St Louis)
When computers were fairly new, I heard programmers refer to GIGO: garbage in, garbage out, meaning the output is only as good as the program handling the input. Big data is the new trend, but the same metric still applies. AI isn't brilliant, it's mechanical. No AI system has yet made the leap to the kind of intuitive thinking humans do, which often involves linking two seemingly unrelated ideas to create a new opportunity.
Azathoth (R’leyh)
@Jean Sims Under GIGO, the output is only as good as the input, not as good as the program. The program can be perfect but if the input data are garbage then, well, garbage out.
Emily (NJ)
The current systems also enable the hiring of workers from other countries who will work for substandard wages. Employers can now easily document their inability to find qualified workers and so have no other choice but to import employees. Automated systems are also the reason the economy is in such terrible shape, as judged by the multiple low-paying jobs a person has to hold just to keep their head above water. And it's precisely because of automation that 800 applications for a single opening can be made in the first place. Human Resources worked just fine back in the '80s and '90s. This experiment in automation should be judged for the abject failure that it is and abandoned. While human resource departments weren't perfect, they were able to actually get people into jobs where they could earn a decent living, contribute to the health of the economy, and actually enjoy living with relatively little stress.
Viv (.)
@Emily Human Resources stopped being about people when they abandoned the name "Personnel". Human Resources makes workers sound no different than office equipment, except now even office equipment is treated better.
James (WA)
@Emily I agree completely with your comment. Especially this part: "And it's precisely because of automation that 800 applications for a single opening can be made in the first place." Spot on! Honestly, I'd rather go back to paper and snail-mail applications. I'd rather each application take time and cost a little money; that way we have to decide whether each application is worth the cost to us. Applying for a job shouldn't just require pushing a button; it should require legitimate effort and interest in the position. I find it odd that we create all this extra work and headache with automated systems: we introduce automated hiring, which is biased, so then we add on top of that safeguards against racism and ageism, and on and on. Why not just go back to people hiring people? Not to mention get rid of social media and a lot of other technological garbage. The problem isn't quite technology, it's the foolish ways people abuse technology. But in some important ways, such as the job market and social interactions, we were much better off in the 1980s and 1990s, before things like the internet.
Jeff Bowles (San Francisco, California)
Screening out candidates based on graduation dates or old jobs is simply the efficient version of seeing someone with white hair and a beard and thinking, he cannot possibly be a programmer. The difference is that an in-person interview delivers the insult in real time. I interviewed at one Silicon Valley company after another, at age 55. It wasn't the fast path to a job that it had been 20 years before. One company had me interview with a number of technical people, and it seemed to go fairly well. Then the manager met me. I say that because he looked at my hair and beard and gasped aloud, and I was out of the building within five minutes. Companies don't do management training any more, so he had no clue how offensive (and illegal) that was. We have MBA programs to train children to be executives, but that's not management training, it's executive training. (And not very good.)
Smilodon (Missouri)
If you are over 40 you are invisible to tech companies. Wish I had known that before I went back to school to get a CS degree.
Ivy (CA)
@Jeff Bowles I have told some people, "I was using computers well before you were born" and they flat out refuse to believe me. My Mother was too!
4AverageJoe (USA, flyover)
The best and most stable place to work in my particular profession requires submission of a resume, references, and cover letter, including for internal candidates. It is well known that they do a word search of the documents, and if enough buzzwords are in there, you get an interview.
Catherine Maddux (Virginia)
There's a phrase that encapsulates this phenomenon: "Dirty data in, dirty data out." How about propagating the idea that we need "community" and more face-to-face interaction, given the obvious downsides of the digital era?
Bluestar (Arizona)
@Catherine Maddux Indeed!!
Dana Broach (Norman, OK)
"Discrimination per se" essentially says "I didn't get the job, therefore you discriminated against me illegally, now prove that you didn't." This seems to me pernicious and would likely lead to widespread abuse.
Viv (.)
@Dana Broach And the system isn't abused now? There's no reason that hiring, especially at big public companies, shouldn't have a transparent selection process. That's especially important given that most job descriptions are grossly inflated, both in the duties you're actually required to do and in the education and experience supposedly required to do them.
James (WA)
@Dana Broach You have a bit of a point. But there is also discrimination in hiring that no one can really prove. And more importantly, I'd rather turn to lawyers, and even abuse of the system, if it means disincentivizing bad technology.
Smilodon (Missouri)
As opposed to before, where no abuse whatsoever went on?
Colleen (Orlando)
This just happened to my niece with Amazon. A self-made woman who paid for her undergraduate and master's degrees and owns her own home. She is a go-getter with good Midwestern values and work ethic. They declined her overnight. They lost a diamond in the rough. Too bad.
Gary Madine (Bethlehem, PA)
@Colleen Too bad for Amazon, but also too bad for your niece. There must be a lot of go-getters out there too discouraged to even try anymore (and thus not counted in the fantastic unemployment numbers), or mopping floors in the local high school so they have medical insurance. That's too bad for employers everywhere, who wind up with only nominal employees on their rolls.
Viv (.)
@Colleen Consider it a blessing in disguise. I have yet to meet anyone who worked at Amazon (desk jobs, educated people) who didn't look like they had aged 15 years after working there for only 5. Most people bail as soon as they have enough money to take at least 6 months off, recover mentally, and find a non-toxic work environment.
John griffin (Brooklyn)
Of course the lawyer thinks lawsuits are the solution to the problem. Why do progressives think money and lawsuits are the solution to every problem?
Jean (Cleary)
@John griffin It is not progressives. This is a national problem. No politics involved. Bias, yes; politics, no.
Joe (Nyc)
@John griffin Yeah, you're right. Corporations are so open to rational arguments, particularly those that might lower their profits, right? Corporations are such responsible, fair-minded entities. I don't know how anyone could ever conclude that a lawsuit is an effective way to deal with a corporation. We'd probably still have a nice dirty environment, seatbelt-less cars, faulty medical devices, etc., if people had just gone nicely to the bosses and raised their concerns politely. That always works with corporations. They are fine, upstanding citizens. Good lord.
Mark (PDX)
@John griffin Why does Donald Trump think a lawsuit is the solution to every problem?
Chip (Wheelwell, Indiana)
Boy, are you about 10 years too late. If your resume has a graduation year that announces you're old, the employer just sorts it right out. No human ever looks at it. Why pay someone a lot of money for their experience? You can't bully them very well; you want someone young, cheap, inexperienced and desperate, so you can blame anything that goes wrong on them and work them to death.
clear thinker (New Orleans)
I've been seeking work for TEN YEARS, in-field and out, full-time and part-time, in-state and out of state. Five years ago I even relocated to another state. I have been unable to land a single job as a degreed professional. Until very recently, my resumes and custom cover letters were just disappearing into a digital black hole. No calls; no requests for interviews. It's probably by design. With "record-low unemployment," someone willing to work, and trying to work, is now destitute and bankrupt. Not very American.
J P (Grand Rapids)
Where was this article in 2005 when it might have made a difference?
David (El Dorado, California)
Progressive Utopia sure has a lot of rules!
Jean (Cleary)
@David What is progressive about this article? Bias has always been present in the hiring process. The problem with digital screening is that you are cut out of the opportunity for a face-to-face interview before you ever get to the interview stage. That is bias of the worst kind.
Louis (Denver, CO)
As the saying goes, "garbage in, garbage out," and, like anything else, automated hiring systems are only as good as the assumptions built into them. There are a lot of bad assumptions in many of these systems and platforms: personality tests with questionable predictive value, and assessment tests that have very little to do with the job the applicant is applying for. On one hand, there are dozens, sometimes hundreds, of people applying for a single opening; on the other hand, there is (or is at least alleged to be) a shortage of workers. Something is broken here.
Smilodon (Missouri)
There isn’t a shortage of workers. There’s a shortage of workers who will work for starvation wages in toxic work environments.
Mickela (NYC)
@Smilodon You are so right.
Auntie Mame (NYC)
Hmmm... There seems to be less and less of everything except people. Thinking about the recent college admissions scandals (and then the expressed regrets that fewer Chinese students would be coming to pay full tuition!). Now fewer sought-after jobs, or jobs in general. In my nabe I look at the unrented storefronts, national eatery chains replacing local delis and sandwich shops, and whatever new restaurants with high prices. But let's expand and think of all those special visas, e.g. for Asian software developers or Filipino nurses. (I wonder how these are treated by AI hiring.) I'm also thinking of those HR people being replaced by AI platforms. In the end one can conclude that AI is as good or bad as we are. And the world of the future is a very different place.
Red Ree (San Francisco CA)
The last time my company put out a job ad there were 800 responses. And while it's true there's a lot of discrimination in online hiring, there are also a lot of hugely unqualified applicants cluttering up the pipeline. There's no way for teams to evaluate all of this, so we hire professional recruiters of varying ability. Automated screening definitely screens out a lot of talent, which harms the company as well as the applicants. The other downside of hiring nowadays is that even if we already had a candidate in mind, someone we had worked with previously or who had other legitimate experience, we can't hire them without posting a job ad and wasting a lot of other applicants' time, and our own. Recruiting pipelines themselves narrow diversity because the job pools are already filtered. To really reach everyone you'd have to start outreach almost down to the grade-school level.
Mons (E)
@Red Ree I believe it, because that's what ruined the dating apps for me: an overwhelming response from people who should have known there was a 0% chance. I realized it's because it's online and not in person; they figure that, worst case, they get ignored, so there's no risk. I think the same applies to jobs. If we relied more on in-person interactions, those who were unqualified would weed themselves out early and stop clogging the pipeline.
hen3ry (Westchester, NY)
AI is built to work the same way HR and employers think. Therefore AI interviews and screening will be biased against older applicants, women, anyone with any sort of mental health issue, or anyone who doesn't answer the way the algorithms expect them to. I say this as someone who, back in the 1980s, got a job by walking into a university, meeting a potential supervisor and getting hired. Today I can't walk into any place of employment, not even a grocery store, and fill out an application, much less ask about jobs. As for networking, unless I'm at the executive level it doesn't work; people are too concerned about their own jobs to recommend me for jobs at their place of work. All of this is an employer's dream. They have all the employees working themselves to death to keep jobs that are not terribly rewarding. In the last 6 years I've had precisely three decent interviews. By that I mean interviews where the questions related to the job and the required skills, and weren't attempts to get at my age, my health status, my marital status, or the year I graduated from high school or college. Getting a job is no longer about qualifications: it's about gaming the screening process and then getting the interview. That's not a good way to hire the best person for the job.
Cass (Missoula)
When deciding whether there's racial or gender discrimination in hiring, I go back to the airline-pilot test. Should the criteria currently being used to diversify a given workplace also be used by airlines when hiring a pilot? Or by a hospital when hiring a brain surgeon? If the answer is no, the lack of diversity may still be problematic and may need to be addressed. But the issue has nothing to do with discrimination.
Billy Baynew (.)
Cass, most jobs do not require the extremely specialized skills of a pilot or a surgeon. These automated systems are used by drones to hire drones.
PC (Aurora, Colorado)
It doesn't matter if A.I. hiring is misguided; corporate America will embrace it because it's cheaper. Only after a few deaths caused by errors in hiring will corporate America pause. And even then, they will not admit their folly but will double down and tighten up the algorithms.
Rockets (Austin)
Good luck getting a job if you're a senior. It's near impossible now, and it will be worse when bots are doing the selecting. Right now you get weeded out if you have an AOL or Yahoo email address, and there's enough info on the internet to determine one's age quite easily. Bottom line... most work will be gig work anyway. Provide your own car, computer, printer, supplies... etc. I feel sorry for anyone 40 or younger. Real jobs are becoming scarce.
clear thinker (New Orleans)
And WHO, really, thinks 40 to 50 years is old?? Children.
Wood Gal (Minnesota)
@Rockets, years ago I used my MacBook laptop to apply for positions, until I found out that employers at the time would disregard applicants using older Apple devices. The same thing happened when I used the computers at the local library or our local employment offices. You don't have to be older or belong to any other category; you're done applying the moment you log on.
David (Omaha)
An EU privacy law (the GDPR) largely prohibits this sort of thing. The GDPR says that an individual "shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her." So for things like hiring decisions the GDPR tries to ensure that there'll be a human element instead of just an algorithmic determination. Maybe one day we'll have something like that here, too.
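By way of illustration only, one way an engineering team might build in that human element is to let the automated screen recommend but never finalize a rejection on its own. The sketch below is hypothetical; the names are invented, and it is not a statement of what the GDPR specifically requires or of how any real platform works.

    # Hypothetical human-in-the-loop gate for automated screening decisions.
    from dataclasses import dataclass

    @dataclass
    class Screening:
        applicant_id: str
        model_score: float          # produced by an automated screen
        reviewed_by_human: bool = False
        final_decision: str = "pending"

    def decide(screening: Screening, threshold: float = 0.5) -> Screening:
        """The model may recommend, but only a human review can finalize a rejection."""
        recommendation = "advance" if screening.model_score >= threshold else "reject"
        if recommendation == "reject" and not screening.reviewed_by_human:
            screening.final_decision = "needs_human_review"   # queue for a person
        else:
            screening.final_decision = recommendation
        return screening

    print(decide(Screening("A-123", model_score=0.31)).final_decision)  # needs_human_review

The design point is simply that the algorithm's low score routes the file to a person instead of producing the rejection itself.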
Seethegrey (Montana)
Although I agree with the eye-roll at personality tests for a first job stocking shelves at PetSmart, this article does not address two downstream scenarios: (1) the likelihood that anyone not hired for a job they really want (or assume they deserve) will assume the failure was due to nefarious motives (I'm sure psychologists have much to say about the balance among self-confidence, self-preservation, paranoia and applied experiential microaggression); and (2) that forcing the employer to justify every hiring decision based on 'acceptable' (to whom?), objective, provable criteria actually cuts against the idea of a 'gut instinct,' a 'sympathetic urge,' or even a 'political/social activist choice' to give someone a try (hiring felons? introverts? your neighbors' teen?).
Laume (Chicago)
I thought many employers already wrote up justifications for hires and turned them in to HR?
bonku (Madison)
Has anyone checked how the widespread culture of reference and recommendation letters, mainly in the higher-education and research sectors, affects the quality of the employees and faculty those universities are getting?
pernel (Princeton NJ)
"Unions can help to ensure that automated hiring platforms are fair." Unions? What unions?
Lil_Cicero (Rome, NY)
@pernel The unions that the Baby Boomers and Gen-Xers let die.
Smilodon (Missouri)
They were already pretty much dead by the time Gen X came of age.
A (Detroit)
@pernel The ones that 16.4 million Americans are members of
betty durso (philly area)
AI is purposely taking our personal decisions out of our hands, and the managers in charge are buying into it. Why? The age-old goals of money and power. We are being treated as a herd of cattle, our identities being branded, bought and sold. I used to wonder when I heard someone's identity had been stolen, thinking no one can steal a person's identity--it's personal. But it has become commonplace and it has nothing to do with the individual. We are being sliced and diced into salable categories according to wealth and our preference in politics and religion and the color of our skin. It has gone too far. Too much power has been aggregated into few hands. We must restore and protect our humanity.
Lil_Cicero (Rome, NY)
Welcome to Hellworld. In the past, it was rare for employers to actually read your resume. In the future, employers will never need to read your resume, because they'll already know everything Facebook and Google know about you. It doesn't get any better once you get hired. The algorithm already knows the bare-minimum level of pay for a worker with your skills, and you will only ever be paid just enough to keep you from going elsewhere. If you have problems with the working conditions, tough luck: your boss is an algorithm now too!
David (Minnesota)
People often believe that artificial intelligence will be free of human biases because it isn't human. This is a dangerous myth. AI is "trained" by feeding it vast troves of past decisions and using machine learning to identify patterns. If past decisions were biased, AI will codify that bias into future decisions. But because of the myth of impartiality, these biased decisions won't be examined, and the bias will fly under the radar. Even if they were examined, the bias would be hard to spot. Another myth is that AI "thinks" like humans, only faster. This is simply not true. Humans think rationally, with conclusions following from (sometimes faulty) premises. AI uses complex mathematical equations to find patterns in the data. AI is a "black box," so humans can't judge its logic. Bias can only be detected through careful statistical analysis of the outcomes, and only if you know what to look for. But people resist doing all of that work if they already believe the first myth: that machines are inherently unbiased.
Jim (Chicago)
@David That's not how these systems work. They are trained on performance evaluations, not on past hiring decisions.
Stan Sutton (Westchester County, NY)
@Jim: I believe that you and @David are both correct. AI systems may be trained on performance evaluations, but the populations being evaluated are constrained by past hiring decisions. There is room for bias both in how the training data are selected and in how those data are evaluated.
David (Minnesota)
@Stan Sutton Agreed. Unfortunately, companies can't evaluate the performance of underrepresented groups that they never hired. Amazon, a world leader in AI, tried to develop AI to hire engineers using their current workforce as the training data. They got almost entirely white men in their 20s and 30s because that's who worked for them. To their credit, they cancelled the project. Other companies are unlikely to check the results because they lack the in-house expertise.
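For readers curious how that happens mechanically, here is a toy sketch of the dynamic described in this thread, using invented data and assuming scikit-learn is available. It only illustrates the general point: a model trained to reproduce biased past hiring decisions ends up penalizing older applicants through a proxy such as graduation year, even though age is never mentioned in the code. It is not the code of any real hiring platform.

    # Toy illustration (made-up data): a model trained on biased past hiring
    # decisions learns to repeat the bias, even though "age" never appears.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 5000
    skill = rng.uniform(0, 1, n)              # what we claim to hire for
    grad_year = rng.integers(1975, 2016, n)   # a proxy for age

    # Historical decisions: skilled applicants were hired, but almost never
    # if they graduated before 1990 (the human screener's bias).
    hired = (skill > 0.6) & ((grad_year >= 1990) | (rng.uniform(0, 1, n) < 0.05))

    X = np.column_stack([skill, grad_year])
    model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, hired)

    # Two equally skilled applicants, 25 years apart in graduation date:
    print(model.predict_proba([[0.9, 2010], [0.9, 1985]])[:, 1])
    # The second (older) applicant gets a noticeably lower predicted "hire" score.

Nothing in the training step mentions age; the bias rides in on the labels and on the proxy feature, which is exactly the point made above.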
SAO (Maine)
I'm studying data science and how to create models that predict the best hire. The models predict the decisions that were made in the sample population. Thus, if the hiring process was biased, the algorithms repeat that bias. If Joe in HR does a first pass on every resume that comes in, then the dataset given to the computer will contain Joe's biases and will bake them into the supposedly neutral algorithm. The company will think the computer is predicting the best hires, but all it is really doing is predicting Joe in HR's hires. Sophisticated modeling needs good data, but it mostly gets the data there is.
Jim (Chicago)
@SAO Only a poor, university-classroom-level model would be so badly designed as to use past hiring decisions instead of employee performance.
Eben Spinoza (San Francisco)
@Jim In some fields, employee performance can be measured objectively, e.g., how many calls per hour a call-center support rep handled that customers rated as superior. But for less soul-crushing jobs, ones that don't reduce people to disposable objects, the definition of performance is quite elastic and often maps to how well the employee gets along with the boss. As one rather extreme example, imagine how job performance is evaluated in the Trump Organization.
Smilodon (Missouri)
But if the pool of employees is already limited by bias, you are then baking that bias into the system by using only performance data from people you’ve already hired.
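One concrete form of the "careful statistical analysis of the outcomes" mentioned above is the four-fifths (80 percent) rule of thumb from the federal Uniform Guidelines on Employee Selection Procedures: if any group's selection rate is less than 80 percent of the highest group's rate, the screen is flagged for possible adverse impact. A minimal sketch of that check, with invented numbers:

    # Minimal adverse-impact check using the four-fifths (80%) rule of thumb.
    # The applicant counts below are invented, for illustration only.

    def selection_rates(outcomes):
        """outcomes: {group: (selected, applied)} -> {group: selection rate}"""
        return {g: selected / applied for g, (selected, applied) in outcomes.items()}

    def four_fifths_check(outcomes, ratio=0.8):
        rates = selection_rates(outcomes)
        best = max(rates.values())
        return {g: {"rate": round(r, 3), "flagged": r < ratio * best}
                for g, r in rates.items()}

    outcomes = {
        "under_40": (120, 400),   # 30% of younger applicants advanced
        "over_40":  (30, 300),    # 10% of older applicants advanced
    }
    print(four_fifths_check(outcomes))
    # over_40's rate (0.10) is only a third of under_40's (0.30), well below
    # the 80% threshold, so this screen would be flagged for review.

The same check can be run for any grouping (age band, gender, zip code) for which outcome data exist; as the thread above notes, the catch is that you can only audit the people who made it into the data.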
M (Nyc)
Good article! Usually using a skilled recruiter is the most efficient way to go (for both employers and job seekers)
elzbietaj (Chicago)
As an older job hunter in Chicago, I applauded the decision to eliminate the requirement for a college graduation date. A social service agency encouraged me to forward the names of companies that continue to require this information so they can be reported. Many foreign-based companies still include the asterisked (required) graduation date on their online applications. Here's another roadblock: companies that use MBTI-inspired forms to weed out candidates. How scientific or fair is that? You might as well pick prospective employees by their astrological signs. When I come across those applications, I bail out.
Gary Madine (Bethlehem, PA)
@elzbietaj A nearby company specializing in highly technical analog electronic circuits posts jobs that are clearly not entry level; 10 years of successful product-design experience would be required to meet the desires stated in the posting. Yet that company wants to know every applicant's university GPA. Once I got past the AI recruitment tool and got their attention with the exact match between my experience and their stated requirements, the lone HR person at that company insisted on college transcripts. It seems that company has experience in ways to disqualify older candidates without legal or social blame (I don't know whether requiring year of graduation is prohibited in PA). Earlier in my job search, a human recruiter for a different company had kindly tipped me off that I ought to scrub graduation years out of my resume.
A (Detroit)
@elzbietaj It is illegal to weed out applicants based on MBTI type; if you suspect this is happening, please report it.
elzbietaj (Chicago)
@Gary Madine After I turned 40, I never disclosed my grad date either. I was referring to those opportunities I had to skip because the application required it.
J. Waddell (Columbus, OH)
Is automated hiring really worse than leaving decisions up to individuals and their biases? There was a time when getting a mortgage required that you know your banker - a clear problem for minorities and women. Then credit scoring replaced that system. Whether it is credit scoring or automated hiring, that system is better than using subjective, individual decision making as long as the criteria/algorithms are statistically valid.
Max (Moscow, Idaho)
@J. Waddell Yes, it has turned out to be worse. Rather than allowing people to correct former biases in hiring that favored men, whites and higher social classes, the hiring algorithms perpetuate those biases. They do so by using decades of previous hiring data, data that favored men, white people and the upper class, to build the predictive models. These models are not magic. They can only use existing data to "learn" and predict.
Donna Gray (Louisa, Va)
Is this article another push for quotas? The goal should be an impartial, fair test of qualifications. I have always liked the idea of musicians auditioning for an orchestra position from behind a curtain. That way their race, gender and age are unknown and their talent can shine.
tigrr lady (vancouver)
@Donna Gray And you do know what happened when orchestras started auditioning behind a blind curtain? A lot of women welcome this and say it produces more diversity. I have friends in the classical world, and some say that even behind a curtain it's not 100% “blind,” because many times it's easy to know exactly who is auditioning. This is quite different from AI, where what is fed into the computers is not “objective” in the way we would like, because it is ordinary humans, with all our biases, who feed the data into the computer.
Donna Gray (Louisa, Va)
@tigrr lady - It is great that more women 'win' orchestra positions. And why should we care if the result is a mostly female orchestra? It would not be great, though, if less qualified men were given those jobs because of a numerical quota. And the goal of the AI tests is to eliminate bias. They can be improved, of course!
reader (Chicago, IL)
@Donna Gray That's the stated goal, but it does not work that way in reality. In fact, it's not clear that the people who design these systems for companies even want them to work that way. To say that they eliminate bias is just a sales pitch; what they really do is make bias more efficient and depersonalized.
PhillyPerson (Philadelphia)
We could add that a knowledgeable candidate can game the application, especially the personality tests. Tests like Myers-Briggs have no scientific basis yet are used for all kinds of decisions. The person who was rejected based on a personality test seems naive. If you're having dark thoughts or emotional issues, by all means don't bring them up on a job application!
Ivy (CA)
@PhillyPerson I took a very long “personality test” for the U.S. Postal Service, and it was a huge effort to moderate myself yet remain consistent. I wonder how people who later “went postal” did on that test. And was it put in place BECAUSE of those occurrences? An inquiring mind is an unhired mind.
Laume (Chicago)
Myers-Briggs doesn't ask about “dark thoughts” or “emotional issues.” It measures preferences such as introversion versus extraversion, sensing versus intuition, thinking versus feeling, and judging versus perceiving. Even so, it's still not really an evidence-based test.
Jim Muncy (Florida)
Sheesh! Could we make modern life more complicated! Now applying for and getting a job requires moving heaven and earth: court litigation, new laws, mandates, federal oversight, involved and prolonged heated confrontations, weeping and gnashing of teeth. I guess I'm out of patience and out of sync with modernity. In my experience, peace of mind is found in the simple life. Good luck, eh?
Max (Moscow, Idaho)
@Jim Muncy You make it sound like the problem is employees being overly demanding. Also, who on earth is weeping or gnashing their teeth in this article? And what on earth do you mean by "the simple life"? For goodness' sake, this is a well-constructed argument explaining the pervasiveness of illegal hiring practices when automated software is used for hiring decisions. The problem is that these software systems unconsciously carry the biases of their writers (software is written by humans) or the biases of the training data used to build the predictive models. We all just want jobs, to support ourselves and our families and to give our lives meaning.
Jim Muncy (Florida)
@Max No, I'm fully on the workers' side. Employers have the power to toy with them and make even applying for a job a nightmare. Why not check on a few background things, go with your intuition, and hire them? Most jobs have a 90-day trial employment period. If they work out, great; if not, let them go and try again. If they interest you, give them a chance. Try it before you buy it. Is it that hard? Need both sides retain lawyers for this common transaction?
Kathleen L. (Los Angeles)
Agree. This article is like so many others: assumes that the point of view that matters is the employer’s. I say the real problem is in trying to find an algorithm to replace the tedious and costly process of human interaction. The process of online job applications is a large part of the problem, because a single entry-level job may generate a thousand applications. It leads the employer to a sense of entitlement, that a perfectly qualified person might not be good enough for this thousand-to-one shot. It leads the applicants to a sense of hopelessness and despondency, particularly when hundreds of applications are completed, each one painfully scrutinized, and never a single human response. It’s cruel and unfair, and does nothing to open doors to traditionally marginalized people.