When AI Discriminates, Who’s to Blame?
When AI misbehaves, who is to blame? Is it the vendor who created the tech or the employer who uses it to hire?
It’s a question that’s been top of mind in recent years as more and more companies enlist artificial intelligence to process massive amounts of applicant data quickly and efficiently. In the court of public opinion, employers and vendors volley responsibility back and forth: some employers claim the “black box” nature of AI makes it difficult to assess the products they purchase, while vendors argue the onus is on employers to incorporate AI properly into the hiring process.
In actual courts of law, the question of responsibility has appeared more clear-cut.
Over the past two years, the Equal Employment Opportunity Commission released guidance on AI’s use under the ADA, sued iTutorGroup for its alleged improper screening of candidates based on age, and, most recently, incorporated AI governance into its draft strategic enforcement plan for 2023-2027. In each instance, the agency clarified that the employer bears most — if not all — of the liability for discriminatory AI.
Ahead of a Jan. 31 hearing on the draft strategic enforcement plan, Jiahao Chen, owner of consultancy Responsible Artificial Intelligence LLC, which provides compliance and risk assessment for the use of AI in hiring, expressed frustration at this, arguing that more should be done to hold vendors legally accountable.
“At present, it is unclear if these vendors have any anti-discrimination compliance obligations,” Chen wrote in public comments.
Are HR Tech Vendors 'Employment Agencies'?
Less than one month later, Derek Mobley, a 40-year-old Black worker who suffers from anxiety, filed a first-of-its-kind lawsuit in the Northern District of California against HR tech company Workday, challenging the assumption that vendors are off the hook for problematic tech.
Mobley alleged that the use of AI (presumably a resume screener) resulted in his being passed over for 80 to 100 jobs on the basis of race, age and disability — despite his holding a bachelor’s degree in finance and an associate’s degree in network systems administration. Workday’s administration of the screening products, he argued, constituted “a pattern and practice of discrimination.”
The case was similar to the claim filed against iTutorGroup — but with one huge catch. Mobley wasn’t suing any of the individual employers that rejected his application: He was going after the company that he said improperly screened him out.
Workday representatives said in an emailed statement to Reworked they believe the lawsuit is “without merit.”
“At Workday, we’re committed to responsible AI. Our decisions are guided by our AI ethics principles, which include amplifying human potential, positively impacting society, and championing transparency and fairness,” the representative said.
According to Pauline Kim, professor of law at Washington University in St. Louis, should Mobley's suit go forward, it will be a test case of a novel legal theory — that tech companies might be considered, in some cases, to be “employment agencies” that can be liable for employment discrimination claims under Title VII of the Civil Rights Act of 1964.
Should the theory hold up, our perceptions of liability in this nascent space could be turned on their head — upping the stakes for vendors who have so far been hesitant to publicly disclose information about their algorithms.
Related Article: Is Your Recruiting Algorithm Biased?
When I got on the phone with Joe O’Keefe, labor and employment partner at Proskauer Rose, he started with “the obvious” facts at hand.
“What we're trying to deal with here is a statute and regulations that were … not designed to meet the current problems,” O’Keefe said.
As the name suggests, the Civil Rights Act of 1964 was written prior to the proliferation of HR tech and AI. Though in 2022, 92% of companies that used AI worked with a vendor to procure it, the laws that govern employment were drafted before hiring became dependent on an ecosystem of connected actors. Legally, said Kate Bischoff, HR consultant and Minnesota-based attorney, we haven’t caught up: Plaintiffs can only sue “employers” or “employment agencies.”
Employment agencies are defined as “any person regularly undertaking with or without compensation to procure employees for an employer or to procure for employees opportunities to work for an employer.” Kim, who testified before the EEOC on the subject of tech companies as employment agencies in January of this year, said that there isn’t much case law to clarify whether intermediaries might be considered an employment agency, as Mobley claims, because the question hasn't come up until recently.
“I think for a long time, everybody knew what an employment agency was, right?” Kim said. “And now we're in a world where we have these new platforms … that are kind of a little ambiguous. ... And so I think that's really an untested question.”
Still, Kim thinks the limited case law suggests some vendors will qualify as employment agencies. (She declined to comment on Mobley’s exact complaint, given that she’s unfamiliar with Workday’s platform.)
“[Job platforms] self-described function is to find 'top talent' to meet their clients’ staffing needs. By definition, they are 'procuring employees' for an employer on a regular basis,” Kim wrote in her 2019 paper, “Manipulating Opportunity.”
The EEOC appears to be interested in this theory of liability. During the January hearing, commissioner Andrea Lucas engaged in a several-minute back-and-forth with Kim about the textual basis for considering vendors employment agencies under Title VII.
When asked directly whether tech vendors might be considered employment agencies under federal law, a spokesperson for the EEOC said by email that decisions were “fact-specific and … done on a case-by-case basis,” but “if the pre-screening or any other actions performed by the tech vendor qualify as regular procurement of prospective applicants or employees for an employer, the tech vendor may be a covered employment agency.”
Whether the judge in Mobley’s case agrees remains to be seen. Kim stressed that “there are a lot of unknowns here” — both legally and around the tech itself.
Related Article: Artificial Intelligence in HR Remains a Work in Progress
A Wake-Up Call for Tech Companies and Employers
Though this case is pending — and likely will be for some time — Bischoff is hopeful that its existence is an “eye opener for tech companies.”
“For so long, they've been able to do what they're doing, thinking everything is going great, and then having no liability or no exposure,” Bischoff said, adding, “[this] will hopefully turn on the conscience and the desire to do better — to not put their clients or their customers at risk.”
Paradoxically, the complaint may also be a wake-up call for employers, who were already on the hook for AI to begin with and might not have needed another headache. Savanna Shuntich, attorney at Fortney Scott, told me that employers who haven’t thoroughly vetted their vendors could be served next. She worries that the real target in this case might not be Workday at all — it might be an attempt to obtain data on Workday’s clients.
Bischoff agreed, quipping: “If I'm [a] plaintiff's attorney, and I know that this piece of technology is discriminating, I just need to look at [Workday’s] website to fund the education of all my grandchildren for the future … I just need to copy and paste that complaint several times.”
The best way for companies using AI to defend themselves, Kim said, is to understand the tech they’re purchasing. She pointed to the transparency movement that began in New York City and the EU and is spreading to New Jersey and California as a positive step.
“A really important step in terms of being able to hold entities that are creating these [tools] responsible is to be able to have a better understanding of exactly how they work,” Kim said. “In other words, I think greater transparency is a really important piece of addressing the problem of algorithmic bias.”
About the Author
Susanna Vogel is a New York City-based freelance reporter who has been covering the workplace and labor beat since 2021. Her favorite stories explore technology, DE&I and workplace culture.