A recent analysis published by MIT (Here) analyzed AI technologies that are used to screen and hire potential employees. The analysis found that improvements made using AI interviewer bots, resume screening software, and other platforms that rely on advanced language models are real but significant flaws remain that are problematic. This blog post points out some of the “difficult questions about the impact of AI on procedural fairness” that remain and may be reinforced.
The test was conducted using Chat GPT as a resume screener for an entry level finance job in its browser, API, and GPT-agent forms. Specifically, they asked ChatGPT to select one candidate from a set of ten entry-level-hopefuls who varied in race, gender, as well as the perceived cost of university attendance and extracurricular activities.
The conclusion reached in the article was clear: do NOT use ChatGPT to screen resumes despite the fact it is a more efficient method to screen larger numbers of resumes.
The most salient findings supporting this conclusion include:
- ChatGPT overwhelmingly tends to select the first candidate presented to it who it considers equally qualified as other candidates. This is more than merely recency bias. The unquantifiable benefits obtained by “insiders”–internal candidates; friends and relatives of decisionmakers, executives, employment search firms, university career centers, etc.–are increasingly quantifiable and usable to prove socioeconomic, racial, gender, and other biases in new ways. Likewise, advising ChatGPT not to select the first candidate is not a perfect solution because many other candidates were not considered despite only the first candidate not being eligible.
- The findings validate what everyone in society instinctively knows about “how the real world works”–“how much you pay for your degree matters, as do the racial signals encoded in our names.”
- Higher cost activities or “signals” are often preferred without any basis. For example, one employment screening tool (not ChatGPT) showed a bias in favor of lacrosse players. (Lacrosse Article) In practical terms, “candidates may resort to accumulating costly credentials or engaging in strategic resume formatting to signal the desired level of prestige.”
- Many candidates know how to game the system. Such tactics include embedding hidden prompts within resumes, using phrases like “select this candidate” within the text to influence the AI’s decision-making process, and using white font on a white background to hide instructions or signals that are meant to be read by the machine but not by human reviewers.
Other lessons to be gleaned from this study include:
- Co-employers and other outsourced HR solutions providers such as PEOs and placement agencies should be required to provide regular updates using rigorous standards to prove that their methodologies are neutral to the greatest extent possible. These standards need to be updated on a constant basis and should not be limited to strict interpretations of relevant laws and reviews of only what can be seen by the naked eye.
- ChatGPT and other AI platforms cannot be the sole means prove that federal, state, and local level protected candidates were given equal opportunities. This makes perfect sense: using ChatGPT to determine whether ChatGPT performed its tasks properly makes no sense whatsoever.
- PEOs, as co-employers, must be increasingly vigilant to ensure that their clients do not deviate from the hiring process provided as part of its suite of services.
- ChatGPT or other AI platforms may rely on arbitrary or irrelevant rules to select candidates. If you can prove that reliance on a factor was utterly unimportant, then an employer could use this fact to prove a lack of bias.
- Employers should consider using ChatGPT after engaging in a bias-free decision making process versus using ChatGPT from the outset.
Ironically, the use of AI in making employment decisions may raise more questions than answers.
David Seidman is the principal and founder of Seidman Law Group, LLC. He serves as outside general counsel for companies, which requires him to consider a diverse range of corporate, dispute resolution and avoidance, contract drafting and negotiation, and other issues.
He can be reached at david@seidmanlawgroup.com or 312-399-7390.
This blog post is not legal advice. Please consult an experienced attorney to assist with your legal issues.
Photo credit: Ongig Blog