What You Should Know About Upcoming AI and Automated Employment Tool Rules
With a growing remote workforce, many companies now hire employees virtually, without meeting them in person. The HireVue 2021 Global Trends Report found that 54% of hiring leaders said virtual interviews resulted in a speedier recruitment process, and 41% said it helped them identify the best candidates.
The proliferation of more advanced HR tools has also helped make remote hiring a very attractive approach for employers and recruiters. AI-based tools are said to enable better sourcing and screening of candidates, as well as greater engagement and easier communications.
"Artificial intelligence (AI) and analytics have gained popularity among hiring teams as an effective way to accelerate the hiring of qualified candidates at scale," said Eric Sydell, EVP of innovation at HR software firm Modern Hire.
A 2022 Zippia study found 94% of hiring professionals who use an Applicant Tracking System (ATS) say the software has improved their hiring process, and 68% agree that the best way to continue improving recruiting performance is to invest in new technology.
The process is called automated hiring, and while it's gaining traction at a rapid pace, there are also serious considerations to keep in mind before jumping in.
With so much buzz surrounding automated hiring, it didn't take very long for legislators to examine the risks of using AI in the recruitment process. New York City, for instance, has passed a law that requires that AI and algorithm-based recruitment and HR technologies be audited for bias before being used.
A bias audit requires the hiring company to ensure that the tool provides an impartial evaluation by an independent auditor. At a minimum, they will need to test whether the tool can provide a balanced assessment of all candidates.
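As a rough illustration of the kind of check an auditor might run, the sketch below computes selection-rate "impact ratios" across demographic groups and flags any group below the four-fifths rule of thumb. This metric, the group names, and the counts are illustrative assumptions, not the audit procedure the law prescribes.

```python
def selection_rates(outcomes):
    """outcomes maps group -> (selected, total); returns group -> selection rate."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def impact_ratios(outcomes):
    """Each group's selection rate divided by the highest group's rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Illustrative numbers only, not real audit data.
sample = {"group_a": (40, 100), "group_b": (24, 100)}
ratios = impact_ratios(sample)

# Flag groups whose ratio falls below the four-fifths (0.8) threshold.
flagged = {g: r for g, r in ratios.items() if r < 0.8}
```

In this hypothetical sample, group_b's selection rate is 60% of group_a's, so it would be flagged for closer review.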
But there are other aspects to the law, said Angela Preston, associate general counsel, corporate ethics and compliance at New York-based background check provider Sterling. Regulations being passed in the New York area also prohibit the use of such tools unless candidates are notified, she said.
There are still many questions surrounding the new rule and how employers can ensure compliance. An article published in Harvard Business Review discusses the privacy risks associated with using AI in recruitment.
"While an employer may not violate any laws in merely discerning an applicant’s personal information, the company may become vulnerable to legal exposure if it makes adverse employment decisions by relying on any protected categories such as one’s place of birth, race, or native language — or based on private information that it does not have the right to consider, such as possible physical illness or mental ailment," the authors wrote.
UK job search firm Total Jobs analyzed 77,000 job postings on the company's website and found nearly 500,000 instances of gender bias. According to their findings, the average job posting had six male-coded or female-coded words.
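A screen for gender-coded language can be as simple as matching posting text against coded-word lists. The short word lists below are illustrative samples, not the lexicon Total Jobs used.

```python
import re

# Hypothetical sample lists; real lexicons contain hundreds of terms.
MALE_CODED = {"ambitious", "assertive", "competitive", "dominant"}
FEMALE_CODED = {"collaborative", "nurturing", "supportive", "loyal"}

def coded_words(posting: str):
    """Return the male-coded and female-coded words found in a posting."""
    words = re.findall(r"[a-z]+", posting.lower())
    return ([w for w in words if w in MALE_CODED],
            [w for w in words if w in FEMALE_CODED])

male, female = coded_words(
    "We want an ambitious, competitive self-starter who is also "
    "supportive of teammates."
)
```

Even this toy posting surfaces three coded terms, which is how an average of six per posting can accumulate unnoticed.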
While many such biases may be unintentional, it's no surprise that states are looking into regulations that can help minimize the risks associated with the automated hiring process.
Related Article: The Impact of AI on Privacy
So, What Should Companies Do?
The situation is still evolving, and the most important thing leaders, recruiters and HR teams can do is to stay informed and get expert counsel if needed. Changes are evidently coming, and compliance is likely to be scrutinized for some time.
More particularly, Preston said organizations that use AI and automated tools to help fill gaps in their workforce should keep an eye on the following items:
Note How Automated Employment Tools Are Defined
Laws are prescriptive, and definitions matter. Preston said it's critical for leaders to understand how terms like "automated employment tools" are defined to get a clearer sense of how the rules apply to their use cases.
Definitions also differ from state to state, and businesses that operate in more than one state may need to speak to legal advisors to make it through the maze of regulations. Some companies may prefer to adhere to the strictest law or the narrowest definition to help protect the company.
Keep up With Federal Guidance
Similarly, employers need to ensure they are familiar with federal guidance, Preston said. There is more to this than state-specific rules and regulations. Operating in a state that does not yet have such laws may still require employers to keep an eye on how the situation is evolving at the federal level.
Related Article: Why Responsible AI Should Be on the Agenda of Every Enterprise
Identify and Stress-Test Automated Hiring Systems
Preston recommends companies that use AI in their HR processes complete an audit of the hiring systems to identify possible automated screening tools using the definitions in the NYC law.
It is also wise to review the tool and seek out any potential bias that may have been introduced inadvertently. Preston advised: "Consider whether you should conduct bias testing and whether it is legally required, and document any testing that is done." Recording the bias tests helps protect the business should it ever need to prove that efforts were made to eliminate bias and comply with guidelines.
The same tests should also be conducted with an eye on privacy.
Consult With Legal Counsel
Businesses that use or plan to implement AI and automation tools in their HR processes should consult legal counsel whenever they are unsure about their compliance with the new rules.
Even for businesses operating in states where such rules have not yet been enacted or discussed, similar regulations are clearly coming, and it's best to be prepared.
Ensure Candidates Know About AI Use
If this isn't yet part of the process, Sydell said employers should ensure all candidates know early in the application process that their data may be put through AI or automation techniques. An even better option would be to also offer candidates the ability to refuse to have their data processed by AI, without risk of discrimination for having opted out.
A data audit should also be conducted to find out what information is required for the position and only collect the information essential for processing the application, Sydell said.
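One practical form such a data audit can take is an allow-list filter that drops any submitted field the role does not require. The field names below are hypothetical, not a prescribed schema.

```python
# Hypothetical set of fields deemed essential for processing an application.
REQUIRED_FIELDS = {"name", "email", "work_history", "skills"}

def minimize(application: dict) -> dict:
    """Keep only the fields essential to processing the application."""
    return {k: v for k, v in application.items() if k in REQUIRED_FIELDS}

submitted = {
    "name": "A. Candidate",
    "email": "a@example.com",
    "skills": ["SQL"],
    "date_of_birth": "1990-01-01",  # not essential, so dropped below
}
cleaned = minimize(submitted)
```

Filtering at intake means sensitive fields never reach the screening tools downstream, rather than relying on each tool to ignore them.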
Employers are turning to technology to assist in their hiring practices; it would be a shame if they used technological advancements to gather sensitive information for purposes other than those stipulated to candidates and employees.