Feature

Generative AI Writing Job Descriptions: Adult Supervision Required

By Lauren Dixon
Generative AI tools can help talent acquisition teams write job descriptions, but the results still need human review and edits.

Despite their shortcomings, generative AI tools like ChatGPT have the potential to change the landscape for business, mainly by automating labor-intensive — and, let’s be honest, usually boring — tasks.

For HR leaders, one such task is writing job descriptions. Or as Dr. Terri Horton, work futurist, HR consultant and advisor at FuturePath LLC, puts it, "Writing job descriptions is really a drag." And she’s not alone.

The task can be repetitive and self-serving. Recruiters are often asked to fit all of the ever-so-important company information into the description without making it so lengthy that it loses the job seeker.

But ChatGPT and other generative artificial intelligence tools can help refine that process and write job descriptions with a high level of intentionality that clearly communicate the employer brand and culture, Horton said.

With these benefits and the fact that ChatGPT is an easily accessible resource at no cost, it’s no surprise that talent leaders are ready to use the tools. According to Paychex’s 2023 Pulse of HR report, based on a survey of 1,350 companies, 76% of HR leaders at organizations with more than 20 employees said they plan to use AI within the next year, particularly for identifying potential candidates via AI-generated job ads.

When it comes to productivity and saving time, generative AI is “absolutely a powerfully remarkable tool,” Horton said.

But rather than relying on generative AI to do all of the writing of job descriptions, sources for this story suggest using it for a first draft, letting the technology pull in skills and capabilities, before a human takes the description to the finish line.
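For teams that want to try that draft-then-review workflow, here is a minimal sketch in Python of how it could be wired up. It assumes the OpenAI Python client and an API key in the environment; the model name, the prompt wording and the list of flagged terms are illustrative assumptions, not recommendations from the sources in this story. The only details it borrows from them are that wording like “strong” and “very competitive” deserves a second look, and that a person, not the model, signs off on the final copy.

```python
# Minimal sketch: generative AI drafts the job description, a human reviews it.
# Assumes the OpenAI Python client (pip install openai) and an OPENAI_API_KEY
# in the environment; the model name and flagged terms below are illustrative.
from openai import OpenAI

client = OpenAI()

# Wording that, per the sources in this story, can skew masculine or exclusionary;
# this short list is only an example, not a vetted screening tool.
FLAG_TERMS = ["strong", "very competitive", "rockstar", "ninja"]


def draft_job_description(title: str, skills: list[str]) -> str:
    """Ask the model for a first draft built around concrete skills and capabilities."""
    prompt = (
        f"Write a concise, inclusive job description for a {title}. "
        f"Focus on these skills and capabilities: {', '.join(skills)}. "
        "Avoid gendered or exclusionary language and do not require a specific university."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: any capable chat model would work here
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


def flag_for_review(draft: str) -> list[str]:
    """Surface terms a human reviewer should reconsider before publishing."""
    lowered = draft.lower()
    return [term for term in FLAG_TERMS if term in lowered]


if __name__ == "__main__":
    draft = draft_job_description(
        "data analyst", ["SQL", "dashboarding", "stakeholder communication"]
    )
    print(draft)
    print("Review these terms before publishing:", flag_for_review(draft))
```

The flagging step is deliberately crude; the point is simply that the model’s draft lands in front of a human reviewer before it ever lands in front of candidates.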

Bias Among the Bots 

Bias is a significant concern when using generative AI because these tools are built on large language models (LLMs), which generate new text from patterns in existing text.

“We know that bias exists in our world, and we cannot fully eliminate it, and we know it proliferates in online spaces,” Horton said. Because generative AI pulls so much from the internet, there’s an opportunity for bias to seep in, either overtly or subtly.

“The content generative AI creates is only as good as the LLMs that power it,” said Sarah Tilley, SVP of global talent acquisition and development at ServiceNow. 

Ideally, the data used to train the AI should include a wide range of examples from diverse demographics, cultures and perspectives, Tilley said, because bias can emerge when training data fails to reflect the full diversity of the real world. But when using publicly available AI, such as ChatGPT, the user has no control over the training data. Companies using these tools must therefore do their due diligence to understand how the tools might introduce bias into their results.

For talent acquisition teams looking to use generative AI for job descriptions, the best practice is to treat the technology’s output as exactly that: a draft. Carefully review and edit the AI’s output to ensure the content not only accurately represents the role, responsibilities and capabilities, but also that no bias or other inadvertent discrimination is built into the copy, Tilley said.

Horton said vigilance is necessary at this stage because biased language presents itself in job descriptions in blatant and covert ways, creating an opportunity for the employer to miss qualified candidates. This is the case whether or not generative AI is in use.

For example, language about the commitment and intensity of the job could appeal more to men than to women or to people with families. Descriptions that use wording such as “strong” or “very competitive” might also appeal more to men, she said.

Education is another area where bias persists. If the organization prefers candidates who come from top-tier institutions, talent acquisition leaders should consider the demographics of those institutions and how that can play out in the candidate pool, Horton said.

Similarly, if a company prefers to hire candidates from particular industries, functional areas or types of experience, it should consider the makeup of those groups. For example, a given industry might have a low percentage of women, so pulling from that industry is likely to perpetuate a homogenous workforce.

“All employers are required to adhere to employment laws as it relates to discrimination,” Horton said, whether using generative AI or not. Organizations are responsible for making sure the description does not violate regulations around race, age or religious bias.

Related Article: Is Your Recruiting Algorithm Biased?

Rules and Regulations 

National laws regarding the use of generative AI are in the works, but some local regulations already exist. In New York City, for instance, employers who use AI in their hiring processes must let candidates know that automation is in use, and companies must have an outside auditor check annually for bias.

“Creating guidelines and protocols around the usage of generative AI to protect the organization and to protect candidates becomes really important,” Horton said, adding that relevant use cases are helpful. 

Employers should also prioritize privacy and confidentiality for candidates, as well as mitigate any risk of exposing proprietary organizational information.

Given the speed at which corporations are adopting generative AI, regulations are evolving rapidly. It’s important to work closely with legal, compliance and technology teams to understand the restrictions and ensure generative AI practices comply with applicable laws and the latest principles of responsible AI development, Tilley said.


“Additionally, given the sensitive information involved in HR, the use of generative AI to support any function within human resources needs to be handled with care,” she said. “Companies should balance their excitement around efficiency with proper governance, review, policies and controls.” 

Related Article: NYC's New AI Bias Law Is in Effect. Here's What It Entails

What Generative AI Can and Cannot Do

Generative AI, like many technologies, is a tool. “I like to think of generative AI as a recruiter’s sidekick,” Tilley said. It can help free up recruiters’ time to focus more strategically on areas such as cementing employer brand, building relationships with candidates and conducting better workforce planning. 

While some business leaders might see AI as a way to reduce headcount, Horton cautioned that the technology has yet to reach that level of sophistication.

Really good generative AI “can only deliver about 80-85% of what you need. They still require the human to add the nuanced context that’s relevant to the organization,” she said. “It is not prepared to act solely on its own to replace the human capacity required for these roles.”

The goal should be to invest in systems that enable more strategic outcomes. “It’s one of the most transformative technologies we’ve had in generations, and it will completely transform business across all functions,” Horton said.

So, talent acquisition professionals should focus not only on how to leverage generative AI in their own roles, but also on how it’s being used across other areas of the organization, so they can better support hiring practices, including job descriptions, in those functions.

About the Author
Lauren Dixon
Lauren Dixon is a Chicago-based freelance writer, editor and copy editor with nearly a decade of experience writing about talent management and leadership. Her work has appeared in Reworked, Chief Learning Officer and LoganSquarist, to name a few.

Main image: Adobe Stock