As companies mandate the use of artificial intelligence, they often set parameters for how frequently employees should use the technology. But to get value from AI, they should also focus on how employees use it, and whether it’s truly solving complex problems and reducing their workloads.
For instance, tax, audit and advisory firm KPMG had “really great adoption rates” for AI across the organization, but in some cases, “the day-to-day wasn’t feeling that different,” said Edwige Sacco, head of workforce innovation at KPMG. The company focused on measuring AI use by volume but recognized it needed to go deeper, she said.
To address the challenge, KPMG and the University of Texas at Austin McCombs School of Business conducted a study analyzing 1.4 million AI interactions from the firm’s back-office operations to identify differences between sophisticated and routine AI use. They found that roughly 5% of users consistently demonstrated sophisticated use, which helped the company create a blueprint for what effective AI use should look like.
The study, published in March in Harvard Business Review, found that the most successful AI users applied AI across their work, used AI to solve problems and directed how AI models approach tasks. Moreover, they weren’t the employees who used the technology the most or had top-notch technical skills.
What Is ‘Sophisticated’ AI Use?
The research identified sophisticated users as employees who treat AI as a “reasoning partner,” directing how the model approaches problems by asking it to assume certain roles or perspectives and offering examples. They probe how AI “reasons through a task,” asking it to explain its responses and providing feedback. These users don’t simply accept AI’s output or view it only as a productivity tool.
Thinking of AI as “a transactional exchange” doesn’t bring value to a role, Sacco said. What does is viewing the technology as a partner or collaborator.
“Our job is to guide its thinking,” Sacco added. AI users should be “constantly iterating” and learning with the technology. Context matters, too. AI should be embedded into an employee's workflow, not add more work, she said.
AI-Compatible Skills and Knowledge
“In all the excitement around AI at work, what is often overlooked is that knowledge has many dimensions,” said Dirk Lindebaum, professor of management and organization at the University of Bath.
In a separate study led by Lindebaum and published in the Human Resource Management Journal in February, researchers showed that AI is most compatible with knowledge related to procedures, data, processes and routine.
AI is “excellent at processing information,” Lindebaum said. But the technology is less able to reproduce knowledge related to practice, hands-on experience, shared values or judgment, which is often developed through lived experience, according to the study. This suggests that AI use sometimes distances employees from real-world learning experiences, he continued.
For instance, an employee may turn to AI to automate policy updates or training manuals, Lindebaum said. But, “if this comes at the expense of worker familiarity with these tasks and processes, workers may be ill-equipped to fix any problems in the future.”
Employees first need to be familiar with their tasks before outsourcing them to AI, Lindebaum said. His research suggests that HR leaders should balance AI efficiencies with “mentoring, shadowing and reflective first-hand experiences.”
How to Turn Employees Into Sophisticated AI Users
Companies typically have a “spectrum” of AI users in their ranks, said Maruf Ahmed, CEO of IT and workforce consulting firm Dexian. Some employees use AI just to use it, he said. Others strive to learn as much as they can about the technology, with an eye toward how it improves their work quality and productivity and benefits the organization.
Here are some tips for steering your teams toward becoming sophisticated AI users:
Explain the 'Why'
Make sure employees understand the reasons for introducing AI tools, including how they’ll be used, Ahmed said. For example, in HR, explain to recruiters that AI speeds up posting job descriptions and screening applications, freeing them to focus on interviewing candidates and building relationships. This removes the fear of AI and builds trust, he said.
A lack of clarity on why employees need AI creates risk, Sacco said. If someone doesn’t understand the purpose of an AI tool or process, they may be less likely to check an AI's output before using the information.
This is a problem, Ahmed agreed. A recent survey of more than 1,000 employees by Resume Now found that 35% of workers “rarely” or “only occasionally” review AI-generated output before they use it.
This might not be because someone is lazy or malicious, but because “they haven’t figured out what’s in it for them,” Sacco said.
Redesign the Work First
Avoid asking employees to layer AI on top of their regular work, Sacco said. “There has to be a deliberate effort to actually look at the workflow and figure out how the work itself has to be redesigned, so we’re not just adding AI on top of a broken or inefficient process.”
AI is often touted for alleviating repetitive tasks, freeing up workers for creative work, Lindebaum said. But research has suggested that the time freed up is often filled with more work.
“Not being attentive to this difference, in my view, is one mistake that organizations should avoid,” Lindebaum said, adding that he expects further research will reveal a “much more nuanced picture about the benefits of AI at work, compared to enthusiastic advocacy of AI we see right now.”
Define and Reward the ‘Right’ AI Behaviors
Organizations must define what sophisticated AI use means for them, including the tools they're adding, the goals they're trying to achieve and the problems they're trying to solve, Ahmed said.
Ahmed’s company created an internal AI steering committee so that employees across the organization use the same AI tools to solve business problems, rather than the tools “just being there for the sake of productivity.”
KPMG is “very focused on changing expectations around AI,” Sacco said. Instead of just offering a tool to employees, organizations “have to tell them what it means to use it well in your job.”
The goal is to reward quality AI use over quantity, Sacco said. “We have to change the expectation of what ‘good’ looks like so we're incentivizing the right behaviors and building a culture that rewards what matters to us, which isn't just more work, more hustle.”
Protect Human Capacity
Creativity and critical thinking should be protected with the rollout of AI, Lindebaum said. He suggested creating “learning vaults” to help employees maintain their “reflexive and experience-based knowledge essential for human capital.”
For instance, create a workplace where employees must learn the basics of a task, process or routine before offloading it to AI. Otherwise, offloading work to AI “undermines the ability of workers to provide informed answers about these questions,” Lindebaum said.
This approach helps employees maintain human-centric knowledge and become adept at adapting to new challenges and finding new solutions, Lindebaum said.
Train and Upskill
KPMG launched a firmwide training effort to help employees build more sophisticated AI skills and behaviors.
“We moved AI exposure into the work itself,” Sacco said. “It was the idea of bringing learning closer to the flow of work.” This involved providing AI tools and showing employees how to use the tools to help finish their daily work.
Dexian partnered with an outside organization to provide training to upskill and reskill employees on AI literacy, Ahmed said. Now, the company offers role- and responsibility-specific AI training.
Training helps employees feel secure using AI, including how it fits into their work and fulfills a company's purpose, Sacco said. “People need certain things to be comfortable,” she said. “They need structure, they need more direction. Not everyone wants to play and experiment at work.”