Algorithms Have a Place at Work. So Does Transparency
The challenge of developing the workforce of the future is pushing nearly every organization to turn to AI-driven tools for help. Some are looking for ways to drive change among the change-resistant. Others seek new ways to increase organizational performance.
Meanwhile, employee needs and market demands are in a constant state of flux, too.
AI can help, right?
I write this with 100% certainty: absolutely. AI and algorithms have a place at work. But it’s best to know what you’re getting into before jumping in.
The Flip Side of AI
I’ve enjoyed taking briefings with work technology vendors at the forefront of developing some of the most powerful, yet transparent, AI in the world. No black boxes where data goes in and comes back out without explanation.
Some of these technologies can only be described as compelling discovery and personalization engines. Others can comb through data and highlight trends that would have been impossible to discover just a few years ago without a full-time data scientist.
But as powerful as these tools may be, use of them may come at a cost. And there are already some troubling examples out there.
For example, HR leaders have shown an uncomfortably high willingness to depend on algorithms to decide who gets laid off during a downturn. Others may look to the manipulative algorithms behind social media giants, which have proven to be bad for young people’s mental health, as inspiration for work technology that keeps people engaged and performing.
Even gamification, which isn’t necessarily dependent on algorithms to work effectively, has been scrutinized heavily — as in the case of Duolingo. When it feels like the game part of the gamification is more important than what you’re trying to accomplish, something has gone wrong.
Related Article: Ready to Roll out Generative AI at Work? Use These Tips to Reduce Risk
Good Intent Only Goes So Far
Yes, the powerful AI, algorithmic and gamification engines that drive people to doomscroll or spend a bit more time in an app than they intended could also help people do better at work. And the intent may be good. Employees generally want to do well at work, and having opportunities to get feedback and develop is one of the top asks they make of their employers.
But the fine line between effective and manipulative is quickly becoming an issue. As algorithms grow more sophisticated, and that growth only seems to be accelerating, there’s a point where their predictive and influencing actions threaten the very autonomy of employees.
When it feels like you’re being coerced into something, rather than guided toward an action you would take yourself, it doesn’t feel great. If you’ve ever played an addictive game and gone to bed later than you wanted, you’ve definitely experienced that feeling.
While a few hours of missed sleep may make you regretful — or sleepy — in the morning, how employers handle their people at work has far greater consequences. Algorithms and AI that turn manipulative can make employees lose trust in their employer.
Your workforce may also resist other changes if they were blindsided by the negative consequences of the suggestions they received. You could inadvertently turn your organization into a place that is slower to implement new initiatives or improve performance, for instance.
Ultimately, a willingness to take risks and be among the first to try new innovations could backfire spectacularly.
Related Article: When AI Discriminates, Who’s to Blame?
Be Transparent and Build Trust
So, if not AI, then what? What's the alternative? Organizations can’t simply ignore AI altogether.
If your organization is committed to delivering a great employee experience while pushing the boundaries of innovation, then you need to be transparent with employees. Treat them like adults, let them know what AI and algorithms are and aren’t affecting, and even let them opt in or opt out, if possible.
Listen to employee feedback about what is and isn’t helpful to them, as they, too, adjust to new technologies. First impressions don’t last forever, but they are important.
Leaders should also be mindful of building trust with employees, particularly through times of tremendous change. Nobody wants to be manipulated, especially into something they wouldn’t already want to do. If you think being micromanaged is unpopular, just wait until an algorithm is doing it. Whatever innovation path your organization takes, it is critical to create avenues of communication where people feel safe voicing concerns, and to respond to those concerns intentionally.
Of course, companies will have to figure out the right balance for themselves. Some will choose to move more cautiously, and others will go faster. Regardless of the approach, taking time to value employee autonomy and to act in ways that earn trust will prove to employees that you care.
Related Article: You Can't Automate Managers, No Matter How Good the Tech
Making the Future of Work Manipulation-Free
The future of work is promising. AI and algorithms hold tremendous opportunities for making working people’s lives even better.
But any new technology has downsides, and some are more consequential than others. Organizations must do whatever they can to avoid falling into the trap of manipulating and coercing employees with these new tools.
The future of work has to be a place where employees can trust their employers to treat them with respect and value their autonomy. If we act intentionally, with prudence, we can move together to build a future of work that values human contributions and is transparent in how it uses technology for the benefit of employees and organizations alike.
About the Author
Lance Haun is a leadership and technology columnist for Reworked. He has spent nearly 20 years researching and writing about HR, work and technology.