Experimentation Is the Future of Enterprise Learning
To put it mildly, there's been some recent discussion about how work – and therefore employee development – has changed and will continue to change.
To be clear, this is a good thing: The future of employee development cannot rely solely on the tried-and-tested, instructor-led training methods that learning and development functions are so familiar with. Thankfully, L&D functions are finally recognizing this fact.
The past two years have helped learning leaders see that the most familiar methods haven't kept pace with the way learning happens in organizations today, let alone helped organizations get ahead. As a result, we're seeing L&D functions venturing outside their comfort zones to experiment with all kinds of learning methods.
And there are lots and lots of learning methods to choose from. Many of them are familiar, but they're gaining newfound appreciation and use. Our research, summed up in this infographic, identified 66 separate learning methods (Fig. 1) that enable employee development. We're sure there are more.
But here's the thing: The learning methods that work for an organization will almost certainly change over time. Needs change. Workforce skills, preferences and characteristics change. The methods that work today may not work next week, next month or next year.
That means finding the right mix of learning methods is never a one-and-done thing. It requires continual assessment and revision. This is a huge shift for many L&D functions from linear, waterfall approaches to iterative cycles of experimentation, trial and error.
Experimentation Means Being Willing to Let Go
There’s an emotional aspect to this shift toward experimentation and iteration. Experimentation requires a willingness to identify and let go of methods that aren't working in order to make room to try new ones.
But L&D teams often work hard to implement certain learning methods. Think how much effort it can take to create a course, for example. Those methods are often successful for a while, but when they stop working (or don't work from the start) there is natural resistance to removing them from L&D's offerings. Letting go can be difficult.
Recently, we at RedThread Research spoke with dozens of learning leaders as part of our research on the future of employee development and, more specifically, the future of learning methods. These leaders helped us understand how they're implementing systems and processes to make it easier to experiment.
The leaders we talked to mentioned five specific techniques they use to help their L&D functions let go of what's not working. They are:
- Make learning methods disposable.
- Gather lots of data.
- Implement strategic pauses.
- Delegate offboarding decisions to business functions.
- Have an exit strategy.
Here's a look at each technique.
Make Learning Methods Disposable
Some leaders said they view learning methods as disposable or short-lived. Everything has a lifecycle, they said, which means it’s natural to offboard things that no longer work. This attitude makes it emotionally easier to cut out what’s not working, even if an L&D team worked hard to create it.
This technique of making methods disposable also helps with planning. Contracting, sunk costs, culture, inertia and a host of other factors can pressure organizations to keep learning solutions alive even if they’re not working or aren’t used. This technique puts the expectation and processes for letting go in place from the very start.
Related Article: 4 Ways to Focus Employee Learning
Gather Lots of Data
When a learning method isn’t working, L&D functions with an experimentation approach see that as good data, not a failure. One participant in an October 2021 research roundtable told us: "Get excited about all the data you can collect and the insights you can generate from testing what’s working and what’s not."
The leaders we spoke with recommended gathering as much data as possible, both qualitative and quantitative. They collect qualitative data like feedback from employees and their managers. They also gather quantitative information like usage, click and other data that can be gleaned from the tech platforms in organizations.
Leaders said they look for correlations between business metrics and more traditional L&D metrics to see if a learning method is working. Implementing processes to routinely collect and analyze learning and business data can help make the case for keeping, reworking or letting go of a given method.
Implement Strategic Pauses
Sometimes it can help to simply put distance between the realization that a learning method isn’t working and the final act of letting it go.
To create that distance, another leader at the research roundtable said her organization implements strategic pauses. These pauses allow the organization to offboard learning methods that may not be working without encountering as much resistance.
She said: "If something isn’t working, we’ll do a strategic pause. We stop providing the program and evaluate if it’s really needed. If so, we rework the offering. If not, we find a way to offboard it."
Particularly in organizations or teams that are unfamiliar or uncomfortable with experimentation, these pauses can help ease into a needed change.
Related Article: Cut Through Coaching Technology Confusion
Delegate Offboarding Decisions to Business Functions
If it’s difficult for L&D functions to let go of a learning method, one solution is to give decision-making responsibility to someone else.
One organization did just that. The L&D function asked various business units to take over the maintenance of the learning methods they use. If a method continues to be useful to the business unit, that business unit keeps it alive. If not, they let it go.
Handing keep-or-cut decisions to the relevant business function struck us as a forward-thinking way to ensure the right decisions are made.
Have an Exit Strategy
In some instances, L&D functions can foster experimentation by ensuring there's an exit strategy baked into the learning methods they try.
One element of this exit strategy is tactical: Make contracts shorter. Contracts for large learning technology implementations sometimes run 10 years. These long contracts allow plenty of time to implement the technology, work out the bugs and see the benefits for a while. However, sometimes an organization knows it wants to experiment and will jump to something better as soon as it's available. In such cases, contracts need to be negotiated with shorter terms.
Another element of having an exit strategy involves data. As learning technology gets more sophisticated, it becomes more integrated into an organization's tech stack. This means more data flows into and out of the system. Before offboarding a learning method, L&D functions need a plan for what will happen with the data contained in that method.
Building an exit strategy into implementation of learning methods makes experimentation much easier. Although shorter contracts and a data plan may not apply to all learning methods, the idea of having an exit strategy does.
The Future Requires Experimentation
We expect to see more organizations take a broader, more experimental approach to learning methods. The future of employee development in the new world of work will require it.
Focusing on experimentation and iteration can help L&D functions ensure that the learning methods they implement meet the needs of employees and organizations on an ongoing basis.
About the Authors
Dani Johnson is co-founder and principal analyst for RedThread Research, and has spent the majority of her career writing about, researching and consulting on human capital practices and technology.
Heather Gilmartin Adams is a senior analyst at RedThread Research. Trained in conflict resolution and organizational development, Heather has spent the past 10 years in various capacities at organizational culture and mindset change consultancies as well as the US Department of the Treasury.