AI feels unprecedented, but it isn't. Heather Broline breaks down 250 years of tech waves — and what history says businesses should do right now.
Heather Broline on What 250 Years of Tech History Tells Us About AI

In Brief

  • AI disruption isn't unprecedented; it just feels that way. Every major technological wave follows the same pattern: first hype and noise, then a tug-of-war between tech and society, then eventual value once the two align. AI is no different; we're just in the noisy early phase.
  • Humans shape technology, not the other way around. From seatbelt laws to nuclear treaties, society has always pushed back and redirected powerful technologies. The same will happen with AI, but the sooner people, organizations and governments get involved, the better the outcome.
  • Get your workplace in order before chasing AI. Leaders should focus on whether their organizations have even mastered the last wave of technology, like automation tools, before layering in AI. And when they do adopt AI, bring people in early — including the skeptics — because that friction is valuable, not a problem to steamroll.

Heather Broline has been studying the last 250 years of technological waves for her PhD. Her message for businesses: while AI disruption may feel unprecedented, it's following the same patterns as previous waves.  

The challenge for businesses is how fast they can push through the early noise and inevitable pain to reach the final stage, where humans and technology work together to produce true value.     

Speakers

Siobhan Fagan

Siobhan Fagan is the editor in chief of Reworked and host of the Apex Award-winning Get Reworked podcast and Reworked's TV show, Three Dots.

Episode Transcript

Siobhan Fagan: Hi everybody and welcome to today's episode of Three Dots. I'm Siobhan Fagan, Editor-in-Chief of Reworked and I'm happy to be talking today to Heather Broline, an expert in integrating advanced information technologies in the workplace.

I had the distinct pleasure of speaking with Heather before about a big change management initiative that she led at the Air Force. Heather is in the process of getting her PhD on this idea of technological change. People always say those who can't remember history are condemned to repeat it, which is a good way to approach our current phenomenon. We're talking about AI, of course. Everyone's saying it's unprecedented. Heather is here to disabuse us of the notion that it's unprecedented. So, Heather, thank you so much for joining me.

Heather Broline: Thank you for having me. I'm really happy to be here.

Siobhan: So you've been working on this PhD for a while. And in the process, you have been studying the last 250 years of techno-economic waves of change. What brought you to this topic in the first place?

Heather: There are about three things that really propelled me down this path:

  • One is the potential of these new technologies. I am really hopeful about some of these persuasive technologies that are out there, that they might make the workplace a better, healthier place. Examples are technologies aimed at improving workplace culture, civility and pro-social behaviors.
  • Second, exhaustion. I have spent my career helping organizations integrate their technology and their humans, and we know the best ways to do this. There are books written, there are experts out there teaching it, yet we continue to fail in our technology implementations to the tune of 70%. So why do we keep failing is a big question that I have.
  • And then third, there's this sense, you said it at the beginning, there's this sense that there's something new, something different about this technology. And I really wanted to know if that was real or imagined.

We know that the technology has grown increasingly intimate. We wear our technology. We are integrating technology into our bodies. So there's something about this technology that's increasingly interconnected to us humans.

Technologists call this the sixth technological long wave. It's the quantum technological revolution. If your audience members are in the manufacturing sector, they might call it industry 4.0, as in the fourth industrial revolution.

This wave is characterized by distributed and decentralized technology that begins by disaggregating information into its smallest bits — be it quanta or DNA or words or sounds or pixels or code. We are disaggregating that down to the smallest pieces and then re-aggregating it into something of value for industries or for societies. Examples of these technologies in the sixth wave are predictive AI, generative AI, hyper-automation, quantum computing and blockchain, and we're also seeing it on the social side with the likes of Wikipedia, Wikinomics or the gig economy. Each one of these technologies or capabilities is taking that information, or society, breaking it down into its smallest bits and reforming it into something new.

Can We Make Direct Comparisons With Previous Technological Waves?

Siobhan: You just mentioned Industry 4.0 as one of these reference points for what this current wave of technology is. And I think that a lot of times we use 4.0, that kind of versioning, as a shorthand so people can compare to previous technological waves. So if you say AI is going to take jobs, people automatically jump to ATMs and how ATMs didn't take jobs.

What do you think about these comparisons? Word processing in the workplace, introduction of personal computers. Are these accurate comparisons when we're looking specifically at the large language models? Can we make that one-to-one comparison?

Heather: Yes, to a degree. Technological waves themselves have repeatable impacts on society. So I'm just going to pull out to the macro level for a minute.

First we see innovation fizzling along in the background, and it doesn't affect us that much. We hear about it in news and anomalies, but it's really just a small perturbation. 

Then something big happens. Some breakthrough technology drops and it makes a big splash, a big punctuation in the way things are. And it comes with its promises of large profit margins for organizations, sociological change, changes to jobs, all of that.

As that wave splashes out, society has a mixed reaction to it. There's excitement at the new tech, the new potential, and that excitement translates into demand signals to our technology producers. But we also signal fear that this new technology is going to change the power structures of our society. Those responses serve as waves themselves; they're political demand signals, and they go to our political institutions.

Then you start seeing corporations start spending, sometimes wasting, capital expenditures on this new technology, trying to get some value out of it. You start seeing governments awaken from their slumber and invest in this technology as we saw with solar and alternatives to the combustion engine. Both of those processes start fueling production development and eventually these amplifying signals shape the technology towards something that is actually useful.

We're seeing this of course in AI. Along with the economic process, you also have the policy process. And that could be government policy or corporate policy or social policy, but it also goes through an innovation period. It's the same cycle: a lot of noise and hype at first, a lot of misfires and muddied waters. And then eventually you get these two coalitions on either side. One loves the tech and one hates the tech, or one wants to constrain it and one wants to let it run.

The policy swings back and forth like a pendulum until it gets to the point where the opposing views become indistinguishable and we meet somewhere in the middle. That's a dampening signal: first big and then it gets quiet. So most of this activity is, whether that's the policy or the technology or the governments or the people, it's a lot of noise at first. They're all activating, they're all firing, but without hitting their mark. In fact, there's just a lot of misspent energy that occurs long before we get value from this technology, value in our world, value in our economies, or value in our organizations.

What Stage Are We at With AI?

Siobhan: Would you say we're at that phase now?

Heather: I think we're at the noise part. A lot of noise. And the noise is AI, right? There's a shift that happens when we know we're starting to get value: the three streams align. You have technology, you have government and policy, and then you have the problems that need to be fixed.

It's the same within a corporation, maybe corporate policy. When those three streams really line up, they create this window of opportunity and then you really get substantial change. What happens is the technology is shaped by all of that activity and it really starts to create value. And then you don't hear AI, right? The headlines aren't all about AI takes jobs. They shift.

Let me give you an example because maybe you'll be more inclined to believe me.

Siobhan: You were reading skepticism on my face.

Heather: If you take the story of the automobile, first it was simply that we motorized horses. People weren't even sure they needed motorized horses, but that was what happened. We took transportation and motorized it. But over time, humans pushed back on that technology and shaped it. And then we had seatbelt laws and airbags. The technology of motorized transport may have sprawled across the world into technological domination. You're hard pressed to find places without automobiles now, or aircraft.

But it was redirected by these social systems. It was tamed by humans for human use. So now when we talk about the technology of a motorized car, we don't talk about the motor or the engine. We talk about EPA standards. We talk about oil wars. We forget that all of those are really meant to protect that motorized way of life. The technology has just expanded into how we live in our world, rather than our talking about the technology itself.

How Do You Influence AI's Trajectory Given the Shortened Timeline?

Siobhan: I want to unpack that, because you touched on a lot of different related threads in those last few responses. One of them was that you said the innovation was sort of fizzling underneath the surface, and then there was an obvious release that made the public aware of it. If we stick with large language models specifically, artificial intelligence has obviously been around since the '50s in different forms. What was new about ChatGPT when it launched was that it was generative AI, and adoption happened in a two-month timeframe.

So the condensed timeframe, the fizzling really did not have to fizzle long. When you talk about the changes that happened with the combustion engine, and we had the cars and then eventually we had seatbelts and we had airbags, but seatbelts weren't until the 70s [editor's note: seat belts became mandatory to install in all passenger cars in 1968]. So there was at least a 50 or 60 year gap before the policy caught up. Are we condemned to repeat history? Are we setting ourselves up where we have this condensed timeline of adoption? We have this condensed timeline of change happening in all these different areas of society, as you said, the political, the societal, the people. But we don't have the condensed timeframe to actually respond to it, do we?

Heather: People who study technological long waves will tell you that the waves are shortening. I'm a little skeptical on that, but that is the general consensus, that the waves of technology last shorter and shorter time periods. So then the question is, does the height of the wave change? Is it more change faster or what? That's an academic debate we're probably not going to get into, but the feeling is still there, that this feels fast.

You mentioned AI has been in the background for a long time, over 100 years. The quantum science it's built on goes back over 100 years, right? So this isn't new science. We've been building on the science for a long time. ChatGPT hit and it went crazy, went bananas. But which organizations are getting value out of it? That's a different question. Which homes, which organizational groups and societies are getting value out of it? That part is a slower ramp-up, but it's still faster nonetheless. It's much faster than the automobile.

But with the automobile, the technology wasn't dangerous at first. The original motorized automobile wasn't the SUV we have plowing through stop signs now, right? The technology got more dangerous as it went on, and humans shaped it.

I think that gets to the question or the point about what is our role? Is it just this technology that's going to plow us over? I don't think that that's the case. It's not about technology — do we run from it? Do we resist it? It's about our role in shaping these technologies so you have a role, parents have a role, organizations have a role in shaping this technology. Without the voice of the customer, without the voice of consumer advocacy groups or parental groups, this tech will remain unhelpful to our organizations. 

When we get involved, we start shaping it. So when we talk about technology, it's not what they call technological determinism, this idea that the technology is just going to plow us over. Instead, it's sociotechnical. You have the social processes shaping the technology and how it's developed, and you have the technology shaping our social systems. Anthropic is a perfect example of that right now. The government was trying to shape this technology, this AI use. And that organization pushed back. It was a gamble, right? But then the social community is trying to support Anthropic. Within a day, we saw OpenAI come out and say, Yeah, we modified our contract, too. We want them to not use it for mass surveillance with the intelligence agencies.

So those are examples of shaping the potential of this technology and that is what we all need to do to get the most out of it.

The Breaking Point for Organizations

Siobhan: I want to bring this into the workplace context, because that is who the Three Dots audience is. But you've raised Anthropic and I can't let that just sit there. Because in that instance, it was specifically the CEO of a company going up against the Pentagon, and then some other CEOs of other super powerful companies saying the same. There wasn't — I mean, people obviously were not happy about the prospect of mass surveillance — but it was mainly two main actors who were involved in that. So when you talk about the socio-technical moment, where do we as individuals come into it? Because you said we all have a part to play. And I guess I don't see that happening right now, specifically with the large language models and the concentration of power in the hands of the few.

Heather: I wouldn't argue that we don't have concentration in the hands of the few. We do. And that is a feature of each of these technological waves. In each of these ascents, we tend to have that concentration of power. We had it with the steel barons, the robber barons, the oil magnates. This does happen. But it's not in isolation.

If it were just that one voice from Anthropic and just the one voice from the Secretary of Defense, it wouldn't have amounted to much. Instead, we've had a prolonged and heated debate in government and in the AI companies about the danger of this tech. Now, I'll grant that the techno-rationalists, the tech bros as some call them, are on the upswing. But if we look at any other example in our history, there's always been that backswing of the human potential and why we need to protect it.

We built nuclear weapons. We used them. We controlled them. We built treaties, international treaties and decrees around them. And we still have young men and women, mostly men, out there with nuclear weapons. So the technology is going to persist through, and the debate is going to persist through.

Trying to bring that back then into the business, what does it feel like in our organizations? We follow that same path.

So the path I laid out in the bigger economy was this techno push, this rational push, followed by the human or social backlash, and then those two shaping each other over time. It's the same in an organization. First, in your management model, we respond with radical rationalization of the organization. That means we get the technology, we redesign the org, we get rid of the people who don't fit, we train up the new people. It's very focused on reshaping the organization to fit the technology. This is a painful time. We lose a lot of good people, we lose a lot of knowledge, we lose productivity. There's confusion. You start questioning the purpose of the organization and the tech implementation.

A lot of organizations fail at this point. But then, and this has happened in every single wave so far, the workforce pushes back and says: the technology is here, but the organization itself is here to support humans, to provide things for humans. It's not humans for technology. We've got that backwards.

When we get to that point, we finally start to see humans valued in the organization and human expertise brought into the process. Examples of that are knowledge management, organizational change management and communities of practice. And then eventually we get to this place of sociotechnical management, where we try to maximize the value of both technology and people in a joint value proposition.

When we get here, we get real sustainable value from technology. So first pro-tech, then pro-people, then the integration of the two.

Where Business Leaders Should Focus Their Efforts

Siobhan: If someone in our audience wanted to know where they should be focusing right now, what would you argue based on this historical model? Two things that caught my attention were: we redesigned the org and we trained the people. Those are two things that we've heard over and over again aren't actually happening. Is that where they should be focusing now or what should they be doing in anticipation of these waves?

Heather: Well, if you think of this as a process that first your organization goes through — the pro-tech piece and then the pro-human and then finally integration where you get value — the task to our senior leaders and our managers then is to speed that up. Yes, you're going to adopt new technology. But you want to get your people involved as fast as possible and protect their human expertise. If you adopt this new technology and jettison all of your senior leaders and all your people who have big salaries and have been there forever, and the people who are saying no, you're going to lose a tremendous amount of human expertise, and that's going to set you back.

So you really want to speed this process up to where you get through all three phases. Maybe the question is, how does a leader know where they are in that process? I would urge anyone listening right now to look back to fifth wave technology. Where are you in that process? Take robotic process automation (RPA). It's widely available. Anyone in your organization who uses a computer or a handheld device could be using RPA right now to get more value out of the organization.

So the first question to ask is, are you using RPA? And if you're not, why not? And if you go through this process of the five whys of how you got there, you might find that you haven't gone through this process of rational human and integration yet for fifth wave technologies. And if you haven't done that yet, it's going to be a lot harder and a lot more expensive to go through it with sixth wave technology.

Siobhan: In the case where you're saying that they have to speed it up, your recommendation would be to do an assessment of where they are in their technological stack, make sure that they're actually up to speed with the last wave of technology and then basically scramble to find the value where it balances both the technology and the people as quickly as possible?

Heather: Yeah, it sounds so easy, but ... The work in fifth wave technology will get you ready for the sixth wave technology.

Going back to RPA, it is individualized tech. AI is individualized tech. The reason it's valuable is because you can have a conversation with it. You can interrogate data with it. You can co-create with it. That is one human and one piece of technology working together. It's very intimate. RPA gets you ready for that.

So what does RPA do? It builds your technological literacy for one. Hyper-automation, that's a sixth wave technology, that's AI plus RPA. So you're not going to hyper-automate your environment if you haven't already given your people access to this tool. When they play with this tool, they start to think about their work in terms that the technology can understand. And that translation between humans and technology really happens in setting up good knowledge management, good process management and automating these processes.

If you're sitting in an organization right now and you're on RPA, you're ready for the sixth wave. Your people can already talk to technology and your technology already speaks the language of your people. Add in AI and you're going to be off and running. If you haven't gone through that process yet, you probably haven't set your organization up to rapidly go through the rational, the humanistic and then the socially integrated processes.

Involve People Early in the Change Process

Siobhan: I'm having flashbacks to our previous conversation where you very much got into the psychology of change management and how people respond to having parts of their jobs taken away or reinvented. It sounds like very much what you're recommending is bringing people in as early as possible so that they have a certain sense of agency when it comes to adopting this technology.

Heather: Absolutely. I'm drawing on the most integrated management models from fifth wave technology, which would be organizational change management, digital transformation, the Agile Manifesto, agile processes or SAFe processes. Those are all management models that put humans and technology as close together as possible, starting as early as possible in the process. So absolutely.

If you're a leader in your organization and you wanted to influence the shape of technology running through, I would start at the bottom. I would go find the problems in your organization that are small and timely, or small and costly in terms of time. Identify your potential innovators. They are there. Just ask your people. There are tons of folks out there who enjoy solving these micro-problems in your organization and many of them are doing it in secret.

I find that over and over again when I talk to folks: do you have an innovation cell? Yeah, yeah, they're doing that over there. Well, what are you doing? And there's all of this small micro-innovation happening, people just trying to use whatever technology they can find to solve their problems. So start at the bottom, find the people who are doing that, and then diffuse that wisdom and that skill out from there. They become your first technocrats in the organization. Find these people and nourish them. Give them the resources, the skills, the time and the other people to build these small human and technology integrations. Build technocrat pipelines from them, bring new people to them, show them what they're doing and diffuse that capability across the organization.

They're going to have a lot of false starts, a lot of random processes they make, but what you're building is internal expertise on your people, your organization, your problems and your technology. And it's that internal expertise that you're going to call on to build the policy, the AI governance and the other supporting structures that a social organization needs to get the most out of that technology.

As you do this, you're building up this potential for change, innovation and technical literacy in your organization. And then you, as the senior leader, open the window. You align your problems, your policy and your politics. The politics part means you find the senior leader who's going to be responsible for delivering organizational return from technology. Not just turning technology on, but organizational return. Encourage them to define that organizational value as aggregating from the bottom up. That means count the headaches you've saved, the minutes you've saved, the processes you've consolidated, the flexibilities you've found. These are your value-add metrics. These are your sociotechnical metrics.

Only when you have all of these things together, aligned, then you turn to these vendors and let them sell you the latest AI models. Only if you nourish these people and provide them this opportunity to learn and grow with the technology will you be able to speed up that cycle between adopting technology, getting it to fit the humans and then integrating.

Where the Current Technological Wave Differs

Siobhan: So we've been talking about how these technological waves have certain historical precedents, they always follow the same patterns. And yet we've also acknowledged that there are some differences in the cases of our current technological wave in terms of speed and other things. So how much should business leaders be basing any decisions on previous waves of technology, and how much should they be keeping that in mind, but approaching this with fresh eyes?

Heather: I talked about the social patterns and the management patterns. But each technological wave is unique because of the technology and the science behind it. So in this wave, we're talking about quantum technology, quantum theory and quantum thinking. That means shifting how we think, from management to politics to even home and social life.

We want to let go of linear thinking. A lot of the previous automation that we've done and factory management we've done had this linear process through time. With quantum thinking, I urge leaders to think in terms of waves. Waves of technology, waves of social process. And you want to aggregate those waves.

So you might have innovative technology in your organization. You want those waves to amplify. You want the technology coming out of those small little parts of your organization. You want to help structure their activities so that what they do amplifies a desire for change rather than dampening each other out. It's a different way of thinking.

As leaders, we can steer potential, as opposed to the linear processing of traditional strategy models. We're going to start steering the potential of different components in our organization. It's also about thinking in terms of entanglement, a key component of quantum thinking. Think of your human systems and your technology systems as forever entangled. Stop thinking of humans and technology as separate input costs to your organization. Changes to one have an impact on the other. Defining one defines the other.

Think instead of these joint integrated systems and how you can amplify their waves to get the impact you need or where you're having problems, how you can use those technologies to dampen the waves and let it simmer down. The leaders who can get their brains around this, around this quantum thinking, around quantum management, they're going to be the ones that ride this wave the best.

Find Your 'Office of No'

Siobhan: I want to close out soon. And I want to bring it back to where we started, where you talked about how technology doesn't really come about in isolation and how we can affect how it changes. We can influence the flow of it. Right now, there are two extremes: either AI is catastrophic or it's magical. And if anybody pushes against either of those, suggesting that perhaps it's not magical and there are some things we should work out about it, they get shoved into one of the two camps. What can people do to start shaping the technology without being pushed to one of these two sides?

Heather: You're absolutely right. Technology can't develop in isolation. It is constantly shaped by the social pressures around it. So if you don't have people pushing back, your development and your innovation within your organization is not going to do what you want it to do.

Unfortunately, every wave in the past, we have this bulldozing period where we just steamroll over all of the Luddites or the people who push back on the technology. But we can see this pattern now. We don't have to keep making the same mistakes.

I urge each organization out there: find your office of no. Find the people who are the Luddites or the stubborn or the disgruntled. Record your meetings with them and just let them talk. Ask them what would work, what won't work, why it won't work, examples where they've seen it fail in the past, the rumors they're hearing, the fears, the hopes, all of their concerns. Let out their wisdom. Don't fire them; gather their wisdom. Take these recordings and transcripts, feed them to AI and ask: Turn these complaints and arguments from the folks in our organization about why we can't use this technology into requirements for our organization. Turn them into our AI governance, our AI policy.

The wisdom that you capture in these meetings can either be ignored at your peril or catapulted into something that is uniquely designed for your organization, your people and their problems. It is a way to bring together both the social needs and the technical needs to guide your organization.

Heather's Response to AI Fears

Siobhan: Heather, is there anything that we didn't touch on that you'd like to raise?

Heather: There are a lot of real fears out there. We have war going on. We have all sorts of change in the environment in terms of jobs and industry. And the question that we get asked a lot is, Are all the white collar jobs going to go away? No. Did all the blue collar jobs go away when we added technology to manufacturing? No, we improved safety and saved lives.

But it wasn't because the technology did that. It was the social structures, it was the human structures, it was the unions, it was the protests, it was all these different societies that came together and pushed back on that technology to shape it to create human value. It'll be the same here with AI. Elon Musk built a factory without humans. Total automation. He said it was the biggest mistake he's ever made. Why? Because every time he had a problem, he had to turn to people.

It's going to be the same in knowledge management and any of these knowledge industries. We're going to create technology and at first it's going to push and cause damage, but the sooner we get involved, we can reshape it and create these new jobs and these new structures and these new techno-social systems.

Siobhan: Has there ever been a case where we've put the genie back in the bottle?

Heather: Yes, if you think about space travel, if you think about nuclear power, we've had a lot of examples. You think about autocratic government. We've had a lot of examples of where humans have just pushed back and said we're just not doing it anymore. And then it slow simmers. AI itself went through several winters where the hype exceeded the joy and then it simmers down and we get distracted with whatever else. So yes, if we start building Terminators, there's probably going to be a pushback and hopefully we just push back fast enough before it's too late.

Siobhan: Before we hear from Sarah Connor. On that note, I'm not going to go into Arnold Schwarzenegger impressions, but thank you so much. Thanks so much for joining me, Heather. I always enjoy our conversations.

Heather: I would love to hear that. Thank you so much. Thank you for having me on, I appreciate it.