Editorial

Working Faster, Thinking Better

By Maggie Pearce
Navigating speed, volume and meaning in an AI world.

AI has changed the pace and volume of work. Drafts appear faster, alternatives multiply and what once took days now happens in minutes, with expectations stretching to match the new capacity. Leaders are producing more than ever, yet still feel behind.

This isn’t a story of resistance to AI. It’s what happens when AI accelerates work faster than the conditions around thinking and decision-making can adapt. The tools have moved quickly. The experience of work has shifted with them, often in quieter, less examined ways.

Two tensions show up repeatedly. The first is a volume loop. AI makes it easy to generate large amounts of content and analysis, in seconds rather than days. That output then becomes something others need to process or condense, often by putting it back through AI again. Throughput rises; shared understanding doesn’t keep pace.

The second is a thinking squeeze. As AI increases the pace of work, expectations on speed often rise with it. Turnaround times shorten, responsiveness becomes the norm and the space for sensemaking quietly shrinks, with thinking expected to happen inside a faster flow of work.

Taken together, these dynamics create a very specific kind of pressure: more to absorb, less space to make sense of it.

What Speed Can’t Replace

Despite all this acceleration, some things haven’t changed.

The human work of thinking still involves noticing, exploring alternatives, testing assumptions and connecting actions to consequences. Sensemaking still takes time. People still need to see their workings, to understand how a conclusion was reached and to be able to explain it to others.

This matters most when the stakes are high. Strategy, risk, people decisions, ethical calls — these don’t become simpler just because information is faster to generate. Often, they become more complex because there is more to integrate and weigh.

AI speeds up production, but sensemaking still takes time. That isn’t inefficiency or hesitation; it’s how responsibility works when decisions are made with — rather than delegated to — AI. Leaders aren’t judged simply on answers. They’re judged on the quality of the thinking behind them.

When leaders feel confident about decisions, it’s rarely because they moved the fastest. It’s because they understand the reasoning, the trade-offs and the implications, and can stand behind them.

When Capability Quietly Becomes Expectation

One of the subtler shifts AI has triggered is an unspoken recalibration of what’s expected.

Because AI is available, it can feel as though we should now be able to do more: faster responses, greater output, greater certainty. AI-enabled capacity quickly becomes a new baseline.

The impact isn’t uniform, but the tension is widely felt. As output becomes easier to generate, work can very quickly skew toward more. More material, more options, more documentation. At the same time, expectations rise on pace, leaving less space to think and make sense of what’s been produced.

This isn’t unique to AI. Research on decision-making under pressure consistently shows that as volume and pace increase, clarity and judgment decline unless space for reflection is deliberately protected.

Leaders remain accountable for decisions and consequences, even as the environment makes that accountability harder to uphold.

The Unease of Not Being Able to Stand Behind the Work

This is where discomfort, and sometimes fear, enters the picture.

A number of examples have emerged in recent years of AI being used without sufficient care: fabricated citations, fictional case law, confident-looking outputs that collapse under scrutiny. In some cases, those resulted in lost credibility, trust or even jobs.

What’s striking in these stories isn’t just the error. It’s that the individuals involved couldn’t explain how they got there.

That loss of traceability creates real anxiety. Leaders don’t just worry about being wrong; they worry about being responsible for outcomes they can’t fully defend. Accountability without visibility is deeply unsettling, especially in roles where credibility depends on the ability to explain decisions, not just present results.

Much of the research on human-AI collaboration highlights this same risk: confidence rises when people can see and explain the reasoning behind an output, and drops sharply when they can’t.

So the unease many leaders feel isn’t about AI replacing them. It’s about owning work they don’t entirely understand.

Why the Conditions Matter More Than the Tools

This article isn’t an argument for slowing down or resisting progress. Speed is now an expectation. AI brings genuine value. But as AI accelerates work, leadership increasingly involves shaping the conditions in which human thinking happens alongside AI-generated output.


The conditions we create determine whether AI becomes a support for clarity and confidence, or a source of additional risk. If work isn’t deliberately shaped to include space for sensemaking, even informally, speed tends to overwhelm it. When that happens, clarity erodes, confidence drops and anxiety rises.

This isn’t about adding process or bureaucracy. It’s about recognizing that thinking is part of the work, not something that fits around it if time allows.

When Thinking Has Space, Something Different Emerges

When work with AI allows thinking to happen, something noticeable shifts.

Leaders feel more grounded. More confident. More curious. More engaged. Instead of rushing past complexity, they stay with it long enough to understand what matters.

In these conditions, AI plays a very different role: not as a shortcut, but as a thinking partner within an intentional back-and-forth.

When thinking is rushed, AI accelerates surface-level output.

When thinking has space, AI supports depth, exploration and meaning.

Used this way, AI doesn’t replace human thinking; it responds to its quality. The value moves away from sheer output toward iteration: testing ideas, refining understanding, making meaning visible.

That’s when leaders feel better able to stand behind their work — not because they moved more slowly, but because they thought more clearly.

A Moment to Take Stock

AI has changed the pace of work. It has intensified speed, volume and complexity.

Navigating this moment isn’t about choosing between faster work and better thinking. It’s about noticing how the conditions of work shape our ability to make sense of what we’re doing, and what we’re accountable for.

Where might speed be crowding out clarity?

Where does thinking feel squeezed, rushed or invisible?

And where might small shifts in how we work with AI make better thinking possible again?

These questions don’t have definitive answers, but staying alert to how speed and volume affect judgment is key. Leaders can begin by deliberately building short pauses for sensemaking into fast-moving work: making thinking visible and treating it as part of the work itself.

Editor's Note: How else can businesses balance the demand for speed with the demand for thoughtful decisions?


About the Author
Maggie Pearce

Maggie holds a global role at Impact, where she leads the development and sharing of Impact’s learning practice while designing and delivering some of its most complex client solutions. She is the creator and pioneer of Solution Mapping, Impact’s consultancy framework, and brings deep expertise in evaluation strategies, leadership simulations and innovative solution design.

Main image: Khanh Nguyen | Unsplash