The Innodative Disruptor

Who's Gonna Drive You Home?

It's hard to avoid the incessant bombardment of dire threats about the end of work as we know it. The pressure drives workers to question their employers' intentions, and parents and their children to question their future. Authoritative figures seem happy to tell us how artificial intelligence is changing the nature of white-collar work. Over the past year, public statements and interviews by OpenAI CEO Sam Altman, Anthropic CEO Dario Amodei, Microsoft CEO Satya Nadella, Google CEO Sundar Pichai, and other technology leaders have predicted rapid, large-scale disruption of white-collar work, particularly at the entry level. Entire job categories are predicted to disappear, replaced by technologies that are faster, cheaper, and increasingly capable. The timelines are short. The confidence is high. Change is coming tomorrow!

I've heard this story before.

A few years before Covid, I was driving my youngest son to the DMV so he could get his license, a rite of passage familiar to many of us.

For much of the recent past, being able to drive meant freedom—especially for youth. Driving a car meant control over where you could go, with whom you might hang out, and how you moved through the world. From Springsteen's Born to Run to Tracy Chapman's Fast Car, driving wasn't just transportation; it was agency. It was independence. It was adulthood. It was quintessentially American!

As I watched him drive, however, I was struck by a sudden thought: he was in the last generation to perform this ritual. As I told him at the time, when he is in my seat, cars will simply drive themselves, so what need will there be for a license? This moment was unlike the ones I experienced with my other children: less a rite of passage and more like the end of the line.

Given the outsized role cars play in popular culture, it wasn't surprising that autonomous cars captured the public imagination, and my own, so quickly. The promise wasn't just better traffic flow or fewer accidents. It was the belief that one of the most cherished youthful dreams—sitting behind the wheel—might no longer exist. Or, as Ric Ocasek, lead singer for The Cars, asked back in 1984, "Who's gonna drive you home tonight?"

At the time, the answer seemed obvious: the car itself.

The Promise of Full Autonomy

Early demonstrations of self-driving technology were impressive enough to make full autonomy feel both inevitable and imminent. High-profile competitions, rapid technical progress, and ambitious bets by major technology companies created a widespread belief that human drivers were living on borrowed time. The DARPA Grand Challenge (2004–2005) is widely regarded as a catalyst for modern autonomous vehicle research, and in the mid-2010s auto industry leaders such as Elon Musk, Mary Barra, and Mark Fields predicted fully autonomous vehicles within a few years, prompting substantial investment across the automotive and technology sectors. Once machines could handle most driving scenarios, it seemed reasonable to assume the rest would follow quickly.

What followed was not failure, but something more instructive.

The Hard Part

Autonomous vehicles continued to improve, and in many controlled settings they now work remarkably well, as demonstrated by Waymo's large-scale driverless deployments and Tesla's early unsupervised testing. But the difficulty lay in moving from mostly autonomous to fully autonomous, that is, the last few percent. The remaining gaps turned out to matter far more than their frequency suggested: rare situations, unpredictable human behavior, construction zones, temporary lane markings, emergency scenes, bad weather, and questions of responsibility. These "long-tail" cases dominate residual risk despite representing a small share of total driving time. They also challenge human learners, but autonomous systems face a higher standard. Any accident draws intense scrutiny, triggers regulatory review, and amplifies public hesitation in ways that learner accidents do not; high-profile incidents involving autonomous vehicles significantly eroded public trust and slowed deployment across multiple U.S. cities. So humans stayed in the loop: sometimes behind the wheel, sometimes monitoring remotely, sometimes simply remaining accountable. Even in fully driverless deployments, human involvement persists through remote monitoring, fleet oversight, and legal accountability structures.

The lesson wasn't that autonomy was impossible. It was that full replacement required something close to total coverage. And that the final stretch was slow, expensive, and socially complex.

What This Means for Work

I believe that this dynamic offers a useful way to think about current debates around AI and white-collar job displacement.

Today's AI tools can already perform many professional tasks such as drafting text, analyzing data, writing code, and summarizing documents. In specific contexts, they are often faster and more consistent than humans. As with autonomous driving, early demonstrations create a powerful sense that complete replacement is imminent, and the accompanying warnings sound urgent, even inevitable.

But replacing a job is different from assisting with one. Full replacement implies something closer to autonomy than augmentation. It means handling not just the common cases, but the exceptions. Not just routine execution, but the moments where judgment, explanation, and accountability matter most. Those moments may be infrequent, but they carry disproportionate weight. The last few percent matter!

Here, the analogy to autonomous vehicles is imperfect but instructive. In driving, near-total reliability is non-negotiable; anything less puts lives and property at risk. In most white-collar work, the calculus is different. Organizations can capture much of the value of AI without eliminating humans entirely. Keeping people in the loop to review, approve, or intervene often remains cheaper, safer, and more acceptable than pushing for full automation, a tradeoff often discussed under the comparative advantage thesis.

Popular culture provides useful reference points. For example, in Top Gun: Maverick, the aging pilot isn't sidelined by autonomous systems or newer technology. Instead, his value lies precisely in what the machines lack: judgment at the edge, intuition under uncertainty, and responsibility when the stakes are highest. The future arrives, but it turns out to work better with humans still in the cockpit.

However, many believe AI represents a fundamentally different challenge than autonomous vehicles, whether cars or fighter jets, one that will genuinely displace knowledge workers. Geoffrey Hinton certainly thought so. In October 2024, the Royal Swedish Academy of Sciences awarded the Nobel Prize in Physics jointly to Hinton and John J. Hopfield for foundational discoveries and inventions that enable machine learning with artificial neural networks, recognizing his foundational contributions to AI. In 2016 he famously stated that we should stop training radiologists because machines would soon do a much better job. Yet, world-famous AI expert or not, that prediction hasn't aged well.

The need for radiologists hasn't dropped; it has risen, even as AI tools have become common in the profession (see, for example, recent Financial Times coverage of AI and radiology). One main reason is the last-few-percent problem: humans plus artificial intelligence often perform best. AI can quickly point out anomalies, but humans can catch things outside the AI's training data. But that is only part of the story. With AI's help, radiologists can do more in the same time, so the price of a radiology consult drops and the demand for the service grows. This is an interesting example of Jevons's paradox: when technological improvements make a resource more efficient, overall consumption of that resource often increases, not decreases. As radiologists become more efficient with AI, demand for their services can rise; they are not an endangered occupation.

Sharing the Wheel

To be clear, I am not saying that work won't change. It already has. Tasks are being compressed, roles reshaped, and expectations adjusted. But the path forward is likely to be slower and more uneven than early demos suggest or AI leaders might hope.

Autonomous cars did not fail; they continue to mature. They taught us that the hardest part of automation is not getting machines to work most of the time, but deciding when we are comfortable stepping aside. AI and jobs appear to be following a similar trajectory. Progress will continue. Capabilities will improve. But humans may remain in the driver's seat longer than many early predictions assumed; not because the technology stalled, but because the final handoff is harder than we think.

In 1984, The Cars asked who would drive us home. Forty years later, the answer is more complicated than they thought.

The path forward isn't choosing between human or machine; it's learning when to steer and when to let the system assist. In full transparency, I used AI while writing this piece: Claude, ChatGPT, Gemini, and Perplexity helped me research autonomous vehicle history, refine arguments, and draft sections of text, though the structure, reasoning, and conclusions are my own. I used these tools not to replace my thinking but to sharpen it, an example of the human–AI partnership I promote in this piece. That partnership feels like the right model for this story. I'm still responsible for the destination. I just didn't drive every mile myself. Now if I only had a car that did the same.