I was at an internal audit conference last Monday. Smart people, good intentions, real conversations about where the profession is heading. And one thing kept coming up that I can’t stop thinking about.

Most of the auditors in the room are using Copilot. Not because they chose it. Because it’s what their company gave them. It integrates with Word, PowerPoint, Excel—the tools they already live in. It’s convenient. It’s approved. It’s safe.

Meanwhile, rumours from inside Microsoft itself suggest its own engineers are being encouraged to use Claude Code instead—because Copilot isn’t good enough.

Read that again. The company that makes Copilot is telling its engineers to use a competitor’s tool. And the professionals who rely on Copilot as their window into AI are forming their entire understanding of what’s possible based on a product that even its creator considers second-best.

This isn’t a technology problem. It’s an investment problem. The auditors at that conference are held back by the tools their businesses gave them—constrained by risk concerns, cost concerns, procurement cycles. And those constraints are creating a gap that widens every week between the people who are genuinely learning what AI can do and the people who think they already know.

My Education Is Obsolete. Yours Probably Is Too.

I have a chartered accountancy qualification, a CIA certification, and a degree from a business university that prides itself on practical education. I’ve spent thirteen years building expertise in audit—external, internal, consulting.

The knowledge part of that education—the frameworks, the standards, the technical procedures—is becoming obsolete. Not slowly. Rapidly. The things I memorised, the patterns I learned to recognise through repetition, the technical steps I can execute reliably—AI does all of that now. And it does it faster, more consistently, and without getting tired at 4 PM on a Friday.

But here’s what isn’t obsolete: the fact that my education taught me how to learn.

That sounds like a platitude. It isn’t. Let me be specific.

My university didn’t just teach me accounting rules. It taught me how to take a complex, ambiguous situation, break it into components, figure out what I didn’t know, find the right sources, synthesise them, and form a judgment. It taught me how to think through a problem I’d never seen before.

My consulting career didn’t just give me technical skills. It gave me the experience of walking into an unfamiliar organisation, understanding its dynamics quickly, identifying what mattered, and communicating conclusions to people who didn’t want to hear them.

Those capabilities—learning how to learn, thinking through ambiguity, communicating judgment—those are more valuable now than they’ve ever been. Because they’re exactly what you need to steer AI effectively.

The knowledge I accumulated? The technical procedures I memorised? Those were the what. And the what is being commoditised in real time. But the how—how to think, how to learn, how to exercise judgment—that’s the part that compounds.

The Training Work Is Disappearing

The Journal of Accountancy published something this week that crystallises the problem: “How will accountants learn new skills when AI does the work?”

Carl Mayes, AICPA’s VP of CPA Candidate Quality, described how vouching—examining transactions against supporting evidence—used to be one of the first things a new auditor learned. You’d spend hours matching invoices to purchase orders. Tedious? Yes. But in the process, you developed an intuition. You learned what felt wrong. You learned which questions to ask. You built judgment through repetition.

AI tools now do that automatically.

Which is great for efficiency. And terrible for development.

Mayes put it bluntly: the profession must pivot from “doing” to “supervising.” But here’s the paradox—supervising requires understanding what you’re supervising. And if you never did the work yourself, how do you know when the machine got it wrong?

This is the question that should keep every professional educator awake at night. The entry-level work that used to be the training ground—the grunt work, the repetitive tasks, the “pay your dues” phase of every knowledge career—is being automated. And with it, the mechanism through which professionals developed judgment is disappearing.

From Doing to Steering

So what replaces it?

David Wood, a professor at Brigham Young University, is building answers. He described 3D simulations where students walk through a virtual warehouse, open boxes, talk to AI-simulated employees, and look for control failures. Virtual vouching. Simulated professional experience.

Interesting. But the part that really got me was his second idea: having students teach a deliberately ignorant AI until it can pass an exam.

“Students don’t remember a lot from sitting and listening to a lecture,” Wood said. “But if they teach somebody else, they remember a ton. So, we thought, what if we had them mentor an AI?”

Instead of reading and testing, students train an AI. They feed it information. They evaluate its responses. They correct its mistakes. They iterate until the AI demonstrates competence.

Think about what that exercise actually teaches. It’s not teaching you accounting rules—the AI can look those up. It’s teaching you how to steer AI toward a correct outcome. How to recognise when the machine has gone wrong. How to provide the right context, the right constraints, the right evaluation criteria.

That’s not a study technique. That’s the job description for the next decade.

What an Audit Student Actually Needs to Learn

If we accept that AI will handle most of the output—and at this point, that’s not a prediction, it’s a description of what’s already happening—then the question shifts.

An audit student today doesn’t primarily need to learn how to write a management letter. They need to learn how to engage an AI agent to write the management letter. They need to understand what a good management letter looks like, what inputs the AI needs, where the AI is likely to cut corners or miss nuance, and how to evaluate whether the output meets professional standards.

They need to know what tools exist. What plugins to use. How to chain agents together. When to use a reasoning model versus a fast model. When to trust the output and when to push back.

They need to understand prompt engineering—not as a buzzword, but as a practical skill. Giving an AI the right context, the right constraints, the right examples is the difference between getting a generic output and getting something genuinely useful. I wrote about this in Your First AI Week: the gap between “AI is underwhelming” and “AI just changed my workflow” is almost always how you use it.
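To make “right context, right constraints, right examples” concrete, here is a minimal sketch in Python. Everything in it is hypothetical—the helper name, the section labels, and the management-letter scenario are mine, not from any specific tool or standard—but it shows the scaffolding that separates a generic request from a useful one.

```python
# Illustrative sketch only: a helper that assembles a structured prompt.
# The section names and the management-letter scenario are hypothetical.

def build_prompt(task: str, context: str, constraints: list[str], examples: list[str]) -> str:
    """Assemble a prompt with explicit task, context, constraints, and examples."""
    parts = [
        f"Task: {task}",
        f"Context: {context}",
        "Constraints:",
        *[f"- {c}" for c in constraints],
        "Examples of the expected style:",
        *[f"- {e}" for e in examples],
    ]
    return "\n".join(parts)

prompt = build_prompt(
    task="Draft a management letter point on inventory controls.",
    context="Mid-sized manufacturer; three monthly cycle counts missed in Q3; no root-cause analysis performed.",
    constraints=[
        "Use the condition-criteria-cause-effect-recommendation structure.",
        "Keep it under 200 words.",
        "Do not speculate beyond the facts provided.",
    ],
    examples=[
        "Condition: Three of twelve monthly cycle counts were not performed during the year...",
    ],
)
print(prompt)
```

The point of the structure is that every section does work: the context stops the model from inventing facts, the constraints define what “meets professional standards” means here, and the example anchors the tone. Leave any of them out and you get the generic output people complain about.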

This means white-collar education needs to become radically more practical. Not practical in the old sense—more case studies, more simulations. Practical in a new sense: students need to build things. They need to work with the tools. They need to fail, iterate, and develop instincts that only come from hands-on experience.

The irony is striking. For decades, professional education has moved toward the theoretical—more frameworks, more standards, more exam preparation. The future demands the opposite. More building. More experimenting. More thinking through the tools rather than about the tools.

The Compounding Effect

Here’s where the math gets uncomfortable.

The people who are investing now—investing time, investing money, investing in the best available tools—are compounding their experience. Every week of genuine AI experimentation teaches you something. Every problem you attack with AI sharpens your instinct for what works. Every failure teaches you where the boundaries are.

After a month, you’ve built intuitions that no course can teach. After six months, you have a personal methodology. After a year, you’re operating in a fundamentally different way than someone who’s still doing last year’s job with last year’s tools.

That’s compound interest. And it works the same way as financial compound interest: the earlier you start, the more dramatic the gap becomes.
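To put rough numbers on the analogy—and the rate here is an invented assumption, chosen only to show the shape of the curve—suppose deliberate practice makes you 2% more effective each week than someone standing still:

```python
# Illustrative arithmetic only: the 2%-per-week improvement rate is an
# assumed figure for the sake of the analogy, not a measured one.

def compounded(rate_per_week: float, weeks: int) -> float:
    """Relative capability after compounding a weekly improvement rate."""
    return (1 + rate_per_week) ** weeks

for weeks in (4, 26, 52):
    print(f"{weeks} weeks at 2%/week -> {compounded(0.02, weeks):.2f}x")
```

After a month the gap is barely visible; after a year it is roughly 2.8x. The exact rate is made up, but the shape is the point: small weekly gains look like nothing until, suddenly, they look like everything.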

The people who are waiting—who are risk-averse, who are constrained by the tools their employer provides, who think they’ll “get around to it” when things settle down—are falling behind at an accelerating rate. Not because they’re less capable. Because they’re not accumulating the reps.

I keep coming back to the conference. A room full of smart, experienced professionals. Most of them limited to Copilot. Most of them forming their understanding of AI through a tool that, by all accounts, isn’t even the best option in its own maker’s portfolio.

That’s not their fault. But it is their problem.

If your employer’s approved AI tool is holding you back, supplement it. Get your own Claude subscription. Get ChatGPT Plus. Try Gemini Advanced. The paid tiers cost less than a monthly Netflix subscription, and the capability gap between paid and free is enormous. The gap between “what IT approved” and “what’s actually best” is often even larger.

I’m not saying ignore your employer’s security policies. I’m saying: don’t let the tools you’ve been given define the limits of what you think is possible. Learn on your own time with your own subscriptions if you have to. The professionals who understand what best-in-class AI can actually do will be the ones making the decisions about which tools the firm adopts next.

The Education Race

China is already building AI-native universities—institutions designed from the ground up around AI-assisted learning. Not traditional universities that added an AI module. Institutions where AI is the medium, not just the subject.

I don’t know the exact status in Europe. But I have a strong sense that we’re lagging. The traditional university model—lectures, exams, case studies—was designed for a world where the bottleneck was access to knowledge. The professor knew things you didn’t, and the degree certified that you’d absorbed enough of that knowledge to be professionally competent.

That bottleneck is gone. Knowledge is abundant. The new bottleneck is the ability to use knowledge—to synthesise, to judge, to steer AI toward outcomes that meet professional standards. And our educational institutions haven’t caught up.

The same is true in corporate training. Most professional development programmes are still structured around “learning about AI” rather than “learning with AI.” Webinars about what AI can do. Presentations about the future of the profession. Panel discussions about the implications.

That’s important. But it’s not sufficient. You don’t learn to swim by attending a seminar on hydrodynamics. You learn to swim by getting in the water.

The Business Parallel

Everything I’ve said about individual professionals applies to firms and organisations, too.

The companies investing heavily in AI right now—investing in tools, in training, in time for their people to experiment—are building a competitive advantage that will be extremely difficult to replicate later. Not because the technology is secret. Because the experience is what matters, and experience takes time.

A firm that’s been running AI-augmented audits for a year has learned things about quality control, client communication, pricing, and workflow design that a firm just starting out hasn’t. Those lessons aren’t in a manual. They’re embedded in the organisation’s muscle memory. In its people’s instincts. In its processes and playbooks.

The firms that are being risk-averse—waiting for the standards to be written, waiting for the perfect tool, waiting until AI is “mature enough”—will find that when they finally start, the leaders have lapped them. Not once. Multiple times.

This is the same compound interest effect. Early investment creates learning. Learning creates capability. Capability creates confidence. Confidence creates more investment. It’s a flywheel. And it’s spinning faster for some organisations than others.

So What Do You Do This Week?

If you’re reading this and feeling the gap—between where you are and where you sense you should be—here are three things that will move the needle.

Get better tools. If all you have is what IT gave you, get your own subscriptions to the leading AI models. Not as a replacement for your employer’s approved tools—as a benchmark. Understand what best-in-class looks like so you can evaluate what you’re working with and advocate for better.

Build something. Don’t learn about AI. Build with AI. Take a real work problem—the reconciliation you dread, the report you assemble quarterly, the background research for your next engagement—and try to solve it with AI. Not once. Five times, with different approaches, different tools, different levels of context. The reps are what matter.

Teach the machine. Try Wood’s technique yourself. Take a topic you know well and try to teach it to Claude or ChatGPT until the model can explain it back to you correctly. You’ll discover gaps in your own understanding. You’ll develop a sense for where AI gets confused and where it excels. And you’ll be practising the exact skill that professional work is becoming: steering AI toward correct outcomes.

The people who do this—who invest the time now, who push past the constraints, who build real fluency through practice—will be the ones shaping the future of their profession. Not reacting to it. Shaping it.

The compound interest is real. And the clock started a while ago.

The Series So Far

Six weeks ago, I had my Oh Fuck moment—the personal realisation.

Five weeks ago, I showed you the orchestrator—and connected it to the SaaSpocalypse.

Four weeks ago, I argued that your auditee doesn’t need you anymore—the knowledge advantage is collapsing.

Three weeks ago, I showed that the fees are falling—the economics of professional services are being rewritten.

Two weeks ago, I gave you your first AI week—a practical method to start building fluency.

This week: the education and training model is broken. The work that used to teach you judgment is disappearing. The only way forward is to invest—in tools, in practice, in learning to steer the machines. The compound interest starts now.


If you’re rethinking how you learn, how you train your team, or how your organisation develops AI fluency—I want to hear from you. Get in touch.

Sources and further reading: