The Diploma Doesn’t Cover the Gap
Colleges are doing their job. Businesses have changed theirs. And the graduates caught in between are inheriting a labor market that no longer needs what they were trained to deliver.
I’ve written before about what AI means for K–12 education — the anxiety in the room when teachers realize their students are already living in a world the curriculum hasn’t caught up to yet.
Higher ed has a different problem. And in some ways, a significantly harder one.
Because the issue isn’t that colleges are failing. Most are doing exactly what they’ve always done — and doing it well. Critical thinking. Research. Writing. Analysis. Communication. The fundamentals of an educated mind.
The issue is that the world those graduates are stepping into has fundamentally changed what it needs from them. And the gap between what a diploma signals and what a business now demands has never been wider.
That gap has a name. And it’s worth being honest about what’s living inside it.
The Jobs Are Gone. Not Shrinking. Gone.
For decades, the implicit deal between colleges and employers was straightforward. Businesses hired recent graduates for task-driven knowledge work — the kind that required literacy, attention to detail, and the ability to follow a process. Research. Filing. Drafting. Reviewing. Organizing. Formatting. Structuring. The unglamorous but essential connective tissue of how organizations operated.
Those jobs were the on-ramp. The place where new graduates learned the business, built context, proved themselves, and eventually moved up.
AI ate the on-ramp.
Not gradually. Not partially. Completely. Researching a topic, drafting a first document, building a summary deck, formatting a report, organizing a database — all of it is now faster, cheaper, and increasingly more reliable when delegated to AI than when assigned to a 22-year-old three months out of school.
This is not a trend. It is not a disruption that will stabilize. It is a permanent restructuring of what entry-level knowledge work looks like — and what it requires.
The floor didn’t just rise. It shifted entirely.
What Businesses Are Actually Asking For
Here’s what I hear when I talk to business leaders — and I talk to a lot of them.
They’re not struggling to find people who can complete tasks. They have AI for that. What they cannot find — and cannot train fast enough — are people who know what to do before the task begins. People who can look at a messy, ambiguous problem, decide what question is actually worth asking, direct AI to help answer it, evaluate what comes back with genuine skepticism, and communicate a decision with clarity and accountability.
That’s not an AI skill. That’s a human skill that AI has made non-negotiable.
The irony is sharp: AI has made the outputs of knowledge work cheaper and faster — while simultaneously raising the bar for the judgment behind them. The work product is easier to produce. The discernment required to know whether it’s right has never been more valuable.
And colleges — through no fault of their own — have not been asked to teach that discernment in the context of an AI-driven business environment. Because… until now, there was no urgency to.
The Real Curriculum Gap
Let me be precise about what I’m not saying.
I’m not saying colleges need to become vocational schools. I’m not saying the liberal arts are obsolete or that the fundamentals need to be replaced. If anything, I strongly believe the opposite is true — the skills that a rigorous college education builds are exactly what the AI economy most desperately needs.
Critical thinking. Synthesis. Written precision. The ability to argue from evidence and know when you’re wrong.
The gap isn’t in the skills. It’s in the scaffolding.
Scaffolding — in the architectural sense — is a temporary structure that lets workers reach places they couldn’t otherwise access. It’s not the building. It’s the bridge between where you are and where you need to be.
That’s what’s missing for college graduates entering the workforce today. The raw materials are there. The foundational skills are real. What doesn’t exist is any structured preparation for how to apply those skills in an environment where AI is changing how organizations think, operate, communicate, and make decisions.
Knowing how to write a strong argumentative essay is excellent preparation for the workforce. Knowing how to evaluate whether an AI-generated business case contains a flawed assumption — that’s the skill that gets you in the room.
Colleges are building the structural foundation. Without the scaffolding, grads are left on the ground floor.
What Scaffolding Actually Looks Like
This isn’t about teaching students to use ChatGPT. Every student already knows how to use ChatGPT. The tool fluency isn’t the problem.
The problem is that no one has taught them how to think alongside these tools in a professional context — how to frame the problem before they prompt, how to interrogate the output before they trust it, how to own the judgment call when the machine can’t make it for them.
Four skills sit at the core of that scaffolding:
Problem framing — the ability to define what actually needs solving before handing it to a machine. AI is extraordinary at answering questions. It is far less capable of figuring out which question is worth asking. That still requires a human to guide the process.
Critical evaluation — the discipline to interrogate AI output rather than accept it. To notice when something sounds authoritative but contains a subtle error. To know when the confidence of the answer should make you more suspicious, not less.
Synthesis over retrieval — the capacity to connect ideas across disciplines and arrive at an original insight. AI retrieves. It aggregates. It can even summarize well. AI is remarkably good at making connections across sources. What it is less equipped to do on its own is extend them — to shift the frame, force the alternative, or see the problem from a fundamentally different angle. That kind of synthesis tends to emerge from human interrogation: pushing back on the output, reframing the question, bringing context the model wasn’t given.
Precision communication — the ability to direct AI clearly and communicate judgment calls to stakeholders with accountability. In a world where words are now instructions — where how you prompt determines what you get — the ability to say exactly what you mean has never been more consequential.
These are not new skills. They are the skills a rigorous higher education has always built. What’s new is the context in which they must operate — and the urgency with which they must be applied.
The Unexpected Upside
Here’s what makes this genuinely exciting, rather than just alarming.
The graduates who develop these skills won’t just be better employees. They’ll be the people those organizations most need right now, because — contrary to what we hear in the media (shocking, I know) — most businesses haven’t figured out how to deploy AI effectively yet either.
The entry-level knowledge worker who understands how to evaluate where AI can add value, how to identify which processes are ripe for automation, and how to communicate that case to leadership — that person isn’t competing with AI. That person is running it. And they’re worth far more than the task-executor they replaced.
The on-ramp didn’t disappear. It was rebuilt one level higher.
Which means colleges have a genuine opportunity here — not to chase a technology trend, but to do what they’ve always done best: prepare students to navigate a world that is more complex and more demanding than the one that came before it.
The skills are already in the building. The scaffolding just needs to be added.
What I’m Calling For
I don’t believe the answer is a new AI elective buried in the back of the course catalog. That’s a hedge, not a strategy.
What I’m proposing is a curriculum thread — a deliberate, discipline-spanning layer that teaches students how to apply the foundational skills of a college education in an AI-driven professional environment. Not instead of what’s already being taught. On top of it. Scaffolding on the building that already exists.
This thread could begin in freshman writing — teaching students to interrogate AI output the same way they’re taught to interrogate a primary source. It could continue through discipline-specific coursework, where students are graded on the judgment behind their use of AI, not the output it produced. It could culminate in a senior capstone that asks students to run a real project with AI as a working resource — directing it, evaluating it, and standing behind every decision it supported.
The liberal arts college that does this won’t just be producing graduates who can survive the AI economy. It will be producing the people who shape it.
That is exactly what these institutions have always been for.
The gap between what a diploma delivers and what businesses now demand is real — and it’s widening. But it’s not permanent. The skills are already there. What higher education can provide now is the scaffolding to put them to use. That’s not a disruption to the mission of a college education. It’s the next chapter of it.