Isaac Roach

CS Students: AI Tools Aren't Optional Anymore

The industry has shifted faster than most curricula have. Here's what that means for students graduating into a world where your AI fluency matters as much as your algorithms course.


I talk to a lot of new grad engineers. Some of them are shipping production code with Claude or Copilot in their workflow from day one. Others — equally smart, equally well-credentialed — are treating AI tools as a shortcut to avoid, something that might atrophy their "real" skills. The gap between those two groups is already visible in their output. In two years, I think it'll be hard to miss.

This isn't a post about AI hype. It's about a practical shift in what the job of a software engineer actually looks like right now, and what it means to be prepared for it.

The job has changed, even if the job posting hasn't

Most CS job postings still list the same things they did in 2019: Python, system design, data structures, cloud. Those fundamentals matter — I'll come back to that. But the day-to-day work on most engineering teams has changed significantly. Developers are writing less raw boilerplate and spending more time directing, reviewing, and integrating AI-generated code. The bottleneck has moved from "can you write this?" to "can you evaluate whether this is correct, secure, and maintainable?"

That's a different cognitive task. And it requires enough fluency with the tools to know when they're hallucinating a plausible-looking but wrong API call, or when they've generated code that's technically functional but silently ignores edge cases.

If you've never used these tools seriously, you don't have that calibration. That's not a character flaw — it's just a skill gap, and it's a closeable one.

The tools worth knowing right now

The landscape is moving fast enough that any specific tool ranking would date this post quickly. The durable investment isn't in any one tool — it's in the skills underneath them.

The most important meta-skill here is learning how to prompt well. Clear, specific, context-rich prompts get dramatically better results than vague ones. That sounds obvious, but it takes real practice to internalize — and most people don't practice it systematically.

What "left behind" actually looks like

I want to be direct about the risk here, because I've seen it framed too softly. Engineers who don't develop AI fluency aren't going to be fired immediately — most of them are good engineers, and that still counts for a lot. But they will be slower. They'll spend hours on tasks their peers can knock out in thirty minutes. They'll struggle to evaluate whether the AI-generated code on their team is any good. They'll miss the shortcuts in context gathering and debugging that others use by default.

Over time that velocity gap compounds. It shows up in performance reviews, in what projects people get assigned, in who gets promoted. The engineers who thrive aren't the ones who use AI the most indiscriminately — they're the ones who've developed the judgment to know when and how to use it well.

The fundamentals argument (it's not wrong, but it's incomplete)

A lot of CS professors push back on AI tools with some version of: "If you don't learn to write the code yourself, you won't understand what the AI is generating." That's a real concern and I don't fully dismiss it. Strong fundamentals — data structures, algorithms, operating systems, networking — give you the mental models to evaluate AI output critically. You need to understand why an O(n²) solution is a problem before you can recognize that the AI just handed you one.
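To make that concrete, here's a hypothetical example of the kind of draft a model might hand you — correct output, quadratic runtime — and the fix that a reviewer with solid fundamentals would make. (Both functions are illustrative, not from any real tool's output.)

```python
# A plausible AI-generated helper: produces the right answer, but is
# O(n^2), because `seen` is a list and `in` scans it on every iteration.
def dedupe_slow(items):
    seen = []
    for item in items:
        if item not in seen:  # O(n) scan per element
            seen.append(item)
    return seen


# The fix fundamentals suggest: track seen elements in a set, turning
# each membership check into an average O(1) operation.
def dedupe_fast(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out
```

Both return the same result; on a list of a million items, only one of them finishes before you get bored. Spotting that difference in a review is exactly what the fundamentals buy you.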

But "learn the fundamentals" and "learn to use AI tools" aren't in tension. The engineers I most respect use both: they have strong foundations and they're fluent with the tools. The framing where you have to choose is a false one, and it's increasingly an excuse for curricula that haven't caught up.

Use AI tools as a learning accelerator, not a replacement for understanding. When you generate code, read it. When something is unfamiliar, ask the model to explain it. When you're not sure something is correct, verify it. Treat the output as a first draft from a capable but sometimes unreliable collaborator, not a finished product.
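One concrete habit, sketched here with a made-up example: before trusting generated code, throw the obvious edge cases at it. The function below is the kind of happy-path first draft a model might produce, and the check takes thirty seconds to write.

```python
# Hypothetical AI-generated first draft: works on the happy path,
# but silently breaks on an empty input.
def average(values):
    return sum(values) / len(values)  # ZeroDivisionError when values == []


# A quick verification pass: confirm the happy path, then probe the
# edge case the draft never mentioned.
def check(fn):
    assert fn([2, 4, 6]) == 4  # happy path
    try:
        fn([])  # the edge case
        return "handles empty input"
    except ZeroDivisionError:
        return "fails on empty input"
```

Running `check(average)` flags the empty-input failure immediately. The point isn't this particular bug — it's the reflex of verifying a draft before it earns your trust.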

Some practical advice if you're in school right now

The students who will enter the workforce in the strongest position aren't the ones who treated AI as a threat to their development, or who used it as a crutch to skip the hard parts. They're the ones who got genuinely curious, built real things, and developed honest judgment about what these tools are and aren't good for.

That's not a high bar. It mostly just requires taking it seriously.
