Just Because AI Can Generate Code Doesn't Mean It's Good Code

Published on January 23, 2026

There’s a fantasy floating around tech circles that AI is about to make software developers obsolete. The logic goes something like this: AI can write code now, therefore anyone can build software, therefore we don’t need programmers anymore. It’s a seductive idea if you’ve never actually shipped production software.

I’ve been using AI coding assistants daily for well over a year now. Claude, Copilot, Cursor, the works. And here’s what I’ve learned: AI is genuinely transformative for experienced developers. It’s also genuinely dangerous in the hands of people who don’t know what they’re looking at.

Let me explain.

The term “vibecoding” has entered the lexicon to describe the practice of just prompting an AI, accepting whatever code it spits out, and hoping for the best. You don’t really understand what it’s doing. You don’t review it properly. You just vibe with it. The code works (sometimes), and that’s good enough.

Except it’s not good enough. Working code and good code are not the same thing. Code that runs today might be a security nightmare. It might be unmaintainable. It might scale about as well as a paper boat in a tsunami. AI doesn’t know your system’s architecture. It doesn’t know your team’s conventions. It doesn’t know that the clever solution it just generated is going to make the next developer (probably future you) want to throw their laptop out the window.
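
To make that concrete, here's the sort of thing I mean. This is an invented sketch (the table and function names are mine, not from any real project), but it's representative: the first version runs, returns the right rows in a quick demo, and interpolates user input straight into a SQL string.

```python
import sqlite3

def get_user(conn: sqlite3.Connection, username: str):
    # "Works" in a quick test, but the f-string makes it injectable:
    # passing  "x' OR '1'='1"  returns every row in the table.
    query = f"SELECT id, email FROM users WHERE username = '{username}'"
    return conn.execute(query).fetchone()

def get_user_safely(conn: sqlite3.Connection, username: str):
    # Same behaviour on the happy path, but the driver handles quoting,
    # so the input can't rewrite the query.
    query = "SELECT id, email FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchone()
```

Both functions pass the same casual test. Only one of them belongs anywhere near production, and you have to know the difference to notice.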

When I use AI to write code, I’m not accepting it blindly. I’m reading every line. I’m asking myself whether this fits the patterns we use elsewhere in the codebase. I’m checking for edge cases the AI didn’t consider. I’m refactoring the output to match our style. The AI gives me a first draft, and I edit it into something I’d actually be proud to commit.

That’s the skill multiplier effect. For someone who already knows what good code looks like, AI is like having a junior developer who types at the speed of light and never gets tired. You still need to review their work. You still need to guide them. But they can handle the boilerplate, suggest approaches you hadn’t considered, and help you move faster on the tedious stuff.

For someone who doesn’t know what good code looks like? They’re just accepting that junior developer’s output without review. And that junior developer, while enthusiastic, has read a lot of Stack Overflow answers from 2019 and thinks that’s still best practice.

Here’s an analogy I keep coming back to. YouTube has been around for two decades now. You can find videos explaining literally anything. How to fix your toilet. How to replace brake pads. How to rewire a light switch. The knowledge is right there, free, with step-by-step instructions and helpful annotations.

And yet we still have plumbers. We still have mechanics. We still have electricians.

Why? Because watching someone do something and actually being able to do it yourself are wildly different skills. Because knowing which YouTube video applies to your specific situation requires expertise. Because understanding when the video’s advice will work and when it’ll flood your bathroom or set your house on fire takes years of experience.

The existence of freely available knowledge didn’t eliminate trades. It just meant that people who already had some aptitude could level up faster. A mechanically inclined person can now diagnose their car’s weird noise by watching videos and checking forums. They were always going to be decent at this stuff. YouTube just accelerated it.

AI coding tools are the same. They’re YouTube for programming, except instead of watching someone else do it, you get a robot to do the typing for you. The fundamental dynamic hasn’t changed. You still need to know what you’re doing to use these tools effectively.

I’ve seen the vibecoded projects. We all have at this point. They’re a particular kind of mess. The code technically works but it’s inconsistent. Three different patterns for the same problem in three different files. Security holes you could drive a truck through. No error handling because the AI assumed the happy path. Dependencies that haven’t been updated since 2023 because the AI’s training data is stale.
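
The happy-path problem in particular shows up constantly. Here's an invented but typical example (the endpoint and field names are made up for illustration): the first version is what an assistant tends to hand you, the second is the boring one you end up writing anyway.

```python
import json
from urllib.error import URLError
from urllib.request import urlopen

def fetch_price(url: str) -> float:
    # The happy-path version: assumes the request succeeds, the body is
    # valid JSON, and the "price" key is always present.
    data = json.load(urlopen(url))
    return data["price"]

def fetch_price_defensively(url: str) -> float | None:
    # Timeouts, non-200 responses, malformed JSON, and missing or
    # non-numeric fields become an explicit None instead of an
    # unhandled exception in production.
    try:
        with urlopen(url, timeout=5) as resp:
            if resp.status != 200:
                return None
            data = json.load(resp)
    except (URLError, TimeoutError, ValueError):
        return None
    try:
        return float(data.get("price"))
    except (TypeError, ValueError):
        return None
```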

And the people who wrote these projects genuinely believe they’ve become programmers. They built something, didn’t they? It runs, doesn’t it?

This is the Dunning-Kruger effect weaponised by technology. AI tools make it trivially easy to produce something that looks like competent output. They’ve lowered the floor for getting started while simultaneously creating the illusion that the ceiling doesn’t exist.

The ceiling very much exists. The gap between “it works on my machine” and “it’s production-ready, secure, maintainable, and scalable” is enormous. That gap is called software engineering, and no amount of prompting is going to bridge it if you don’t understand what’s on the other side.

I’m not saying AI coding tools are bad. I use them constantly. They’ve made me measurably more productive. When I’m working in an unfamiliar language or framework, they help me get up to speed faster. When I’m doing repetitive refactoring, they handle the grunt work. When I’m stuck on an approach, they suggest alternatives I might not have considered.

But I know what I’m looking at. I can tell when the AI has given me something clever versus something stupid. I can recognise when it’s hallucinated an API that doesn’t exist. I can spot the subtle bugs that would only manifest in production under specific conditions.
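
For a flavour of what "subtle" means here, take a bug class I've seen assistants produce more than once. The example below is invented, but the pattern is real: a mutable default argument looks fine in every unit test that calls the function once, and quietly leaks state between requests in a long-running process.

```python
def add_tag(tag: str, tags: list[str] = []) -> list[str]:
    # Looks harmless and passes a one-off test, but the default list is
    # created once at definition time, so every call that relies on the
    # default shares (and keeps appending to) the same list.
    tags.append(tag)
    return tags

def add_tag_fixed(tag: str, tags: list[str] | None = None) -> list[str]:
    # The standard fix: use None as the sentinel and build a fresh list
    # per call, so state can't bleed between callers.
    if tags is None:
        tags = []
    tags.append(tag)
    return tags

# add_tag("a") returns ["a"]; a later add_tag("b") returns ["a", "b"].
# The leak only shows up once the process has handled more than one call.
```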

That knowledge didn’t come from prompting. It came from years of writing bad code, shipping it, watching it break, and learning from the experience. There are no shortcuts to that. You can accelerate the learning process, sure. But you can’t skip it entirely and expect to produce professional-quality work.

So the next time someone tells you AI is going to replace developers, ask them this: did YouTube replace plumbers? Did Wikipedia replace teachers? Did Google replace librarians?

Tools don’t replace expertise. They amplify it. And if you don’t have expertise to amplify, you’re just making a mess faster than you could before.