I get asked this question constantly. By junior devs worried about their first job. By people considering CS degrees. By non-technical managers who read a headline and are now wondering why they pay us so much.

Here is my honest answer, arrived at after 12 years of writing code and 18 months of working daily with AI tools: it's complicated, and anyone giving you a simple answer is either selling something or hasn't thought about it carefully enough.

What AI Is Actually Good At Right Now

Let me be direct about this because the AI-skeptics undersell how capable these tools already are. AI is genuinely excellent at:

  • Writing boilerplate and CRUD code
  • Translating between languages and frameworks
  • Writing unit tests for well-defined functions
  • Documentation generation
  • Explaining unfamiliar code
  • Simple bug fixes where the error message makes the cause obvious

This is a lot of what junior developers spend their time doing. I won't pretend otherwise.
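To make the first few bullets concrete, here is a sketch of the kind of work I mean — a minimal in-memory CRUD store plus a unit test for it. The names and design are purely illustrative, not from any real codebase; the point is that code this mechanical is exactly what these tools produce reliably.

```python
import unittest


class UserStore:
    """Minimal in-memory CRUD store — the kind of boilerplate AI writes well."""

    def __init__(self):
        self._users = {}
        self._next_id = 1

    def create(self, name):
        """Insert a user and return its generated id."""
        user_id = self._next_id
        self._users[user_id] = {"id": user_id, "name": name}
        self._next_id += 1
        return user_id

    def read(self, user_id):
        """Return the user dict, or None if it doesn't exist."""
        return self._users.get(user_id)

    def update(self, user_id, name):
        """Rename an existing user; raise KeyError if missing."""
        if user_id not in self._users:
            raise KeyError(user_id)
        self._users[user_id]["name"] = name

    def delete(self, user_id):
        """Remove a user; deleting a missing id is a no-op."""
        self._users.pop(user_id, None)


class TestUserStore(unittest.TestCase):
    """Equally mechanical: unit tests against a well-defined interface."""

    def test_round_trip(self):
        store = UserStore()
        uid = store.create("Ada")
        self.assertEqual(store.read(uid)["name"], "Ada")
        store.update(uid, "Grace")
        self.assertEqual(store.read(uid)["name"], "Grace")
        store.delete(uid)
        self.assertIsNone(store.read(uid))
```

Notice what's absent: there is no business context here, no ambiguity, no stakeholder to misunderstand. That's precisely why this category of work automates well.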

What AI Is Genuinely Bad At

The AI-doomers undersell the gaps too. These tools struggle significantly with:

  • Understanding business context — Why does this feature exist? What tradeoff was made when this was designed? What will break in six months if we change this?
  • Novel problem-solving — When the solution isn't in the training data because it involves a new combination of tools or an unusual constraint, the model often produces plausible-looking nonsense.
  • Systems thinking — Reasoning about the ripple effects of a change across a large, evolving codebase, including the human and organizational effects
  • Working with humans — Figuring out what a stakeholder actually wants vs. what they said they want
  • Long-horizon planning — How do we architect this system to support requirements we'll have in two years?

What I Actually Think Is Happening

The nature of the job is changing faster than the number of jobs is decreasing. Ten years ago, a lot of developer time went to writing boilerplate that frameworks then eliminated. Did the number of developer jobs collapse? No — it shifted what developers spent their time on.

AI is doing something similar but faster. The boilerplate is disappearing. The parts of the job that remain — and are becoming more important — are the parts that require judgment, communication, system design, and the ability to work effectively with other humans toward a goal.

This is a problem specifically for early-career developers. The tasks that used to teach junior devs the craft of programming — the CRUD endpoints, the simple bug fixes, the test writing — are now tasks that AI can do. The learning path is getting harder to navigate.

My Honest Advice

Learn to use AI tools well. Not because they'll replace you faster if you don't, but because developers who use them effectively are genuinely more productive now, and that gap will grow. You can't hide from the tools.

But also: deliberately work on the things AI isn't good at. Practice system design. Build your ability to communicate with non-technical stakeholders. Develop opinions about architecture. These skills are becoming more valuable as AI handles the routine parts of the job.

I'm not particularly worried about my career. I am genuinely concerned about the next generation of developers trying to learn the craft in an environment where a lot of the traditional learning opportunities are being automated away. That's the harder problem, and nobody is talking about it enough.