Where the Jobs Actually Are


The headline version of 2025 is simple: AI is taking jobs. And for certain roles — the ones built on translating requirements into syntax, summarizing documents, writing boilerplate — that's true. I won't sugarcoat it.

But here's what the doom headlines miss: every wave of automation creates gaps. Not just job losses — skill gaps. Roles that didn't exist 2 years ago are now the hardest to fill. I talk to hiring managers across startups and enterprises, and the pattern is consistent. They're not struggling to find people who can code. They're struggling to find people who can do the things AI can't.

Five roles keep coming up.


The Systems Architect

The "full stack developer" job description is evolving. Companies don't just want a website anymore. They want autonomous systems — customer support that handles 80% of tickets without human intervention, inventory management that reorders based on demand signals, compliance workflows that flag issues in real time.

Building these requires someone who can design how multiple AI agents work together. Which small model handles triage. How they hand off to each other. Where humans intervene. What happens when the agent fails.

AI can generate code for each piece. It cannot design the system that connects them. That requires the kind of systems thinking that comes from years of building and breaking real software. Every serious engineering team I work with is hiring for this.
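To make the design questions above concrete, here is a minimal sketch of that kind of routing logic. The `classify`, `run_agent`, and `handle_ticket` names are illustrative stand-ins, not any real framework's API; the point is that the branching decisions, not the individual pieces, are where the architect earns their pay.

```python
# A toy version of the orchestration decisions a systems architect makes.
# classify() and run_agent() are hypothetical stand-ins for real components.

def classify(ticket: str) -> str:
    """Triage: a cheap check decides which specialist agent gets the ticket."""
    text = ticket.lower()
    if "refund" in text:
        return "billing"
    if "crash" in text:
        return "technical"
    return "unknown"

def run_agent(category: str, ticket: str) -> str:
    """Hypothetical specialist agent; a real one could fail or time out."""
    return f"{category} agent resolved: {ticket}"

def handle_ticket(ticket: str) -> str:
    """Route the ticket, hand off, and fall back to a human on failure."""
    category = classify(ticket)
    if category == "unknown":
        return "escalate_to_human"          # where humans intervene
    try:
        return run_agent(category, ticket)  # the hand-off between agents
    except Exception:
        return "escalate_to_human"          # what happens when the agent fails
```

Every branch in `handle_ticket` is a design decision AI won't make for you: the triage threshold, the escalation path, the failure fallback.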


The Safety and Compliance Engineer

As AI gets deployed in healthcare, banking, and government, the liability question gets real. Biased outputs in a loan decision. Hallucinated dosage in a medical tool. A black-box recommendation that nobody can explain to a regulator.

Someone has to audit these systems. Set up guardrails. Test for failure modes the model's creators didn't anticipate. Ensure compliance with AI governance frameworks that are being written right now in the EU, India, and the US.
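One small layer of that work can be sketched in code: a deterministic output filter that flags risky responses for human review before they reach a customer. The phrase list and `check_output` function here are illustrative assumptions; real compliance pipelines stack many such checks alongside bias tests and audit logs.

```python
# A toy guardrail: flag model outputs a compliance reviewer must see.
# FLAGGED_PHRASES and check_output() are illustrative, not a real ruleset.

FLAGGED_PHRASES = ["guaranteed approval", "no risk", "cannot fail"]

def check_output(model_output: str) -> dict:
    """Return a verdict a reviewer can act on, with the reasons logged."""
    hits = [p for p in FLAGGED_PHRASES if p in model_output.lower()]
    return {"allowed": not hits, "reasons": hits}
```

Note that the check is boring, auditable, and explainable to a regulator, which is exactly why it sits outside the model rather than inside it.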

You can't ask the model to police itself. This is a human role, and it requires both technical depth and domain-specific legal understanding. Very few people have both, which is exactly why the role pays well. I saw this play out firsthand when we worked on AI in emergency dispatch — the safety architecture was harder to build than the AI itself.


The Product Thinker

AI is a fast builder. Give it a spec and it'll produce a feature in minutes. But it has zero insight into whether anyone wants that feature.

Product thinking — understanding user pain points, reading between the lines of what people say they want vs. what they actually need, finding product-market fit — is irreducibly human. The best product managers I know spend 90% of their time with users, not with code. They translate human messiness into clear goals. Then they point AI-enabled teams at those goals.

In a world where building is cheap, knowing what to build is the expensive skill.


The Data Specialist

Every model is only as good as its training data. The easy internet data has been scraped. The next frontier is high-quality, domain-specific datasets — legal precedents, medical imaging annotations, supply chain logistics, financial compliance records.

Curating these datasets requires deep domain expertise. You need to know what "correct" looks like in cardiology or contract law before you can label training data for it. AI-generated synthetic data has a well-documented failure mode: model collapse, where models trained on AI-generated data degrade over generations. Humans provide the ground truth.

If you have domain expertise in medicine, law, finance, or manufacturing — and you learn how data pipelines work — you're in a category of one.


The Creative Editor

AI generates content at scale. Blog posts, images, marketing copy, video scripts — the volume is effectively unlimited. The result is a flood of competent but indistinguishable output.

The role that emerged in 2025 isn't "content creator." It's creative director of AI output. Taking the 80% that AI produces and adding the 20% that makes it distinctive. Brand voice. Editorial judgment. The instinct for what resonates with a specific audience. Taste.

AI can follow a style guide. It can't create one. It can't tell you why one headline works and another doesn't. That judgment compounds with experience, and it's the one thing you can't prompt your way to.


The Common Thread

None of these roles are about writing code faster. They're about judgment — systems judgment, safety judgment, product judgment, domain judgment, creative judgment.

The market isn't shrinking. It's filtering. The roles being eliminated are the ones where the output was predictable and the judgment was minimal. The roles being created require exactly the opposite: deep expertise, contextual understanding, and the ability to make calls that AI can't.

Average output is approaching free. Specialized judgment has never been more valuable. Figure out where your judgment lives, and double down.


This connects to the broader career shift I've been writing about — the old playbook is broken, but the new one rewards depth over breadth. And software understanding underpins almost every one of these roles.