TL;DR: The tech layoffs dominating the news are not temporary economic blips; they are the systemic slaughter of the standardized middle class. For decades, upward mobility was guaranteed by acquiring "standardized skills" (basic coding, accounting, legal review). But AI is a hyper-efficient standardization engine. Software development, once a massive middle-class employer, is being gutted as LLMs write and review their own code. As AI creates an infinite oversupply of average information and standard labor, the only way to survive is to abandon standardization and cultivate the three un-computable human traits: Taste, Trust, and the "Weird."
I spent last week scrolling through LinkedIn at 2 AM, watching the layoff posts stack up like planes in a holding pattern. Senior engineers with fifteen years at Google. Product managers from Meta. Data analysts with perfect pedigrees. The comments were all the same—shock, denial, "how could this happen to me?"
And I kept thinking: this isn't a recession. This is something else entirely.
The Factory is Closing
My first real job was at a software consultancy in Hong Kong where we basically did digital assembly line work. An architect would draw the blueprint, then twenty of us would write the code line by line—CRUD apps, database migrations, API integrations. It was standardized enough that we had checklists. Standardized enough that we hired fresh graduates and had them productive in two weeks.
That model is dying in real time.
I watched a CTO demo last month where he described his new workflow: the senior architect still draws the blueprint, but now the "coding" is just… prompt engineering. The LLM writes it, tests it, debugs it. The middle layer—the people who used to translate designs into syntax—is simply gone. Not "reskilled." Not "transitioned." Just obsolete.
The Indian IT outsourcing giants are already feeling it. The coding bootcamp graduates who mortgaged their futures for Java certificates. The mid-level developers who thought "full stack" meant job security. They're all competing with a tool that doesn't sleep, doesn't ask for raises, and generates infinite mediocre code instantly.
And software is just the beginning. I have friends in accounting firms watching AI review contracts faster than their first-year associates. Friends in legal teams seeing document discovery handled by algorithms. The standardized middle—the people who learned a skill that could be written down in a textbook—is evaporating.
The Teacup Problem
But here's where it gets weird. I was in London a few years ago, walking through Mayfair, and I popped into this ceramics shop. Teacups for £2,000. Not antique—new. Made yesterday in the same Staffordshire factories that churn out £5 mugs for Tesco.
I asked the owner how he justified the price. He just smiled and said, "Industrial efficiency made the cup cheap. It didn't make the choice cheap."
That's the whole game now. AI is flooding the market with infinite "B-minus" work—infinite competent blog posts, decent code, average designs. Which means the value isn't in production anymore. It's in curation. In taste. In the ability to look at ten thousand AI-generated options and know which three actually matter.
I used to bill hours for building things. Now I bill for deciding—for having the judgment to say "this AI output is good enough to ship" or "this needs the human touch" or "actually, the client doesn't need this feature at all despite what the spec says."
The Flaw Premium
I had coffee with a friend who runs algorithm strategy at a major streaming platform. He told me something that stuck: "The algorithm doesn't actually want perfection. It wants gravity—the thing that makes someone stay."
We gravitate toward people because they're inconsistent. Because they change their minds. Because they get angry about weird things and have inexplicable obsessions. An AI will never write a newsletter, have a crisis of confidence, pivot hard, and accidentally stumble into something brilliant while trying to fix the mess it made. But humans do that constantly.
If you sanitize your work—if you use AI to smooth out all the edges, make every sentence balanced, every opinion defensible—you become invisible. You become part of the background noise. The people who are thriving right now are the ones leaning into their specific weirdness, their non-replicable biases, their messy human scent.
The Split
I'm watching the market separate into two distinct species. On one side: the architects, the taste-makers, the people who design systems or curate experiences or provide emotional context that can't be documented. On the other side: a vast sea of commoditized labor competing with machines that get cheaper every quarter.
There's no "middle" anymore. You're either the person who owns the judgment call, or you're competing with software that doesn't need health insurance.
I don't think this ends with universal basic income or mass riots. I think it ends with a lot of people doing something else—becoming "human companions" for small communities, niche curators, physical craftspeople, or simply finding work that requires literal presence and emotional risk.
The Actual Question
So I've been asking myself—and I think anyone reading this should ask—what's the thing I can do that an AI can't? Not "what's my current job title," but what's the actual, specific, messy human problem I solve?
For me, it's not writing code anymore. It's sitting in a conference room in Tokyo, watching a client's shoulders tense when I mention their competitor, noticing they haven't touched their coffee, and realizing the real problem isn't technical—it's political. It's having the scar tissue to say "I've seen this exact failure pattern before at two other firms, and here's how it actually plays out."
That's not a standardizable skill. That's just… being a specific person, at a specific time, with specific damage.
In a world drowning in competent averages, your particular damage might be the only thing you have left to sell.
— James, somewhere between Hong Kong and Tokyo, March 2026