AI LABOR CULTURE
Who Fired You
Apr 14, 2026
Sarah Perez published a piece in TechCrunch on April 13, 2026 [1], summarizing Stanford's annual AI report [2]. The headline finding is a widening gap between what AI experts believe about the technology and what everyone else believes [3]. The numbers are striking. Seventy-three percent of experts think AI will have a positive impact on how people do their jobs; twenty-three percent of the public agrees. On the economy, sixty-nine percent of experts are positive; twenty-one percent of the public is. Nearly two-thirds of Americans expect AI to lead to fewer jobs over the next twenty years.
Perez treats the divergence as a puzzle. Why are regular people so negative when the people who know the technology best are so optimistic? The implied answer, never quite stated, is that the public doesn't understand what's coming.
The experts are looking at what AI could do. The public is looking at what AI is being used to do to them.
Imagine a version of this moment in which the productivity gains from AI flowed to the people whose work is being automated. Shorter weeks at the same pay. Higher wages in the sectors where AI handles the drudgery. More security, not less. In that universe, the Stanford numbers flip. The public becomes the enthusiasts, and the experts become the nervous ones, worrying about transition costs and adjustment frictions. Nobody in that universe is burning warehouses over wages.
The technology is the same in both universes. The models, the capabilities, the displacement of specific tasks — all identical. What differs is who captures the gains.
Are we so thoroughly conditioned not to question how capitalism distributes wealth that we can no longer picture the alternative? Suggest out loud that the gains should flow to the people doing the work, and the labels arrive before the argument does. Communist. Socialist. Naive. That's how you know you've touched something.
I grew up in the Ruhr Valley. My father worked in the coal mines. When the mines closed, the people who lost their jobs knew exactly who to be angry at. They had names. Krupp. Thyssen. The ministers in Bonn who signed the deals. The board members who decided which shafts to shut and when. The adversary was a person, or a small number of people, with an address, a phone number, and a face that appeared in the newspaper. You could picket the headquarters. You could vote against the party. You could, at minimum, say a name out loud and have everyone around you understand who you meant.
That vocabulary took generations of labor organizing to build. Workers learned to say "the owners" and "the bosses" to refer to specific people who made specific decisions. When the mines closed, the closures were a catastrophe, but a legible one. Someone did this. Someone could be named.
Now read the sentences being written about AI and the job market. "AI is taking jobs." "AI is replacing workers." "AI is coming for the white-collar middle class." The subject of every sentence is the technology. Not OpenAI. Not the CEO who approved the layoffs. Not the board that set the headcount targets. Not the investors who demanded the margin improvements. AI, the noun, as if the models walked into the office on their own, fired the staff, and went home.
"Mistakes were made." "The building caught fire." "AI is taking jobs."
My father's generation lost their livelihoods and kept their vocabulary. Krupp. Thyssen. The ministers in Bonn.
AI is not taking jobs. The people who own it are.
Sources
[1] Sarah Perez, "Stanford report highlights growing disconnect between AI insiders and everyone else," TechCrunch, April 13, 2026.
[2] Stanford HAI, "The 2026 AI Index Report."
[3] "Stanford AI Index Reveals Experts, Public Worlds Apart on AI," TechBuzz, April 2026.