AI has a tendency to make the Line Go Up. It also has a tendency to make Money Go Up.
But that doesn’t mean that everyone’s happy, and for good reason: The value capture around AI tends to be very, veeeery unevenly distributed. The value is captured almost exclusively at the top of the value chain, by a handful of companies.
All the rhetoric we’ve been hearing about how generative AI makes us more productive? It might, eventually, do that (or not, we’ll see). But consider what happened when washing machines and dishwashers were invented: they didn’t free up the time of those who did the laundry and dishes before (predominantly women), they just raised expectations of what else those people should be taking care of. In the end, these machines transformed the work to be done, but didn’t reduce it. With the increasing use of genAI, we’re following in the same footsteps.
The two groups of actors that truly benefit from generative AI so far are 1) a dozen or so big tech companies and 2) to a smaller degree, employers who might get some small efficiency gain out of their employees.
It’s the opposite of the (long-debunked) Trickle Down Economics of yesteryear: The financial value bubbles to the very top, and only trace amounts of it are captured anywhere below.
That’s a real problem: a further structural erosion of the middle class, such as it is.
There used to be an expression in tech and management circles: above/below the API. If you worked above the API, you’d be telling the computer what to do, meaning your job was well-paid and relatively safe. If you worked below the API, you were instructed by a computer, for example as a gig worker. Your job was badly paid and you were considered replaceable.
With AI, we see the same dynamic, only the cut-off between above and below may sit a few rungs higher up the corporate ladder.
If you work above the AI, you’re probably still doing well for now, though that depends on the type of work you do. Chances are that, increasingly, even middle managers and other white-collar workers will find themselves in situations where an AI tells them what to do. The implications should be obvious: If an AI tells you what to do, you immediately become more replaceable and your work becomes less valuable. What’s new is which groups of people this can now happen to.
So, a double whammy of an economic problem: Value moves up the chain, leaving a gaping hole in the lower hierarchical rungs; and the value capture is concentrated among a tiny, tiny, tiny number of stakeholders. This fully corresponds to recent pre-Davos reports on the global redistribution of wealth, which show that wealth pools at the top, and does so ever faster.
This is how trust in the system is hollowed out. This is at the core of why democracy is retreating globally. It’s not the only reason, but it is a major one.