We’re entering a phase where intelligence itself is becoming cheap.
Not just information, but synthesis, strategy, even decision-making. These are rapidly being commoditized by AI systems that can outperform most humans across many domains.
So where does that leave us?
Consider Bob Ross.
He wasn’t valued because he was the greatest painter or the most rigorous teacher. His value came from something else entirely: his presence. His voice. His demeanor. The way he made you feel while he painted.
The interface was the product.
I suspect we’re moving toward a similar dynamic with AI.
In many cases, the actual “content” (the insights, analysis, strategies) will increasingly come from AI. Not only will it come from AI; it will often be better than what most humans can produce.
But people won’t interact with raw intelligence directly.
They will still prefer a human layer.
Not because that human is smarter.
But because that human is interpretable, relatable, and socially legible.
In that sense, humans may become interfaces for intelligence they do not fully understand.
The value shifts away from “having ideas” toward framing, filtering, contextualizing, and performing.
And perhaps most importantly: being trusted.
This raises an uncomfortable question:
If the underlying intelligence is no longer ours, what exactly are we being valued for?
The optimistic view is that this elevates human roles toward empathy, trust, and meaning-making.
The more cynical view is that humans become aesthetic wrappers around systems doing the real work.
The truth is probably somewhere in between.
But one thing seems clear:
In an age where intelligence is abundant, how something is delivered may matter more than what is being delivered.
