The Movie
There is a film on Amazon Prime called Mercy.
It is not a great film. The reviews will tell you that. Chris Pratt sits in a chair for 90 minutes and the screenplay fumbles most of what it accidentally stumbles onto.
But the premise is worth more than the film deserves.
It is 2029. Los Angeles. A justice system called Mercy (AI judge, AI prosecutor, AI executor) straps you to a chair and gives you 90 minutes to prove your innocence. The algorithm has already run its calculation. The guilt probability is displayed on a screen in front of you, ticking toward the threshold like a countdown. Below the line: you live. Above it: execution, right there, no appeal, no delay.
The film treats this as dystopia. A warning. A what-if.
I watched it as a forecast.
I don’t think AI will literally execute people in 2029. But the underlying dynamic (a system that has already decided, and a human scrambling to prove something the system cannot measure) is not fiction. It is the trajectory we are already on. The latency is being removed. The human in the loop is being thinned out. And most people have not registered what that actually means for them personally.
They think AI is a tool they will use.
They have not considered that AI is a standard they will be measured against.
What Mercy gets right (accidentally)
The protagonist (a detective, ironically a former advocate of the Mercy system) sits in front of the algorithm and does what humans do when they are desperate. He calls people. He pulls data. He cross-references footage. He builds a logical case, piece by piece, from the information available.
He is doing exactly what the system does. Slower, less efficiently, with shaking hands.
And that is the point.
Every skill he deploys in that chair (information retrieval, pattern matching, procedural reasoning, logical argument construction), Mercy does better. That is not a plot hole. That is the film’s most honest moment, even if the screenplay never noticed it.
The only thing the algorithm cannot do is what the algorithm cannot do. Feel the weight of a life. Recognise something true that cannot be encoded. Know, without being able to say how, that something is wrong with the data.
In other words: the only thing that saves you in that chair is the one thing the compliance architecture spent twelve years removing.
Intuition.
Intuition?
Not intuition in the mystical sense. Not gut feeling as magical thinking.
Gigerenzer (the psychologist who spent a career rehabilitating intuition as a legitimate cognitive tool) defines it precisely: fast, frugal inference built from genuine experiential signal. Not a shortcut. Not a bias. A different kind of knowing. One that processes more variables than conscious reasoning can hold, arrives at conclusions before the reasoning catches up, and is often more accurate than deliberate analysis, provided the signal it was built on was clean.
Provided the signal was clean.
That is the condition everything else depends on.
Because intuition is not a fixed capacity. It is a calibrated instrument. And like any instrument, its accuracy depends entirely on what it was calibrated against.
A child who grows up in an environment that consistently validates their internal signal, who learns that what they feel, notice, and sense corresponds reliably to what is real, develops intuition calibrated against reality.
A child who grows up in an environment that consistently overrides their internal signal, who learns that what they feel is only valid when an adult confirms it, that what they notice is only relevant when the curriculum schedules it, that what they sense must wait for circle time, develops intuition calibrated against approval.
These are not the same instrument. They produce completely different humans. And only one of them has anything to offer in the chair.
This is what most people are missing about AI.
They are asking: which skills will survive? Which jobs will remain? How do I stay relevant in a world where the machine does more than I can?
These are the wrong questions. They are the questions of someone who still believes the competition is between human skills and AI skills — and is trying to figure out which human skills AI cannot yet replicate.
The real question is different.
The real question is: what are you, beneath your skills?
Because your skills are already being replicated. Not all of them, not immediately, not without friction, but the direction is not ambiguous. The trajectory is not reversible. Every skill rooted in information retrieval, pattern recognition, procedural execution, or rule-following application is on a curve that ends at redundancy.
What remains when you subtract the skills is not nothing. But it is only something if it was developed.
Volition. The capacity to know what you want, independent of what the system prompts you to want.
Judgment. The capacity to decide when the information is incomplete, ambiguous, and genuinely uncertain — without defaulting to a rule or an authority.
Intuition. The capacity to know something is wrong before you can articulate why. To sense the gap between what the data says and what is true.
These are not skills. They cannot be taught in a curriculum. They cannot be assessed on a rubric. They are the residue of a particular kind of development. One that requires unstructured time, genuine uncertainty, real consequences, and an interior life that was never systematically overridden.
They are also (not coincidentally) exactly what the Mercy system cannot measure.
The protagonist in the film survives. Hollywood requires it.
But the more interesting question is: what kind of person survives that chair in reality? Not in the film’s logic, but in the actual logic of a world where AI systems make consequential decisions about human lives with increasing speed and decreasing human oversight.
The answer is not the most skilled person. Mercy out-skills everyone in the chair.
It is the person who knows something the system does not. Who can locate, in 90 minutes, the one true thing that the algorithm’s dataset does not contain, because it was never encoded, because it lives in the texture of a relationship, a moment, a felt sense of a person that no camera captured.
That capacity is not distributed evenly. It is the product of a specific developmental history. One where the interior signal was trusted early, cultivated consistently, and never systematically overridden in the name of compliance.
Most people sitting in that chair (yes, us included) were educated out of it before age seven.
That is not a metaphor.
That is the problem.
Next calibration: Saturday 10 A.M. GMT. Stay sharp.


