I keep circling back to a thought experiment that refuses to stay in the past. It starts with Pierre-Simon Laplace, the French mathematician and astronomer, imagining a kind of mind so comprehensive it feels almost obscene. Give this intelligence the position and velocity of every particle in the universe, plus all the laws governing them, he said, and the rest follows. Tomorrow’s weather, next year’s wars, even the private flicker of a thought decades from now—everything would be laid out, inevitable, already implicit in the present. A clockwork cosmos. No gaps, no surprises. Just calculation.
There’s something clean about it. Too clean.
It presses on a place that doesn’t quite want to be pressed. If every event is caused by previous events in a chain that never breaks, then the feeling of choosing—of pausing, weighing, deciding—starts to look like a kind of ornament. A story the brain tells itself while the machinery keeps moving. I picture dominoes falling, except the first one toppled long before any of us arrived. Where does responsibility live in that picture? What happens to creativity, to the sense that something new can appear?
For a while, people argued over this as if the universe might tip one way or the other depending on how carefully the questions were phrased. Philosophers, theologians, all circling the same pressure point.
And then the cracks began to show.
The 20th century didn’t hand back certainty. It loosened it. Quantum mechanics arrived with a strange kind of honesty: at the smallest scales, nature doesn’t fully commit. There’s randomness baked in, not as a failure of measurement but as a property of the world itself. Even a perfect observer—Laplace’s imagined demon with its flawless data—would run into something irreducible. Not ignorance. Indeterminacy.
At the same time, mathematicians started studying systems that were, in principle, deterministic and still managed to slip through prediction’s fingers. Chaos theory gave us that unsettling image: a butterfly flapping its wings in Brazil might stir up a tornado in Texas. It sounds poetic until it lands as a technical claim. Tiny differences in initial conditions can amplify until the outcome becomes effectively unpredictable. The equations aren’t the problem. The sensitivity is.
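That sensitivity can be made concrete in a few lines. The sketch below iterates the logistic map, a standard toy model of chaos (my choice of illustration, not anything from Laplace or Lorenz specifically), from two starting points that differ by one part in a billion. The particular parameter and starting values are arbitrary; any nearby pair in the chaotic regime behaves the same way.

```python
# A minimal sketch of sensitive dependence on initial conditions,
# using the logistic map x -> r * x * (1 - x) in its chaotic regime (r = 4).

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points that differ by one part in a billion.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)

# The rule is fully deterministic, yet the microscopic initial gap
# roughly doubles each step until the trajectories are unrelated.
for step in (0, 10, 25, 50):
    print(f"step {step:2d}: gap = {abs(a[step] - b[step]):.9f}")
```

Run it and the gap stays invisible for a while, then saturates to the full size of the interval within a few dozen iterations. The equations aren't hiding anything; the amplification is the whole story.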
So the universe stopped looking like a simple machine and started feeling like a tangled, dynamic system—one where knowing everything might still not be enough to say what happens next.

For a while, that felt like relief. A reprieve from inevitability.
But lately the shape of the question has changed rather than disappeared.
The “supreme intelligence” Laplace imagined didn’t arrive as a singular, transcendent mind. Instead, something more ordinary spread out everywhere. Algorithms. Not all-seeing, not perfect, but relentless. They run on mountains of data—our data—every search, every purchase, every second spent hovering over an image or scrolling past it. A different kind of lattice has taken form, one that doesn’t claim certainty but trades in probabilities.
It says: You will enjoy this product. You will click on that headline. You will pay back this loan.
And often, disconcertingly, it’s close.
There’s a particular moment I’ve started to notice. The instant before something plays automatically—the next episode queued before the credits have even finished. The sense of being nudged. Not forced. Just… guided. I keep watching. The line between choosing and continuing blurs in a way that doesn’t feel dramatic enough to resist. It doesn’t need to.
Or the way a navigation app suggests a route. I rarely pause to question it. There’s an implicit trust that somewhere, hidden from view, there’s more information than I have—traffic patterns, historical data, a calculation unfolding faster than I could manage. I turn where it tells me to turn, not because I’ve evaluated every alternative, but because it seems unreasonable not to follow the suggestion.
Bit by bit, decisions start to feel shared.
Not entirely mine. Not entirely someone else’s. A joint product, assembled quietly.
It’s tempting to dismiss this as benign. After all, nothing stops a different choice. The slower road is still there. The phone can be put down. Accounts can be deleted. In theory, nothing has been taken away.
But theory has a way of overstating how often resistance actually happens.
Once something is labeled “optimal,” it carries a kind of authority that’s hard to ignore. Once a feed is tailored precisely to existing preferences, stepping outside it begins to feel like work—unnecessary work, even. There’s a comfort in being understood by a system that knows exactly what to show next, what to recommend, what to hide.
And comfort accumulates.
I find myself wondering whether this is what a modern version of Laplace’s demon looks like—not a single intelligence predicting everything with perfect accuracy, but a vast network of smaller systems, each making probabilistic guesses that, taken together, shape outcomes in real time. Not predicting the future so much as gently arranging it.
The old question hasn’t disappeared. It’s just harder to spot because it’s woven into routines that feel ordinary.
How much of this is harmless efficiency, and how much quietly eats into whatever space is left for genuine choice? I don’t have a clean answer. The world isn’t the clock Laplace imagined, but it isn’t entirely open either. It’s something in between: structured enough to be nudged, uncertain enough to feel free, responsive enough to learn.
The unsettling part isn’t that everything might already be determined. It’s that influence now arrives disguised as convenience, and the path of least resistance keeps getting smoother.
I keep thinking about that, especially in the moments that don’t feel like decisions at all.