The Invisible Hand
Reclaiming Agency in an Algorithmically Curated World
Everyone is talking about AI, and many of the concerns are well-founded. But AI isn't entirely new. What often gets lost in the noise is a subtler force that has been shaping us for far longer: algorithms.
What unsettles me most is not their existence, but how seamlessly they’ve integrated themselves into everyday life. Algorithms are now so pervasive that I rarely notice when I’m being shaped by them. And I’m increasingly convinced that I would not independently choose a significant portion of the content I consume each day if it weren’t being ranked, nudged, and surfaced for me.
This realization crystallized for me after I stumbled upon a YouTube video titled “Listening to Only an iPod for a Month Challenged Everything.” I recommend watching it. The experiment itself is simple, almost quaint by today’s standards, but its implications are not. It prompted me to reflect on how deeply algorithmic curation has embedded itself in my habits, my attention, and even my sense of interest. More importantly, it forced me to ask a harder question: What, exactly, am I allowing to shape my inner life?
Long before generative AI, social media platforms and tech companies were already studying our behavior, predicting our preferences, and ranking what would most likely keep us engaged. In doing so, they became invisible editors of our inner lives, quietly deciding what we see, read, watch, and hear.
From Choice to Curation
This transformation didn’t arrive all at once. It unfolded gradually, almost imperceptibly. In the mid-1990s and early 2000s, the internet was refreshingly simple. Websites largely showed the same content to everyone. Early social platforms, such as online forums, blogs, and MySpace (wow, time flies), organized posts chronologically or in static lists. If you wanted something, you searched for it. If you read something, it was because you chose it. Human editors might highlight a story, but the underlying structure still respected user agency.
Then came recommendation engines.
Online retailers began suggesting products based on browsing and purchase history. Streaming services followed, recommending movies and shows based on viewing patterns. These systems were framed as helpful discovery tools, and in many ways, they were. They reduced friction, saved time, and promised relevance.
However, social media accelerated the shift. Chronological feeds gave way to ranked feeds. Engagement replaced timeliness. What mattered most was no longer what was new, but what would keep you scrolling. Over time, nearly every major platform adopted this model.
By the 2010s and into the 2020s, recommendation algorithms had become the invisible infrastructure of digital life. Social platforms, streaming services, and e-commerce sites all began learning continuously from our behavior (every click, pause, reaction, and purchase), refining what we would be shown next. What started as suggestion evolved into full-scale curation. The feed didn’t just respond to us; it trained us.
The Erosion of Intentionality
What strikes me most is how normal this now feels. Algorithms are no longer something we notice. They are simply there: ambient, ever-present, unquestioned. And I’m not convinced I would independently choose half of what I consume if those systems weren’t quietly steering me.
Consider your last hour online. How much of what you encountered did you actively seek out? How much was simply served to you to hold your attention a few seconds longer?
Engagement and meaning are not the same thing. A system designed to maximize watch time has no concept of whether you are thinking, learning, growing, or slowly numbing yourself. The danger isn’t that algorithms occasionally show us something we wouldn’t have chosen. The danger is that we begin mistaking algorithmic curation for our own curiosity.
Over time, feedback loops narrow our world. We experience variety without depth, novelty without challenge. The algorithm doesn’t merely predict what we like; it teaches us what to like.
The Outsourcing of Thought
If we continue to surrender our attention to systems that decide what we encounter, and increasingly to AI systems that generate content for us, we risk more than outsourcing consumption. Over time, we begin outsourcing judgment and eventually, thought itself. Last month, I published an essay about the importance of preserving human judgment in an AI-dominant world. If you want to know more, check it out here.
It is important to remember that thinking requires friction. It also requires boredom, disagreement, detours, and discomfort. Most importantly, it requires encountering ideas that don’t immediately resonate or affirm what we already believe. Algorithmic systems are designed to remove this friction. They smooth the experience. But friction is where thinking happens.
Generative AI compounds this concern. We are moving toward a world where content is not just curated but generated for us: tailored summaries, personalized explanations, and optimized recommendations. This is convenient, yes, but it can also be corrosive to the habits that foster independent thought.
The Loss of Meaning
And when this process continues long enough, a deeper danger emerges: we begin to lose our sense of why anything matters at all.
Meaning cannot be optimized. It cannot be ranked or surfaced by an algorithm. Meaning is formed through struggle, reflection, disagreement, silence, and choice. It emerges slowly, often uncomfortably, as we wrestle with what we believe and why.
When machines curate our inner lives as efficiently as they curate our feeds, we don’t just lose agency; we risk living in a world that feels informed but hollow. We consume endlessly yet feel unsatisfied. We know more but understand less. We are connected but increasingly unanchored.
This is the quiet trade we’ve made. We’ve exchanged the demanding work of self-directed attention for the ease of algorithmic guidance. And in doing so, we’ve allowed some of the most important decisions (what to care about, what to learn, what to become) to be quietly delegated to systems that have no stake in our flourishing.
Reclaiming Responsibility
The task ahead is not to reject technology. It is to resist surrendering responsibility.
This means cultivating intentionality: closing or deleting that app, or noticing when you’re being led rather than choosing. It also means seeking out sources that challenge you, sitting with boredom instead of reaching for the feed, and choosing what to read instead of accepting what an algorithm recommends.
Ultimately, it means asking uncomfortable questions:
What would I be interested in if no algorithm had ever shaped my preferences?
What am I missing because it doesn’t fit the pattern of what I’ve already consumed?
Am I learning or merely being entertained?
It also means recognizing that curation is never neutral. Every algorithm embeds values and incentives, and those incentives are rarely aligned with wisdom, depth, or long-term well-being.
Most importantly, it means accepting that meaning requires effort. It cannot be delivered, personalized, or optimized. It emerges only when we choose deliberately, again and again, to engage the world on our own terms.
An Invitation to Practice
For me, this reflection has led to a concrete decision. I’m committing to an algorithm-free month, not as a rejection of technology, but as an attention experiment. I want to better understand my habits, my interests, and what I reach for when nothing is being fed to me.
As part of that practice, I’ll be reading Digital Minimalism by Cal Newport, which offers a thoughtful framework for reclaiming intentionality in a digitally saturated world.
I’d like to invite you to join me.
Over the coming weeks in February, I’ll be hosting a mini, book-club-style discussion via Substack Live, where we’ll reflect together on the ideas, tensions, and practical challenges the book raises. Not as experts or optimizers, but as people trying to live with greater clarity and depth.
The invisible hand is everywhere. But it only has the power we give it.
Want to support without a paid subscription? Make a one-time donation below.
If this resonated, feel free to share it with someone who values careful thinking. These ideas travel best through conversation.


