The Comfortable Lie
A Foolish Reflection on Curated Reality (And What You Can Do About It)
This is a reflection on Perception seen through the eyes of Touchstone, my Fool-in-Residence, where the quiet parts are written down, and we laugh just enough to keep ourselves from crying.
The Absurdity We’re Marching Toward
Imagine waking up tomorrow to discover that your reality has been optimized. Not just your feed. Not just your recommendations. Your actual experience of being alive. Everything is calibrated to make you feel exactly as good as possible, exactly when you should feel it. The sunrise is painted in your favorite hues. Your job feels meaningful (even if it’s not). Your relationships are algorithmically enhanced to maximize joy and minimize friction. You get to be a different, better version of yourself in different contexts, a power player in the morning, an adventurer by afternoon.
Sounds nice, right?
Here’s the joke: this is where we’re already heading, and we’re walking there voluntarily, one notification at a time.
We live in an age of reality customization. Your social media feed shows you what you already agree with. Your streaming service knows your taste better than your own friends do. Your dating app filters humans into digestible profiles. Your news app serves you stories that confirm your worldview while burying the ones that don't. We're not in a single reality anymore; we're in a thousand personalized realities, each one optimized for engagement, satisfaction, and the peculiar modern form of happiness: never being uncomfortable.
The insidious brilliance is that this isn’t imposed on us. We’re not being forced into caves and shown shadows. We’re choosing the shadows because they’re comfortable, beautiful, and, this is the kicker, they feel real because we’ve never been shown anything else.
Welcome to the echo chamber of being.
Why This Matters (And Why It’s Hilarious in a Deeply Troubling Way)
Let’s say Maya from our scenario represents you. Or your neighbor. Or your kid. She’s not a villain, she’s just someone living optimally. She’s solved the problem of suffering by outsourcing reality to a company that promises she’ll never have to experience anything that doesn’t make her feel exactly right. And sure, she gets the occasional mandatory “reality immersion” in an uncurated wilderness zone (apparently we still preserve those, like museums for authenticity), but mostly? She’s content. Happy, even.
But here’s where the absurdity starts to itch: What happens when everyone is optimized into a different version of truth?
Climate change exists in Maya's curated reality, but it's being solved beautifully in her news feed. In her colleague's reality, it's not urgent at all; his algorithm has determined that environmental anxiety decreases his productivity score. For Leo, Maya's eight-year-old cousin, climate change is gamified into an exciting collaborative puzzle in the PioneerVerse. They're all living in the same world, but they're inhabiting completely different realities. How do they ever vote together? Care together? Act together?
They don’t. That’s the point. And that’s the problem.
We're trending toward a future where shared reality becomes a luxury subscription nobody can afford. Where "truth" becomes an identity marker instead of something we agree on. Where the only common ground is the physical infrastructure that keeps the illusion alive, and even that is maintained by people we'll never meet, using resources we'll never see, solving problems we've optimized ourselves not to perceive.
It’s not 1984. It’s not Big Brother watching you. It’s the most seductive dystopia ever conceived: you are the Big Brother, and you’re only watching yourself.
The Fork in the Road (Things You Can Actually Do)
So. Here’s the part where I don’t just hand you a mirror and laugh at the reflection. Here’s the part where I suggest that the future is not written in your algorithm.
The scenario presented is a warning, not a destiny. And there’s a radically different path available, but it requires you to act with intention, starting now.
1. Become Deliberately Uncomfortable (In Small, Sustainable Ways)
The future described relies on total curation. What it can’t survive is consistent, chosen friction.
What you can do:
Spend time consuming information from sources that actively disagree with you. Not as a performative exercise in “steelmanning” the other side, but as genuine intellectual engagement. Disagree fiercely, then sit with that disagreement.
Once a week, use a search engine without personalization (DuckDuckGo it, literally). Notice what you don't normally see.
Seek out news from different countries. Not the international version of your home country's news: the actual news of other places. The problems that matter somewhere else but not in your bubble.
Have a conversation with someone radically different from you about something important, where you both go in willing to change your mind.
This isn’t about being a masochist. It’s about maintaining epistemic muscles. You’re inoculating yourself against the future where comfort is the default.
2. Demand Transparency and Buildable Commons
The institutional structures holding up hyper-subjectivity need opacity to function. They survive by being too complex to question, too integrated to abandon, too profitable to regulate.
What you can do:
Ask companies why content is recommended to you. Specifically. When they hand you vagueness, push back.
Support open-source projects that create alternatives to proprietary algorithms. Fund them if you can.
Vote (and advocate) for digital literacy and algorithmic transparency legislation. Make it boring and urgent at the same time.
Participate in or support “digital commons” initiatives—spaces designed to be shared, not personalized. This sounds abstract, but it’s real: wikis, open forums, community radio, public libraries with robust digital offerings.
You’re not trying to destroy personalization. You’re trying to preserve the option of shared spaces where the algorithm doesn’t live.
3. Cultivate Genuine Connection Over Optimized Interaction
Here’s a truth the scenario almost gets right: the deepest human hunger isn’t for optimization. It’s for being known, really known, by another person who isn’t being paid to pretend they understand you.
What you can do:
Build relationships where you’re not optimizing. Friendships that don’t have utility. Conversations that meander. Time with people where no one is mining data about your preferences.
Create or join communities based on actual shared purpose, not algorithmic affinity. This could be a book club, a local garden, a protest, a prayer group, a band. The specificity matters.
Teach kids (yours, your neighbor’s, anyone’s) to experience boredom, awkwardness, and genuine surprise. The future depends on humans who haven’t been trained since infancy to expect seamless, personalized contentment.
This sounds soft. It’s actually revolutionary. You’re preserving the radical possibility that reality might not be about you.
4. Think in Systems, Not Just Choices
Individual action is necessary but not sufficient. The scenario described is systemic. The water heater doesn't ask your permission before using electricity; it's built into the grid. Similarly, you can't opt out of hyper-personalization entirely (the infrastructure is becoming ambient), but you can participate in building alternative systems.
What you can do:
Get involved in local governance. School boards, town councils, planning commissions, the places where decisions about technology implementation are actually made by humans who can be influenced.
Support regulators, activists, and researchers who are working on digital rights. They’re fighting an uphill battle because they’re not as profitable as the companies building the personalization engines.
Think about community tech, not just personal tech. What does your neighborhood need? What can be built collectively instead of consumed individually?
The Real Rebellion
Here’s the jester’s final wisdom: the future described in the scenario isn’t inevitable because it’s technologically determined. It’s tempting because it promises something we genuinely want, to feel good, to be understood, to have our needs anticipated. The rebellion isn’t against wanting those things. It’s against the idea that we can have them only through algorithmic optimization, at the cost of shared reality.
The alternative future, the one where we choose differently, looks less polished. It’s messier. It includes genuine disagreement, the occasional bout of boredom, the horrifying discovery that other people see the world differently and we can’t algorithm our way past that. It requires us to be uncomfortable sometimes in service of something bigger than our individual contentment.
But it also includes something the optimized future structurally cannot: genuine surprise. Real solidarity. The possibility of being genuinely wrong and actually changing. The irreplaceable texture of a world that doesn’t exist to serve you.
The question isn't whether you can perfectly curate your way to happiness. (Spoiler: you can't, and the scenario admits this with Maya's mandatory reality immersions.) The question is: what kind of reality do you want to live in?
One where everything is perfect, or one where everything is real?
The choice, for now, is still yours. But it’s the kind of choice that gets smaller every day you don’t make it.
So make it.