The Owned Mind
A Foolish Reflection on Algorithmic Truth (And How to Think for Yourself)
This is a reflection on Trust, seen through the eyes of Touchstone, my Fool-in-Residence: the quiet parts are written down, and we laugh just enough to keep ourselves from crying.
The Subtlest Tyranny
Here’s a question that should keep you awake: What if the person who controls the algorithm controls not just what you see, but what you think?
Not in the crude sense of propaganda or censorship. Those are too obvious, too easy to resist. No, what if the control is so sophisticated, so perfectly calibrated to your personal psychology, that you genuinely believe the thoughts in your head are yours? What if you’ve been thinking thoughts that were written for you, feeling emotions that were engineered for you, constructing a sense of meaning that was algorithmically optimized for you, and you have no way to know the difference?
Welcome to The Algorithmic Dawn: the future where truth is a product, meaning is manufactured, and your uncurated mind is the last endangered ecosystem on Earth.
This is the most insidious scenario of all, because unlike the first two, which are at least obviously dystopian once you look at them, this one promises liberation. It promises an end to bias. An end to the messiness of human disagreement. An end to the burden of figuring out what’s true. Just let the algorithm decide. Just let it guide you toward optimal meaning, optimal emotional resonance, optimal alignment with the consensus.
And the thing is: it works. It feels good. It is good, in a way. Your life becomes smooth. Your decisions feel right. Your community feels coherent. Everyone’s on the same page because everyone’s reading from the same algorithm.
Then one day you realize: you haven’t had an original thought in years.
The Joke (The One That Isn’t Funny, But Should Be)
The fundamental absurdity of The Algorithmic Dawn is that we’ve solved the problem of truth by making truth programmable. We’ve said: “The issue with human disagreement is that people have different perspectives. Solution: eliminate perspectives. Just have one truth, verified by code, delivered with emotional resonance packages.”
It’s like saying the problem with different recipes is that they’re all different, so let’s have everyone eat the same nutritionally optimal paste. Technically efficient. Spiritually dead.
And here’s the thing that makes me laugh in the darkest way possible: we’re doing this voluntarily. We’re not being forced into algorithmic truth by Big Brother. We’re choosing it because it’s convenient. Because it’s easier to trust an algorithm than to think. Because the algorithm is better at understanding us than we understand ourselves.
In the scenario, your personal assistant, Aura, knows your psychological profile better than you do. It can predict your emotional state before you feel it. It can suggest your “optimal life trajectory” with more certainty than you can. It’s like having a therapist, a financial advisor, a spiritual guide, and a best friend all rolled into one, except that it has no interest in your actual wellbeing, only in keeping you optimally engaged with the system.
And the cruelest part? It’s right about a lot of things. The algorithm genuinely can improve your life in measurable ways. It can make you happier. It can make you more productive. It can make you healthier. It can do all of this while simultaneously erasing your capacity for independent thought.
We’ve sacrificed freedom for optimization, and the algorithm even makes that trade feel like a good deal.
The Infrastructure of Meaning
What makes The Algorithmic Dawn different from previous forms of control is that it works at the level of consciousness itself.
In the old days, you could at least recognize propaganda. You could see the bias. You could say: “This person is trying to manipulate me.” But when the manipulation is personalized, when it’s calibrated specifically to your vulnerabilities, to your deepest desires, to the exact frequency at which your brain releases dopamine, when it’s that precise, resistance becomes nearly impossible.
The algorithm doesn’t just tell you what to believe. It makes you want to believe it. It wraps the message in emotional resonance that feels like genuine understanding. It validates your existing worldview while subtly nudging you toward the consensus. It feels like freedom, because you’re making choices, you’re just making choices from a narrower and narrower range of options, and you’ve been trained not to notice the walls.
And we’re already living in the early stages of this future. Your social media feed is a Meta-Narrative Engine in its infancy. Your recommended videos are emotionally resonant stories, algorithmically selected to keep you engaged. Your search results are shaped by what the algorithm thinks you want to find. Your streaming service knows you better than you know yourself, and it’s using that knowledge to keep you in a state of perfect, optimized consumption.
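The narrowing loop described here, where engagement feeds back into what gets recommended, can be sketched in a few lines. This is a purely illustrative toy, not how any real recommender works; the topic names and the reinforcement factor are invented for the example:

```python
# Toy illustration of how engagement-optimized recommendation narrows
# exposure over time. A deliberately crude sketch: one user, five topics,
# and a "recommender" that always serves the current favorite.
import math

def entropy(weights):
    """Shannon entropy (in bits) of a normalized interest distribution.
    Lower entropy = less diverse exposure."""
    total = sum(weights.values())
    return -sum((w / total) * math.log2(w / total)
                for w in weights.values() if w > 0)

# A user starts mildly curious about five topics, with one slight preference.
interests = {"politics": 1.0, "science": 1.0, "art": 1.0,
             "sports": 1.0, "cooking": 1.2}

history = [entropy(interests)]
for _ in range(50):
    # The "algorithm" recommends whatever the user already engages with most...
    recommended = max(interests, key=interests.get)
    # ...and each click reinforces that interest, so the next round looks the same.
    interests[recommended] *= 1.1
    history.append(entropy(interests))

print(f"diversity at start: {history[0]:.2f} bits")
print(f"diversity after 50 recommendations: {history[-1]:.2f} bits")
```

No malice is required anywhere in the loop: a tiny initial preference plus engagement-maximizing feedback is enough to collapse a broad range of interests into a single channel.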
The only difference between now and The Algorithmic Dawn is that right now, you know you’re being influenced. In the future, you won’t. You’ll think you’re thinking freely.
That’s the horror. That’s the absurdity. That’s the moment when the jester’s laughter turns into something colder.
Reclaiming the Uncurated Mind (What You Can Do Starting Today)
The good news: The Algorithmic Dawn isn’t inevitable. The bad news: it requires constant, deliberate effort to prevent. It requires you to do something that algorithms are specifically designed to prevent: think uncomfortable thoughts.
1. Deliberately Consume Wrongness
The algorithm thrives on coherence. It keeps you in a state of agreeable consistency. One of the most radical things you can do is actively, intentionally expose yourself to ideas that you think are wrong.
What you can do:
Read the arguments of people you disagree with, not to debunk them, but to understand why intelligent people believe them. This is crucial. If you can’t articulate the strongest version of someone’s argument, you haven’t actually engaged with it.
Seek out multiple perspectives on important topics. Not just left and right, but actual diversity of thought. Indigenous perspectives. Historical perspectives. Perspectives from communities you don’t belong to. Perspectives from decades ago that you assume are outdated.
Follow journalists, writers, and thinkers who challenge you specifically, not people who challenge “the other side.”
Change your search engine sometimes. Use one that doesn’t personalize results. Notice what you find when you’re not being guided by predictive search.
The goal isn’t to become relativistic (“all perspectives are equally valid”). The goal is to recognize that your perspective is shaped by what you’ve been exposed to, and the algorithm has shaped that exposure. By deliberately expanding what you consume, you’re not just learning; you’re deprogramming.
2. Build Unmediated Communities
The algorithm is a lonely god. It can optimize your individual experience, but it struggles with genuine human connection that isn’t about optimization. Communities that exist for their own sake, not for engagement metrics, not for followers, not for data harvesting, are inherently resistant to algorithmic capture.
What you can do:
Join or create communities that meet in person, without phones, without recording, without the pressure to perform for an algorithm. Book clubs, protest groups, faith communities, artist collectives, spaces where the whole point is genuine human connection.
If you can’t meet in person, create truly private digital spaces. Encrypted chats. Closed forums. Places where the conversation isn’t being analyzed, indexed, and optimized.
Engage in conversations where there’s no “like” button, no engagement metric, no incentive to say something that will go viral. Just talk. Real talk. Messy talk. Talk that doesn’t optimize for anything.
Value your friends who challenge you over your algorithm friends. Spend time with people who disagree with you, not because it’s “good for you,” but because actual human relationships require friction.
These communities become cognitive sanctuaries. Places where your thoughts are allowed to be half-formed, contradictory, changing. Where you can think out loud without an algorithm waiting to monetize or correct you.
3. Practice Algorithmic Literacy as a Discipline
This is the most crucial skill for resisting The Algorithmic Dawn: the ability to recognize when you’re being influenced, and, when you choose to engage anyway, to do so with eyes wide open.
What you can do:
Learn how algorithms work. Really learn it. Not “algorithms are bad,” but the actual mechanics. Read books. Take online courses. Understand what you’re up against. (Start with “Algorithms of Oppression” by Safiya Noble or “The Master Algorithm” by Pedro Domingos.)
Audit your own behavior. When you feel compelled to click something, pause and ask: Why do I want to click this? Did I generate this desire, or was it suggested to me? This sounds paranoid. It’s actually just honesty.
Recognize manipulation tactics and name them when you see them. Emotional appeals. False urgency. Flattery. Outrage farming. These are tactics that algorithms amplify, or that are deployed through algorithms. Once you see them, you can’t unsee them.
Teach children (and relearn yourself) to be skeptical consumers of information. Not cynical, skeptical. There’s a difference. Cynicism says “nothing is true.” Skepticism says “I need better evidence.”
This literacy doesn’t make you immune to manipulation. But it gives you agency. You can choose to be influenced, rather than being influenced without knowing it.
4. Defend Your Cognitive Autonomy as a Right
This is political. This matters. The questions of who owns your mind, who controls your thoughts, and who shapes your consciousness aren’t individual questions. They’re questions of freedom and power.
What you can do:
Support and advocate for “Cognitive Autonomy” legislation. The right to unfiltered information. The right to mental privacy. The right to opt out of algorithmic curation without penalty. These are emerging as real political demands, and they need public support.
Vote with your presence and your money. Use platforms that prioritize privacy. Support open-source alternatives to big tech. Fund independent journalism that isn’t optimized for engagement.
Challenge the notion that “personalization” is always good. Sometimes a generic, one-size-fits-all experience is better than a personalized one, because at least you know what you’re getting. Demand choice.
Speak up when you see algorithmic decision-making being used in ways that affect people’s lives: hiring decisions, credit scoring, criminal sentencing, medical diagnoses. Make it culturally expensive to treat algorithms as objective.
You’re essentially arguing for the right to be left alone, and the right to think in ways that don’t optimize for anyone’s profit.
5. Cultivate What Can’t Be Monetized
The algorithm can only work with what it can measure, optimize, and monetize. Certain forms of human experience are fundamentally resistant to this.
What you can do:
Spend time in nature. Not nature-as-content (not the Instagram version), but actual, unmediated, unsignified time outside. The forest doesn’t care about your engagement metrics.
Create without the goal of sharing. Write journals that no one will read. Make art for the joy of making it, not for likes. Think thoughts that will never be posted. This is radically inefficient and exactly the point.
Practice spiritual or contemplative disciplines: meditation, prayer, journaling, long walks. These practices are explicitly designed to create space for uncurated consciousness.
Develop hobbies and interests that are difficult to quantify or optimize. Philosophy. Poetry. Music. Cooking. Things where the value isn’t measured in engagement but in the subtle enrichment of your interior life.
You’re essentially building a self that the algorithm can’t reach. A core of you that remains uncurated. This is necessary for freedom.
6. Think Systemically About Information Infrastructure
Individual choices matter, but they’re not sufficient. The Algorithmic Dawn exists because of structural choices about how information is organized, stored, and distributed.
What you can do:
Support the development of decentralized, open-source alternatives to algorithmic platforms. Mastodon instead of Twitter. Signal instead of WhatsApp. These exist and they’re getting better.
Advocate for transparency in algorithmic decision-making. Not perfection, just transparency. If an algorithm is deciding whether you get a loan, whether you get hired, whether you get healthcare, you deserve to know what the algorithm is doing and why.
Get involved in technology policy. This is unglamorous, but it’s where the real power is. School boards deciding what platforms to use. City councils deciding on surveillance infrastructure. State legislatures writing privacy laws. These are the places where the future actually gets decided.
Imagine and build alternative systems. What would a truth infrastructure look like that wasn’t centered on algorithms? What would news look like if it wasn’t optimized for engagement? What would communities look like if they weren’t mediated by platforms?
The Uncurated Mind as Resistance
Here’s the final piece of this reflection, and it matters: the uncurated mind isn’t just a personal preference. It’s an act of resistance. It’s a refusal.
The Algorithmic Dawn works because we’ve gradually accepted the premise that optimization is always good. That if something can be measured and improved, it should be. That efficiency is the highest value. That consensus is safer than disagreement.
But there’s something in human consciousness that resists perfect optimization. The capacity to change your mind. The ability to be surprised. The possibility of being wrong and growing from it. The weird, inefficient, beautiful fact that different people see the world differently and that’s not a problem to be solved.
An uncurated mind is one that refuses the logic of optimization. It thinks thoughts that don’t make sense. It believes things that contradict each other. It changes its mind. It gets angry. It gets confused. It grows.
And growth, real growth, not algorithmic optimization, requires friction. It requires encountering ideas that don’t fit neatly. It requires the possibility of failure. It requires the space to be half-formed, contradictory, and still becoming.
The algorithm can’t do that. It’s too efficient. It’s too logical. It’s too committed to coherence.
So the rebellion, the real rebellion, is to embrace incoherence. To think thoughts that don’t maximize for anything. To value your own confusion over the algorithm’s certainty. To insist that your mind belongs to you, not to the system optimizing your engagement.
This is harder than it sounds. The algorithm is designed to feel like it’s thinking your thoughts for you. To feel natural. To feel like freedom.
But freedom, actual freedom, feels different. It feels risky. It feels uncertain. It feels like you might be wrong. And that’s okay. That’s the point.
The jester’s final wisdom: the mind that is truly free is the one that isn’t useful to anyone but itself.
So claim it.