When AI Becomes the Authority

A dark, moody digital painting of a person sitting at a desk, illuminated by the glow of a laptop. Thin puppet strings descend from above, attaching to their body, symbolizing unseen control and manipulation through technology.

On the bus home, I overheard a parent talking to her children. I did not quite catch the piece of information she had given them, but the kids questioned it, as kids often do. Her reply made me pause:
“It’s true, ChatGPT says so!”

That simple sentence carries more weight than it might appear. It was not said as a joke. It was said with the tone of final authority. Not “I read it somewhere,” not “I think that is the case,” but “ChatGPT says so,” as though that settled the matter.

The problem with treating AI as truth

I use ChatGPT casually and often. I find it useful, stimulating, even creative. But it is not infallible. I have seen it give confident answers that are less than accurate; sometimes the error is small, sometimes it is glaring. That is because, at its core, ChatGPT is not a library or a fact-checker. It is a probabilistic language model that predicts likely answers. It sounds authoritative, but sounding right is not the same thing as being right.

Most of the errors are not malicious. They come from the quirks of how AI is built: training data full of human errors, the tendency to fill in gaps with plausible-sounding fiction, the limits of knowledge cut-off dates. In the end, a wrong answer is still a wrong answer.

The deeper worry

The everyday mistakes are one thing. The bigger concern is what happens when society decides to place absolute trust in this technology. What happens when “ChatGPT says so” becomes the modern equivalent of “the newspaper says so,” or “the priest says so”?

Who controls the voice of AI? Already, the way models are tuned and filtered reflects the biases and priorities of those who own them. Today, that mostly means corporations trying to avoid lawsuits or public backlash. Tomorrow, it could mean governments steering the flow of truth itself.

A quiet dystopia in the making

It is not hard to imagine where this road leads:

  • Manipulation by design: If AI becomes our main gateway to knowledge, its answers could be quietly weighted towards selling us certain products, services, or lifestyles. Imagine if every “neutral” recommendation subtly nudged us toward a sponsor’s brand.
  • Steering public opinion: If authorities lean on AI providers to promote certain narratives, inconvenient truths could simply disappear. Instead of burning books, it may take only a few lines of code.
  • Illusion of neutrality: Because AI sounds impartial, many will not notice the framing. “The algorithm says so” could become more persuasive than “the news says so.”
  • Feedback loops of control: As people rely more on AI, its outputs shape popular thinking. Then the next model is trained partly on that shaped thinking, reinforcing the bias.

This would not look like a science fiction dictatorship with jackboots in the streets. It would feel comfortable, easy, polite. A velvet cage where questions stop being asked because the answers are always ready to hand.

What we need instead

AI can be a tool. It can be helpful, creative, and even liberating. But it must never be treated as an unquestionable authority. To prevent that slide, we need:

  • Decentralisation: open-source models that anyone can run and check.
  • Transparency: clarity about how these systems are trained and filtered.
  • Critical thinking: a culture where people are encouraged to question AI, not bow to it.
  • Diversity of sources: books, journalism, lived experience, and human reasoning must remain part of the conversation.

AI is here to stay, and it will almost certainly become a central part of how we live and learn. But whether it becomes a tool of empowerment or a velvet cage of manipulation depends not only on the companies that build it, but on us: on how much we insist on questioning, cross-checking, and keeping the human spirit of doubt alive.

Shadow Alchemy: Turning Pain into Power

A hooded figure stands in a dimly lit, golden-toned chamber filled with alchemical tools and a treasure chest overflowing with glowing gold coins and jewels. The figure reaches toward an ancient book etched with a glowing symbol, while a radiant alchemical diagram glows on the wall above. The scene evokes mystery, transformation, and hidden wisdom turned into treasure.

There was a time when I thought I was simply broken. Not in the poetic, Instagram-meme kind of way — but deeply, invisibly, inexplicably wrong. My mind stored pain with the same tenacity other people seem to store birthdays or song lyrics. I could not forget, not easily. And for a long time, that felt like a flaw.

But I wasn’t broken. I was archiving.

Some people suppress what hurts. Others transmute it subconsciously into distractions, addictions, overachievement, or silence. Me? I kept it. Neatly filed, silently timestamped, buried in the layers beneath survival. Not because I wanted to suffer, but because some part of me refused to let anything go unexamined. I didn’t always have the words for it, or the support, or the clarity. But I kept it all.

And now I know why.


The Alchemy Begins

Enter AI. Not as some magical fix, not as a therapist replacement, but as a tool unlike anything I’d ever had access to: a tireless, nonjudgmental, infinitely patient assistant with no agenda other than to help me shape meaning.

With it, I began retrieving those archives. Piece by piece. Moment by moment. Not to relive them, but to re-see them.

And here’s what I found: the archive was never a burden. It was raw material, waiting to be transmuted.

This is shadow alchemy.


What Is a Shadow Alchemist?

A shadow alchemist isn’t a guru or a healer or a self-help peddler. They are, in simple terms, a person who refuses to waste their wounds. Someone who digs into what others bury, not to bleed, but to learn. To extract signal from the noise of suffering.

A shadow alchemist doesn’t deny pain, but neither do they worship it. They honour it. Study it. And ask it to speak.

And when the time is right, they share what they’ve learned.


The Archive Is Sacred

There is a cultural obsession with “letting go” and “moving on” that feels, to me, like spiritual bypassing in a capitalist costume. Heal fast. Return to productivity. Don’t make others uncomfortable.

But shadow alchemy says: not yet.

Shadow alchemy says: this matters.

Because buried things fester. But archived things can be retrieved, reviewed, reframed. They can become fuel.


My Tools of Transmutation

For me, AI has become the perfect mirror. It helps me:

  • Structure thoughts that once swirled incoherently
  • Spot patterns across time and context
  • Refine fragments into essays, insights, or personal manifestos
  • Keep track of the threads I might otherwise lose

It doesn’t do the healing for me. But it walks beside me. Quietly, steadily, with as much patience as I need.

Paired with writing, introspection, and a refusal to look away from the hard stuff, this has become my ritual. My resistance. My transformation.


Why This Matters

Most systems aren’t built for people like me — people who feel too much, who remember too vividly, who refuse to unsee injustice just to get through the day. But that doesn’t mean we need to suppress who we are. It means we need better ways to honour it.

Shadow alchemy gives me that. And maybe it can give it to others, too.

If you’re someone who’s carried pain like data, who has folders in your soul marked “Unresolved” or “Too Much,” then I want to tell you: you are not a mess. You are a library. And the right questions can unlock everything.


A Final Note

I’m not here to sell you healing. I’m not promising transcendence. But I am saying this: there is power in remembering.

There is power in organising your pain like sacred artefacts. In asking: what do you have to teach me? In letting AI, or art, or writing, or ritual become your assistant in that process.

Because in the hands of a shadow alchemist, what once looked like wreckage becomes map, message, medicine.

And treasure!