Why I Don’t Use Social Media, Part III: From Expression to Exploitation

A mannequin-like human figure lit in cyan and magenta, with a barcode on its forehead and puppet strings above, standing in front of a faint web-grid background.

In Part II, I looked back at a time when online social spaces felt like rooms. Chat rooms, forums, and even MySpace had their own cultures, their own pace, and their own sense of community. Even when things weren’t perfect, it still felt like people were meeting each other on relatively human terms.

But somewhere along the timeline, the entire purpose of being online began to shift.

Connection started turning into visibility. Expression started turning into optimisation. People stopped showing up to talk, and started showing up to be seen. It’s hard to pinpoint exactly when this became the default, partly because it happened gradually, and partly because it was sold to us as progress.

But the end result is obvious.

“Social networking” didn’t simply evolve into “social media”. It was quietly redesigned into it. The language changed, the priorities changed, and most people barely noticed because it didn’t arrive as an announcement. It arrived as a new normal, shaped by platforms whose incentives were never neutral.


Memes and the Compression of Thought

Not long after Facebook became globally dominant, memes started becoming a major part of online culture. Memes, at least in the modern sense, are basically humour condensed into a single image, often paired with a caption. They spread quickly, they evolve through repetition, and they offer easy social hooks for people to build on.

Some memes are genuinely funny. Some are clever. Some are even oddly insightful in a compressed, sideways sort of way. But early on, I found myself uneasy with the format.

It wasn’t that I didn’t understand the humour. It wasn’t even that I disliked the jokes themselves. I just didn’t like what the format was doing to communication. It felt like conversation being flattened into a template, and personality being squeezed through pre-approved shapes.

I remember posting on Facebook when memes were first becoming common, basically asking if it was just me, or if they were making the world stupider. I wasn’t trying to be edgy, and I wasn’t trying to provoke anyone. I was reacting to something I found culturally strange and trying to put it into words.

I got shouted down instantly. My opinion was wildly unpopular.

Looking back, what stands out isn’t even the disagreement, it’s the speed and certainty of the backlash. It felt like a warning shot. Like certain kinds of critical thought were becoming socially unwelcome, especially if they threatened the fun, the vibe, or the collective agreement that keeps a platform frictionless.

Memes were not the worst thing to happen to the internet, but they were a sign of the direction things were moving in. Communication was being compressed, simplified, and optimised for rapid sharing rather than meaningful exchange.


When “Social Networking” Quietly Became “Social Media”

For a while, the shift was subtle. Different platforms emerged with different vibes, and people migrated depending on what suited them. Twitter was a big one, and to me it always looked like a fast-moving update stream, with a heavy focus on public figures and bite-sized commentary. I never signed up for it, not because I thought I was above it, but because I could already feel what kind of mental environment it would create for me.

Then came Instagram, and that was the point where the whole thing started to feel truly alien.

Instagram is social media built around images. On paper, I understand the appeal. People like visuals. People like documenting moments. People like sharing aesthetics. The problem isn’t the existence of photos, it’s what the platform encourages people to do with them.

The Instagram era felt like a cultural intensifier. It encouraged self-commodification, validation traps, and constant identity management centred around appearance. It rewarded the performance of a life, rather than the living of one. And it wasn’t just “look at this cool thing I did”, it increasingly became “look at me being the type of person who does cool things”.

I’ve seen people go to events purely to produce proof-of-attendance. Not to experience the event, but to capture the event. Not to have a night, but to extract a post.

And this is where the rebranding matters. “Social networking” didn’t shift into “social media” because people collectively decided they wanted something shallower. It became “social media” because platforms restructured online interaction into something that could be consumed, measured, targeted, and monetised. The design rewarded broadcasting over conversation, and performance over connection, and the culture followed the incentives.

Once that happened, the internet began to feel less like a network of people and more like a marketplace of identities.


The Artist Becomes a Product

As social media tightened its grip on everyday social life, it also tightened its grip on creative life. Somewhere along the way, pressure began building on artists and musicians to promote themselves through social platforms, as if it were simply part of the job now.

If you wanted people to hear your music, you were expected to be present. To post constantly. To engage. To feed the algorithm. To build a “brand”. To become discoverable by playing the game.

I never successfully adopted that mindset, for a number of reasons. Partly because the workload felt absurd, especially for someone with ADHD and autism. The extra resources required to manage a constant online presence felt like an investment in something I was not good at, not interested in, and not capable of sustaining without burning out.

But more than that, it felt wrong.

I’m not comfortable pushing myself as a product. Even when I tried to treat it as “just marketing”, something in me resisted. It felt like I was being asked to flatten myself into something palatable and promotable, and then repeatedly present that version to the world until it started generating numbers.

At one point, my girlfriend even offered to manage my social media accounts for my music projects on my behalf, which was incredibly kind. But even then, it didn’t really work, because the problem wasn’t just the time commitment. The problem was what the whole process demanded of me.

It didn’t just want me to be an artist.

It wanted me to be a personality.

It wanted charisma as an output. It wanted a steady flow of “content” that was only partly related to the music itself. It wanted my life, my face, my social identity, my accessibility, my likeability, my relatability, all packaged into an ongoing performance alongside the creative work.

And if that is the price of visibility, then visibility starts to look less like opportunity and more like coercion.

I explored this more directly in my other articles, The Independent Artist in the Age of Self Commodification and Art is NOT ‘Content’!, because I don’t think this pressure is only damaging to artists. I think it is damaging to art itself. It changes what people make, how they present it, and what they feel allowed to be.


Going Viral as the New Goal

Somewhere along this timeline, and I struggle to pinpoint exactly when, “going viral” stopped being a rare accident and became a goal. Not just a possible path to attention, but the path that everyone was aiming for.

And when virality becomes the goal, content changes. Depth becomes a liability. Context becomes a burden. Nuance becomes inefficient. Intelligence becomes optional. The system rewards what spreads, not what’s true, not what’s meaningful, and not what’s human.

That’s why the rise of short-form video content, platforms like TikTok and everything that followed in its wake, felt like the endpoint of the process. I haven’t even bothered to explore it properly myself because I can already see what it represents from the outside. Communication reduced to bite-sized bursts designed to hijack attention, recycle trends, and keep people scrolling.

It all feels so shallow that it becomes hard to even call it “social” anymore.

To me, it looks like a world where people behave like idiots to entertain more idiots, because the reward structure is built around immediate reaction and maximum spread. And if that becomes the default route to success, it becomes obvious why culture itself starts to degrade. I’ve explored that wider attention economy more directly in Are You Paying Attention?, and the way platforms subtly train people out of nuance in Trained Not to Think: The Slow Death of Nuance.

At some point, the question stops being “Why don’t you use social media?”

And becomes “Why would you?”

Because once expression becomes exploitation, the healthiest move isn’t to optimise yourself for the system.

It’s to refuse the system entirely.

Why I Don’t Use Social Media, Part II: When Social Networking Was Actually Social

A cozy retro computer workstation with a glowing CRT monitor on a desk, surrounded by headphones and discs, set against a cyan and magenta web-grid background with a nostalgic, early-internet atmosphere.

In Part I, I talked about stepping away from social media without making a scene, and what I gained when I simply stopped logging on. After that, I found myself thinking about why social media felt so draining in the first place. Part of it was personal, but part of it was historical. I remember an earlier version of the internet that felt more like a collection of places you chose to visit, rather than one endless feed you were expected to live inside.

There’s a particular kind of nostalgia people have for the early internet, and it’s often dismissed as the usual “back in my day” sentiment. The assumption is that if you miss it, you must be resisting change, failing to adapt, or romanticising a past that wasn’t as good as you remember.

But when I look back at the early years of social networking, the thing I miss isn’t the technology itself. It isn’t the graphics, or the clunky interfaces, or the fact that everything took longer. What I miss is the atmosphere. The texture. The feeling that the internet was made of smaller rooms rather than one giant stage.

Before “social media” became a thing, online socialising felt closer to real socialising. You turned up somewhere because you wanted to be there. You joined a space because you had something in common with the people inside it. You weren’t being funnelled into a feed and trained to scroll until your brain turned into static.

In a strange way, it felt more human.


Chat Rooms: Digital Rooms With Real People In Them

Some of my earliest experiences of social networking were in chat rooms, long before social media profiles became a default part of existing online. IRC, Yahoo! Chat, and other platforms like them were simple, almost blunt in design. You chose a topic, joined a room, and you were immediately dropped into a live conversation with strangers who were also there at that moment.

It sounds almost primitive now, but that simplicity was the entire point.

A chat room felt like walking into a pub or a cafe where you didn’t know anyone yet, but everyone had at least one shared interest. The rooms I found myself drawn to were usually related to computer games or music, and the magic was in the immediacy of it. People were talking because they were present. Conversations unfolded in real time. You’d meet strangers, then meet them again, and eventually they stopped being strangers.

And just like real social spaces, you’d get friendships and arguments, drama and in-jokes, and occasionally even romance, because human beings are human beings no matter what interface they happen to be using.

For me, as someone who didn’t have much of a real social life at the time, those spaces mattered. I’m autistic, and face-to-face socialising often felt like walking through a minefield of invisible rules. Chat rooms felt safer. More manageable. More contained. There was still risk, because social interaction is always a risk, but it felt like a space where I could engage at my own pace without the same pressure of physical presence.

I genuinely loved those days.

Internet chat rooms still exist, and I even tried revisiting some of them years later out of curiosity. But the atmosphere was different, and so was the wider context. The people who once lived in those spaces had largely been pulled elsewhere.


Forums: Slower Conversation, Deeper Community

After chat rooms, another major era of online social networking came in the form of interest-based forums. These were often independent websites, running their own forum engines, with their own cultures, in-jokes, and long-standing community dynamics. If chat rooms were like pubs, forums were more like community halls with noticeboards, ongoing debates, and familiar faces.

The difference was tempo.

Chat was live and fast. Forums were slower, but that slowness was a strength. You could read a thread, think about it, come back later, and reply thoughtfully. You could contribute without needing to be “on” in the moment. That made discussions richer, and it also made them easier to engage with on your own terms.

Forums were also multi-purpose spaces. They weren’t just social hubs, they were information archives. If you wanted advice on something niche, you could ask a question and get answers from people who actually cared. Over time, the forum became a living reference library shaped by the interests of the people inside it.

Private messages mattered too. You could form genuine friendships through forums, and then take those friendships into quieter, more personal spaces. It wasn’t all public performance. There was a sense of layered social reality, where community existed in public threads, and connection could deepen privately.

Some of my strongest memories of that era include the Sinclair ZX Spectrum forum World of Spectrum, which felt like a gathering place for people who shared a specific kind of love for retro computing. I was also a member of a forum for a goth nightclub I used to frequent, Slimelight, which acted as an extension of the real-world scene. These weren’t faceless platforms. They were social worlds with their own gravity.

Looking back, what strikes me is that these spaces encouraged a kind of social continuity. People weren’t reduced to content, they were regular posters. Not followers, but familiar presences.


MySpace: Personality, Expression, and Casual Connection

Then came MySpace, and with it, something began to shift. It was still social networking, but it was more focused on the individual. MySpace gave you a space to customise, decorate, and shape. It wasn’t just a profile, it was a little public bedroom wall. A collage of taste and identity.

For me, the MySpace era felt like a bridge between my online persona and my real-life social world. It arrived during a time when the internet was becoming mainstream in a way it hadn’t been before. Suddenly, people who were never “computer people” were creating profiles, sharing their personality publicly, and expressing themselves online.

It was eye-opening, and it also made social connection feel oddly casual. People could add you, you could add them, and it didn’t necessarily feel like a desperate plea for approval. It felt more like saying hello across a room. You might not become close friends, but you were present in the same ecosystem.

It wasn’t perfect, and it had its own problems, but it still felt like something built around expression rather than optimisation. It felt messy in a human way.

This is also where I started noticing that the internet was becoming more centred on the individual, rather than purely on shared spaces. Not necessarily as a bad thing, but as a shift in emphasis.


The First Friction: Facebook and the Beginning of the Shift

Facebook entered my life while I was at university, and I still remember how strange its early rollout felt. For a while, it wasn’t even a global platform. It was introduced as something limited to university students, almost like it was trying to brand itself as a more “professional” or curated version of social networking.

What I remember most vividly is the social pressure. MySpace started being treated as outdated, and Facebook was framed as the thing everyone had to move to. It wasn’t presented as an optional new space, it was presented as the new default. If you wanted to maintain social contact online, you were expected to switch.

In its early years, Facebook wasn’t the worst, but I immediately felt a reduction in self-expression compared to MySpace. It was cleaner, more structured, and more uniform. Less personality. Less chaos. Less individuality. You were no longer decorating a space, you were occupying one, and the boundaries of that space weren’t yours to define.

In hindsight, this is where the larger shift began. Not necessarily because Facebook was uniquely evil, but because it marked the beginning of consolidation. Fewer independent rooms. Fewer niche communities with their own culture. More people funnelled into the same centralised ecosystem, following the same rules, shaped by the same interface.

Ironically, some of those “smaller rooms” did come back again later, but as part of the Facebook ecosystem itself. Things like Facebook Groups recreated the idea of smaller interest-based spaces, but the difference was that they weren’t independent anymore. They existed inside a single corporate environment with its own priorities, its own architecture, and its own invisible incentives.

The early internet felt like a collection of places you could choose to visit.

Facebook felt like a place you were expected to live.


Next: When Connection Became Content

This is where the timeline becomes less nostalgic and more unsettling. Because once social networking became centralised, and online identity became something you maintained publicly, the next step was almost inevitable.

In Part III, I’m going to explore what happened after this shift accelerated: the rise of memes and compressed communication, the arrival of image-based platforms like Instagram, the pressure on artists to become brands, and the moment “going viral” quietly transformed from an accident into a goal.

Why I Don’t Use Social Media, Part I: The Quiet Exit

A retro computer desk in a dark room with an open glowing doorway, cyan and magenta light spilling into mist, and faint notification icons drifting away across a web-like grid background.

For years now, I’ve watched the same social media ritual play out again and again.

Someone decides they’ve had enough. Social media is making them miserable, exhausting them, dragging them into drama, or pulling them into patterns that don’t feel healthy anymore. They write a long post explaining why they’re leaving, disable their accounts, and disappear.

Then, sooner or later, they come back.

Sometimes they quietly reactivate their old profile. Sometimes they make a new one and call it a fresh start. Sometimes they re-emerge without comment, as if nothing ever happened. It’s such a familiar pattern that it’s become a cultural joke.

I’m not judging those people. I understand it. Social media can genuinely damage your wellbeing, and sometimes a dramatic exit is the only way someone can break out of a cycle that feels addictive or emotionally chaotic.

But the reason I’ve titled this series Why I Don’t Use Social Media is partly because it echoes that familiar pattern in a slightly post-ironic way. The difference is that my own story doesn’t follow that pattern at all.

I didn’t rage-quit. I didn’t announce my departure. I didn’t even delete my accounts.

I simply stopped logging on.


The Quiet Exit

If you ask most people what it means to “leave social media”, they picture something dramatic. Deleting apps. Deactivating accounts. Making a statement. Cutting people off. A digital detox that looks like a breakup.

For me, it wasn’t like that. I didn’t burn anything down, and I didn’t even choose a symbolic “last post”. I just stopped feeding it. I stopped posting. I stopped checking. I stopped scrolling. I unsubscribed from notifications, and stopped treating the feed like it had any claim on my attention.

At first, it didn’t feel like much. There was no cinematic moment of liberation. It was simply quieter, and then quieter again. The social urgency began to fade, and the pressure to keep up, stay visible, react, and maintain presence slowly lost its grip.


Feeds Aren’t Friendships

One of the strangest things about stepping away from social media is what it reveals about relationships.

A feed can create the illusion of connection. You know what people are doing, what they’re thinking, who they’re dating, what they ate, what they’re outraged about this week, what they’re celebrating, and what version of themselves they’re presenting. Even if you haven’t spoken to someone directly in months or years, the steady drip of updates can make it feel like they’re still part of your life.

But once you stop logging on, you realise how much of that closeness was simulated.

A lot of what we call “friendship” online is really just passive proximity. It’s not a relationship, it’s a feed. It’s information that creates the sensation of connection without the substance of it.

When that stream disappears, you see who still exists in your life outside the platform. The people that matter still show up. The people that matter still message you. The people that matter still tell you what’s going on, directly, in their own words, without filtering it through a public performance.

I don’t need to see what people are up to on a feed. The people that matter tell me in person.


Connection vs Performance

Social media doesn’t only host social life. It trains it. The longer you stay in it, the more it reshapes your instincts about what social contact is supposed to look like.

It starts out as communication, but over time it becomes presence management. You’re not just living, you’re maintaining visibility. You’re not just sharing, you’re curating. Even if you’re not actively posting, you’re still living under the logic of an invisible audience.

There’s a psychological weight that comes with being constantly “seen”, or at least constantly available to be seen. For me, that weight never felt natural. It felt like a constant demand to be interpretable, presentable, and legible in a way that real life doesn’t require.

And this is the trap. Being seen is not the same thing as being known.

Logging off was the first time in years I realised how much of my social energy wasn’t being spent on relationships. It was being spent on being visible to the feed.


What I Gained (And What I Lost)

To be fair, stepping away from social media does come with trade-offs. You miss updates. You find out things later than everyone else. You lose that effortless sense of being “in the loop”. Sometimes you realise entire social circles now exist almost entirely inside an app, and if you’re not there, you are quietly absent.

Honestly, one of the biggest practical losses has been the steady stream of news about releases and gigs from my favourite artists. Social media is very good at keeping you informed, especially in music scenes where things can be underground or informal.

But even that loss hasn’t been a disaster. In a strange way, it’s reclaimed some intentionality. Instead of being told that something exists, often framed as urgent, limited, and time-sensitive, I now tend to stumble upon a new release when I’m actively looking an artist up. It turns buying music back into something deliberate rather than reactive.

The same applies to events. Rather than being trapped in a constant churn of one-off nights that must be attended, I look up what’s on as and when I actually feel like doing something, and when resources allow.

And overall, what I gained was more valuable. Life became quieter. Less urgent. Less noisy. I had more mental space to focus on what I actually wanted to do, instead of having my attention constantly pulled sideways into other people’s broadcasts. I didn’t feel like I had to keep up with the constant churn of opinions, jokes, arguments, and declarations that social media rewards.

Over time, I also noticed something else. I felt less invested in friendships that were never really friendships in the first place. Not because I became cold or isolated, but because I could see more clearly what was real and what was just platform proximity.

I still keep in touch with people. I still care about people. I just don’t need a constant stream of updates to feel connected. The people who matter, and the relationships that matter, survive just fine outside the algorithm.

I’m not here to tell anyone else what they should do. Some people genuinely thrive on social media. Some people find community through it, and I don’t want to dismiss that. But it’s worth asking whether the platform is serving you, or whether you’re serving the platform.

Because if social media is making you feel anxious, drained, pressured, or performative, you don’t necessarily need a dramatic exit. You don’t have to rage-quit. You don’t have to announce it. You can simply stop feeding it.

And you might be surprised by how much of your life returns when you stop living part of it for an invisible audience.


Part of the reason I stepped away from social media is that I remember an earlier version of the internet, one that felt less like a stage and more like a collection of rooms. In Part II, I’m going to revisit those spaces (chat rooms, forums, and the MySpace era) and trace the point where connection began to shift into something more performative.

Favours: The Quiet Currency Beyond Money

A landscape illustration showing large chess pieces in the foreground on a reflective chessboard, with scattered gold coins nearby. In the background, well-dressed figures stand in small groups on a balcony overlooking a glowing city at dusk. Above them, a web of glowing lines and points connects across the sky, suggesting networks of influence and strategy.

Favours and Scarcity

Favours are usually thought of as small social gestures. Informal acts of help that smooth everyday life. Someone helps you out, and at some point you return the gesture. At this level, favours appear simple, even innocent. But they are not just social niceties. Favours function as a form of currency, and like any currency, their meaning depends entirely on what is scarce.

For most people, money is scarce. Time and energy are scarce too. In that context, favours tend to operate at the level of convenience. Helping someone move house, covering a shift, fixing something, saving a bit of money or effort. These favours matter because they substitute for resources people do not have. They are practical, mutual, and usually grounded in necessity.


The Wealth Threshold

Beyond a certain threshold of wealth, however, this dynamic changes. When someone has enough money to remove inconvenience by default, convenience-based favours lose their value. Time can be bought. Labour can be hired. Problems can be outsourced. What was once helpful becomes irrelevant. A favour that merely saves effort or cost no longer carries weight.

This creates an asymmetry. A person with less wealth may still find such favours meaningful, while a person above the threshold has no need for them at all. From this point onward, favours stop being reciprocal in the ordinary sense. The familiar logic of “you scratch my back, I’ll scratch yours” begins to break down.


When Money Stops Working

Among those for whom money is no longer the limiting factor, favours do not disappear. Instead, they evolve. Their focus shifts away from convenience and towards things that money cannot reliably buy. Access to closed networks. Legitimacy in the eyes of the right people. Protection from scrutiny. Informal influence. Insider context. Strategic timing. Silence.

These favours are not about solving small problems. They shape outcomes. They influence which opportunities exist, which narratives take hold, and which consequences are softened or avoided altogether. This is no longer a social economy of help, but a power economy of positioning.

Money excels at purchasing goods and services, but it struggles in areas that are socially gated rather than commercially priced. Entry into certain rooms cannot be bought outright. Trust cannot be reliably purchased. Reputational legitimacy cannot be forced. Neither can immunity, discretion, or insider understanding. In these spaces, money loses its effectiveness, and favours take over as the dominant medium of exchange.


Favours as Strategy

At this level, favours resemble strategic moves rather than transactions. Money has a relatively fixed value. Favours have positional value. The comparison to chess is useful here. Pieces have nominal worth, but advantage matters more than price. A powerful actor may willingly sacrifice something expensive or visible if it secures a decisive future position. The value of a favour often lies not in immediate return, but in the obligation, alignment, or leverage it creates over time.


A Parallel Economy

What emerges is a quiet parallel economy operating alongside the monetary one. For most people, money governs survival. For the very wealthy, favours govern power. Money solves problems. Favours shape futures. Once money stops being scarce, favours take over as the currency that matters most, not because they are kinder or more human, but because they operate where money cannot.

Unseen Does Not Mean Unachieved

A lone figure stands on a rocky shoreline at dusk, looking across a calm lake at a small iceberg on the surface. Beneath the water, the iceberg extends far downward, vast and luminous, dwarfing what is visible above. Dark forests and distant mountains frame the quiet scene.

The Problem With How We Measure Success In Art

There is an assumption so deeply embedded in modern culture that it rarely gets questioned: that artistic success is proportional to visibility. If something is widely known, widely shared, or widely rewarded, we assume it must also be more accomplished, more skilled, or more meaningful.

At first glance, this seems reasonable. After all, how else are we meant to judge achievement?

But this assumption begins to unravel when we look more closely at what actually enables visibility.


Scale, Not Merit, Is The Great Divider

One of the biggest differences between globally recognised artists and those quietly working in obscurity is not necessarily talent, depth, or sincerity. It is scale.

Scale is enabled by money, time, infrastructure, and connections. These things allow work to be polished more quickly, distributed more widely, and sustained more reliably. They also allow technical skill to be developed faster, through access to better tools, education, mentorship, and collaborative environments.

This does not mean famous artists lack merit. Many clearly do not. But it does mean that merit is not the sole or even primary determinant of who gets seen.

Once scale is introduced, a feedback loop emerges. Money enables visibility. Visibility attracts more money. Momentum builds. At a certain point, success becomes self-sustaining.

From this perspective, it becomes clear that some people cross a threshold where recognition is no longer uncertain. They are not guaranteed depth or meaning, but they are largely guaranteed presence.


Money As Both Metric And Mechanism

There is a deeper structural problem here.

Money is treated as a measure of success, but it is also the main tool used to manufacture success. This makes it a self-influencing metric. The more of it you have, the more power you have to generate the appearance of success, which then justifies the metric itself.

In such a system, it is entirely possible for people to be effectively born successful, regardless of personal merit. Inherited wealth does not automatically produce shallow art, but it does remove many of the pressures that force confrontation with reality.

This matters, because confrontation often shapes meaning.


Struggle, Confrontation, And Depth

There is a reason so much resonant art emerges from difficulty. Struggle forces engagement with limits, loss, uncertainty, exclusion, and mortality. These experiences strip away abstraction and demand honesty.

That said, suffering alone does not create depth. What matters is integration. The ability to process experience consciously and transform it into form.

Some people suffer and are crushed by it. Others avoid it. Some repeat inherited patterns without reflection. Depth arises not from pain itself, but from how pain is metabolised.

Comfort, on the other hand, can insulate. It can make confrontation optional. This often leads to work that is technically competent and aesthetically pleasing, but emotionally thin.

Not always. But often enough to matter.


A Simple Conceptual Model

We might think of artistic impact as shaped by four interacting factors:

  • Resources: money, time, tools, education, networks
  • Confrontation: direct engagement with reality and limitation
  • Integration: the ability to process and transform experience
  • Scale: reach and visibility

Depth emerges primarily from confrontation combined with integration. Scale emerges primarily from resources combined with gatekeeping.

These are not the same axis.

This is why depth and visibility so often fail to correlate.
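
To make that concrete, here is a toy sketch of the model in code. The scores are invented and the multiplications are only one plausible way to combine the factors, but the structural point survives: depth and scale are computed from disjoint inputs, so nothing forces them to move together.

    # A toy rendering of the four-factor model above. The numbers are
    # arbitrary 0-to-1 scores, not measurements. "Gatekeeping" stands in
    # for the filters (economic, algorithmic, cultural) that grant reach.
    from dataclasses import dataclass

    @dataclass
    class Work:
        resources: float      # money, time, tools, education, networks
        confrontation: float  # direct engagement with reality and limitation
        integration: float    # ability to process and transform experience
        gatekeeping: float    # compatibility with the filters that grant reach

    def depth(w: Work) -> float:
        return w.confrontation * w.integration

    def scale(w: Work) -> float:
        return w.resources * w.gatekeeping

    unseen = Work(resources=0.1, confrontation=0.9, integration=0.9, gatekeeping=0.1)
    famous = Work(resources=0.9, confrontation=0.2, integration=0.3, gatekeeping=0.9)

    print(depth(unseen), scale(unseen))  # high depth, negligible scale
    print(depth(famous), scale(famous))  # modest depth, large scale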


The Availability Bias We Rarely Acknowledge

Here is the point that quietly undermines almost all self-comparison.

We cannot imagine the full extent of deep, meaningful art that exists in the world, because most of it is unseen.

We only ever encounter work that has passed through multiple filters: economic viability, algorithmic compatibility, cultural timing, geography, survivorship. What we perceive as “the best” is simply what survived these filters, not what was most profound.

This creates a massive availability bias. We judge our own work against a tiny, distorted sample set, unaware of the vast submerged mass beneath it.

Statistically, it is almost certain that some of the most meaningful art ever created was never widely seen, never validated, and never preserved. Not because it lacked value, but because visibility is not awarded for depth.
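
If that sounds rhetorical, it falls out of simple arithmetic. A small simulation (invented numbers, purely illustrative) shows what happens when depth and visibility are drawn independently and we only ever encounter the most visible sliver:

    # Availability bias in miniature: 100,000 works, each with an
    # independent random depth and visibility. We "see" only the 100
    # most visible, then ask where the deepest work actually lives.
    import random

    random.seed(42)
    works = [(random.random(), random.random()) for _ in range(100_000)]

    visible = sorted(works, key=lambda w: w[1], reverse=True)[:100]

    print(f"deepest work we ever saw: {max(d for d, _ in visible):.4f}")
    print(f"deepest work that exists: {max(d for d, _ in works):.4f}")

    # With a 0.1% visibility window, the chance that the single deepest
    # work happens to be among the visible few is about 0.1%.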


The Grounding Thought

When an unseen artist compares themselves to a famous one, they are not comparing like with like.

They are comparing their entire internal reality to someone else’s externally amplified residue.

Visibility is not evidence of superiority. In many cases, it is evidence of compatibility with a system that does not optimise for meaning.

This does not mean recognition is worthless. It does mean it is a poor proxy for depth.

If you are creating work that feels honest, integrated, and necessary to you, then something real is happening, regardless of scale. You are not failing a fair test. You are simply operating outside a system that was never designed to surface what you value.

And there are far more people in that position than you will ever be able to see.

The Move to Self-Hosting

A laptop open on a wooden desk beside a notebook, pen, coffee mug, and potted plants, lit by soft daylight from a window.

Why I Moved to a Self-Hosted Site

This site represents a quiet but deliberate change.

After spending a long time publishing on a WordPress-hosted site, I have moved An Alternative Perspective to a self-hosted home. This was not a rage-quit, and it was not driven by trends, growth hacks, or monetisation plans. It was a values-driven decision, one that became increasingly obvious the more I paid attention to how I actually use the internet, and how I want my writing to exist within it.

This post is a short explanation of why I made the move, what it means for me, what it means for you as a reader, and what I have noticed since making the change.


Ownership, Not “Access”

On hosted platforms, you are never really publishing your site. You are borrowing space inside someone else’s system, governed by rules, incentives, and design choices that are not yours.

Self-hosting changes that relationship.

This site is now:

  • Fully under my control
  • Structurally simple
  • Free from injected ads, algorithmic nudges, and platform-level priorities

That does not mean it is perfect. It means it is honest. The site exists to host writing, nothing more and nothing less.
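
Stripped to its essence, that is all self-hosting is: pages on a machine you control, served on your own terms. I won’t detail this site’s actual stack here, but as a toy illustration of the principle, Python’s standard library can serve a folder of pages in two lines:

    # A minimal sketch of the principle (not this site's actual setup):
    # serve the files in the current directory, and nothing else.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler).serve_forever()

No feed, no tracking, no platform sitting between the writing and the reader.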


Benefits for Readers (Not Just for Me)

While this move benefits me technically and philosophically, it also improves the experience for anyone reading here.

A calmer reading environment

No pop-ups. No recommended content traps. No attention funnels.
Just the article you chose to read.

Consistency and stability

Posts will not disappear due to policy changes, account flags, or shifting platform priorities. Links will remain valid. Archives will stay intact.

Clear intent

You are not being profiled, nudged, or measured for engagement value. You are here because you chose to be, and that matters.


The Joy of Building Something Slowly

One unexpected benefit of this move has been how enjoyable the process was.

Setting this site up reminded me of an earlier relationship with technology, one based on curiosity, tinkering, and understanding how things actually work, rather than clicking through opaque interfaces.

Adjusting layouts, trimming excess, and learning what I actually needed, and what I did not, felt grounding rather than stressful. Each decision had a clear purpose. Each change had a visible effect.

It felt like building a place, not configuring a product.


What I Have Noticed Since the Move

A few things became immediately clear:

  • The site feels calmer
  • The writing feels more intentional
  • I feel less pressure to perform
  • The structure encourages depth over speed

There is also a psychological shift. When you remove metrics, prompts, and algorithmic framing, you are left with a simple question:

“Is this worth writing?”

That is a good question to be left alone with.


What This Site Is (and Is Not)

This site is not optimised for virality.
It is not designed to chase trends.
It is not interested in growth for growth’s sake.

It is a space for:

  • Slow thinking
  • Systemic critique
  • Personal reflection
  • Writing that does not fit neatly into platforms

If you are reading this, you are already the intended audience.


A Quiet Commitment

Moving to self-hosting is not a rejection of the wider internet. It is a commitment to using it deliberately.

This site will continue to evolve, but its core purpose is fixed: to host writing with clarity, autonomy, and respect for the reader.

Thank you for being here, and for choosing to read, rather than scroll.

From Golden Age to Dark Age

A wooden desk displaying a mix of vintage and modern technology, including an old CRT computer, floppy disks, and a rotary phone alongside a laptop, smartphone, tablet, and smartwatch, all lit by soft natural light.

Reframing the “Dark Ages” of Computing

Early eras of personal computing are often described as primitive or limited. Slow processors, minimal memory, long loading times, and fragile storage media are usually framed as obstacles that progress needed to overcome. From this perspective, those years are treated as a kind of dark age that modern technology has thankfully moved beyond.

But that framing depends on what we choose to measure.

If progress is judged only by speed, capacity, and convenience, then early computing does indeed look crude. If it is judged by clarity of purpose, user agency, and the quality of attention it encouraged, the picture looks very different.

In many important ways, that period functioned as a golden age. Not because the machines were powerful, but because the relationship between people and computers was simpler, more honest, and more intentional. The limitations were visible and understood. Nothing pretended to be effortless or invisible.

This is not an argument that technology should have remained static. It is an attempt to notice what was present then that has since faded. What values were embedded in those systems by necessity, and what was quietly lost as those constraints disappeared.

Reframing that era is not about glorifying the past. It is about recognising that some forms of progress replace one set of problems with another, and that not all losses are immediately obvious when gains arrive quickly.


When Computing Required Commitment

I grew up in the 1980s, and my earliest experiences of computing were shaped by machines such as the Sinclair ZX Spectrum and the Commodore 64. Later came the Sam Coupé, and eventually, disk-based systems like the Commodore Amiga and a 486 PC. These were not just different machines. They embodied a very different relationship with software.

Before hard drives and persistent storage became standard, using a computer meant beginning from nothing each time. Software was loaded directly into memory, and once the power was turned off, it vanished.

Every session started with a decision.

For tape-based systems in particular, loading software was a commitment. You pressed play on a cassette recorder, watched the screen, and waited. Sometimes it worked. Sometimes it did not. Either way, the act demanded patience and intention. You did not casually load something just to see what it was like.

That waiting time mattered. It created a clear boundary between curiosity and action. You chose one thing and settled into it. Once loading began, you had already invested enough time that abandoning it halfway felt wasteful.

Scarcity reinforced this focus. Limited memory and storage meant fewer options, but those limits encouraged depth rather than frustration. You stayed with what you had chosen, explored it thoroughly, and made the most of the time and effort already spent.

These constraints were not designed to cultivate attention, but they did so naturally. Commitment was embedded in the process. Using software meant deciding, waiting, and then being present with the result.


Ephemerality and Presence

The impermanence of early computing did more than limit convenience. It defined the boundaries of the relationship between user and machine.

When software existed only for the duration of a session, it made no claim beyond that moment. The computer did not linger in the background. It did not accumulate expectations, track behaviour, or demand return. When the session ended, the relationship ended with it.

This created a sense of containment that is largely absent today. Computing happened in discrete blocks of time. There was a beginning, a middle, and an end. Attention could fully enter the activity, knowing it would also be allowed to fully leave.

Because nothing persisted by default, presence came naturally. The machine waited. It responded when asked, and remained silent when it was not. Engagement was shaped by intention rather than interruption.

This ephemerality placed the user firmly in control of the terms of engagement. The computer existed when invited, and disappeared when dismissed. That simple boundary made focus easier, not because users were more disciplined, but because the system itself respected closure.


The Shift Toward Extraction

As computing moved toward persistent storage, constant connectivity, and integrated platforms, the relationship between users and machines began to change. Software no longer existed only within the boundaries of a session. It stayed. It remembered. It updated itself. It waited in the background.

At first, this persistence was a genuine improvement. Work could be saved. Progress could be resumed. Systems became more capable and flexible. But alongside these gains came a subtle shift in expectations.

Software began to assume continuity. Applications no longer waited quietly to be used. They checked in. They notified. They synchronised. Over time, they developed a presence even when they were not actively engaged.

With connectivity came new forms of value extraction. Attention, behaviour, and usage patterns became measurable and profitable. The user was no longer just someone operating a machine, but a source of data and engagement.

This did not happen all at once, and it was rarely framed as exploitation. It arrived under the language of convenience, personalisation, and improvement. But the effect was cumulative. The computer stopped being a bounded tool and started becoming an environment that made ongoing claims on its user.


Why This Is Not Nostalgia

It is easy to dismiss reflections like this as nostalgia. A longing for simpler machines, slower systems, or a time when technology felt more manageable. But nostalgia implies a desire to return, and that is not what this is about.

The limitations of early computing were real. Systems were fragile, slow, and often frustrating. Modern technology has solved many genuine problems, and few people would seriously argue that we should abandon those gains.

The point is not that the past was better in every way. It is that progress is not purely additive. Every gain introduces trade-offs, and some of those trade-offs are cultural rather than technical.

Alongside shifts in agency and attention, there has also been a quiet move from ownership to access. Software that was once purchased, possessed, and used on clearly defined terms is increasingly something we are granted access to under conditions that can change. This is not inherently wrong, but it alters the balance of power, and it reshapes the relationship between user and tool.

What has been lost is not processing power or convenience, but clarity of relationship. Earlier systems made their limits obvious. Modern systems often hide theirs. Control, persistence, and extraction are woven into the background rather than presented as choices.

Noticing this loss is not resistance to change. It is awareness of cost. It comes from having lived through the transition, and being able to compare what was gained with what quietly disappeared along the way.

Recognising that difference is not about rejecting modern technology. It is about refusing to pretend that nothing was lost when everything became easier.


What the “Dark Age” Really Is

If the early years of personal computing could be seen as a dark age, it is worth asking what that phrase really means. Darkness is not simply a lack of capability. It is a lack of clarity, agency, or visibility.

By that measure, the present moment fits the term more closely than the past.

Modern technology is powerful, fast, and seamless, but it often operates opaquely. Systems persist in the background. Decisions are made automatically. Data is collected continuously. Attention is assumed rather than requested. Much of this happens without clear boundaries or explicit consent.

The darkness lies in this diffusion. When tools no longer have clear edges, it becomes harder to see where responsibility sits, or where choice begins and ends. Convenience smooths over these questions, making it easy to forget that trade-offs are being made on our behalf.

In earlier systems, limits were visible. You knew when a program was running. You knew when it stopped. You knew what belonged to you and what did not. Today, those lines are blurred, not because technology demands it, but because blurring them benefits extractive models.

The true dark age is not defined by slow machines or limited memory. It is defined by the quiet erosion of sovereignty, where users are present inside systems that no longer clearly serve them.


Carrying the Memory Forward

Remembering earlier eras of computing is not about trying to return to them. Those systems belonged to their time, shaped by constraints that no longer exist in the same way. What matters is not the hardware, but the values those constraints made visible.

Lived experience of that period provides a reference point. It shows that computing can exist without constant extraction, without persistent surveillance, and without assuming entitlement to attention. It reminds us that tools can wait, that sessions can end cleanly, and that users can remain clearly in control of the terms of engagement.

Carrying that memory forward allows for discernment. It helps separate what is genuinely beneficial from what is merely convenient. It creates space to question whether newer systems are serving human needs, or quietly reshaping them.

This is not a call to reject modern technology, but to use it consciously. To value boundaries, intentionality, and clarity in our relationship with machines. The past does not need to be recreated for its lessons to matter.

The golden age was never about limitation. It was about respect. Remembering that makes it possible to imagine a future where technology empowers rather than extracts.

Are You Paying Attention?

A person sitting alone in a dimly lit room looks down at a glowing smartphone, while colourful digital notification icons float around them, illuminating the space.

A Phrase We Rarely Question

“Pay attention” is a phrase we use without thinking. It is something teachers say to students, something parents say to children, and something we say to ourselves when we feel distracted. The words are so familiar that their meaning feels fixed and harmless.

And yet, the phrase is quietly revealing.

To pay attention is to spend something. It implies cost, value, and limitation. We understand intuitively that attention is not infinite, and that directing it toward one thing means withdrawing it from another. Even before the digital age, the language reflected this truth.

What has changed is not the phrase, but its accuracy.

In an environment where attention is actively competed for, measured, and monetised, “pay attention” has become less of a metaphor and more of a description. Attention now functions as a form of currency, exchanged constantly and often without conscious agreement.

This raises an uncomfortable question. If attention is something we are always paying, who or what is collecting it, and at what cost?


Attention Is How Meaning Forms

Attention is not just about noticing things. It is the mechanism through which meaning forms. What we attend to, and how long we attend to it, shapes what we consider valuable, significant, or true.

Meaning does not emerge instantly. It requires continuity. Staying with a thought, an idea, a piece of work, or another person long enough for understanding to develop. Without that sustained focus, experiences remain shallow and easily replaced.

This is why distraction is not neutral. When attention is repeatedly broken, meaning does not have the chance to consolidate. Things are seen, but not held. Information is encountered, but not integrated.

In this sense, attention functions as a kind of soil. It is the environment in which ideas take root and grow. When that soil is constantly disturbed, depth becomes difficult regardless of intention or intelligence.

If attention is fragmented by default, then the difficulty many people experience in finding meaning is not a personal failure. It is a consequence of the conditions in which attention now exists.


From Attention Given to Attention Taken

For much of human history, attention was something we chose to give. It was directed by interest, necessity, or care. While it could be influenced, it was not systematically harvested.

That balance has shifted.

Many modern systems are designed not merely to present information, but to capture and hold attention for as long as possible. Success is measured in engagement, time spent, and frequency of return. In this environment, attention is no longer a byproduct of value. It is the primary resource being extracted.

This changes the relationship between people and the things they interact with. Attention is assumed rather than requested. Notifications, alerts, and prompts arrive uninvited, each claiming urgency regardless of importance.

What was once offered becomes taken. The default state is no longer calm availability, but constant interruption. Attention is pulled outward, again and again, often before a person has decided where it should go.

This shift is subtle because it rarely feels coercive. Instead, it feels like participation. But when attention is engineered to be captured, the line between choice and compliance begins to blur.


Fragmentation by Design

The constant fragmentation of attention is often treated as an unfortunate side effect of modern life. Something to be managed through personal discipline, productivity tools, or better habits. But much of this fragmentation is not accidental.

Systems that rely on attention as a resource benefit from keeping it mobile. Short bursts of focus are easier to redirect than sustained concentration. Interruptions reset the mental field, making it easier to introduce the next prompt, the next suggestion, the next claim on attention.

In this environment, depth becomes inefficient. Lingering too long on one thing reduces exposure to others. From the perspective of extractive systems, sustained focus is not a virtue, but a liability.

As a result, experiences are shaped to encourage frequent checking rather than prolonged engagement. Notifications interrupt thought. Feeds refresh endlessly. Tasks are broken into fragments that never quite resolve. The mind is kept busy, but rarely settled.

This has consequences. When attention is constantly divided, it becomes harder to remain present with anything long enough for it to matter. Not because people are incapable of focus, but because the surrounding conditions are actively working against it.

Fragmentation is not a personal failing. It is an environment carefully tuned to prevent attention from ever fully coming to rest.


Paying With the Faculty That Creates Meaning

There is a deeper cost to attention becoming a currency.

We are paying with the very faculty that allows us to decide what matters in the first place. Attention is not just something we give to meaningful things. It is how meaning is recognised and sustained at all.

When attention is continuously diverted or depleted, the ability to form clear values weakens. Everything competes at the same level. Importance becomes harder to distinguish from noise, not because nothing matters, but because nothing is allowed to matter for long.

This produces a particular kind of exhaustion. Not simple tiredness, but the fatigue of constant partial engagement. The sense of being busy without being fulfilled.

The tragedy is not that attention is spent, but that it is spent so freely and so constantly that it undermines the very process by which meaning forms.


Choosing Where Attention Goes

Despite all of this, attention is not lost entirely. It is pressured, competed for, and frequently diverted, but it can still be reclaimed.

Choosing where attention goes is one of the few remaining acts of agency that does not require permission. It does not depend on platforms, products, or approval. It begins with noticing what is asking for attention, and deciding whether it deserves it.

This does not require total withdrawal or rigid control. It requires selectivity. Allowing fewer things in. Letting some moments remain uninterrupted. Creating spaces where attention can settle rather than scatter.

In this sense, paying attention can become a conscious act again. Not a reflexive response to prompts and demands, but a deliberate investment. A way of saying that some things are worth time, presence, and care, and others are not.

Attention is finite. Where it goes shapes what grows. Treating it as valuable is not a luxury. It is a necessity for meaning in an environment designed to dissolve it.