There is an assumption so deeply embedded in modern culture that it rarely gets questioned: that artistic success is proportional to visibility. If something is widely known, widely shared, or widely rewarded, we assume it must also be more accomplished, more skilled, or more meaningful.
At first glance, this seems reasonable. After all, how else are we meant to judge achievement?
But this assumption begins to unravel when we look more closely at what actually enables visibility.
Scale, Not Merit, Is The Great Divider
One of the biggest differences between globally recognised artists and those quietly working in obscurity is not necessarily talent, depth, or sincerity. It is scale.
Scale is enabled by money, time, infrastructure, and connections. These things allow work to be polished more quickly, distributed more widely, and sustained more reliably. They also allow technical skill to be developed faster, through access to better tools, education, mentorship, and collaborative environments.
This does not mean famous artists lack merit. Many clearly do not. But it does mean that merit is not the sole or even primary determinant of who gets seen.
Once scale is introduced, a feedback loop emerges. Money enables visibility. Visibility attracts more money. Momentum builds. At a certain point, success becomes self-sustaining.
From this perspective, it becomes clear that some people cross a threshold where recognition is no longer uncertain. They are not guaranteed depth or meaning, but they are largely guaranteed presence.
Money As Both Metric And Mechanism
There is a deeper structural problem here.
Money is treated as a measure of success, but it is also the main tool used to manufacture success. This makes it a self-reinforcing metric. The more of it you have, the more power you have to generate the appearance of success, which then justifies the metric itself.
In such a system, it is entirely possible for people to be effectively born successful, regardless of personal merit. Inherited wealth does not automatically produce shallow art, but it does remove many of the pressures that force confrontation with reality.
This matters, because confrontation often shapes meaning.
Struggle, Confrontation, And Depth
There is a reason so much resonant art emerges from difficulty. Struggle forces engagement with limits, loss, uncertainty, exclusion, and mortality. These experiences strip away abstraction and demand honesty.
That said, suffering alone does not create depth. What matters is integration: the ability to process experience consciously and transform it into form.
Some people suffer and are crushed by it. Others avoid it. Some repeat inherited patterns without reflection. Depth arises not from pain itself, but from how pain is metabolised.
Comfort, on the other hand, can insulate. It can make confrontation optional. This often leads to work that is technically competent and aesthetically pleasing, but emotionally thin.
Not always. But often enough to matter.
A Simple Conceptual Model
We might think of artistic impact as shaped by four interacting factors:
Resources: money, time, tools, education, networks
Confrontation: direct engagement with reality and limitation
Integration: the ability to process and transform experience
Scale: reach and visibility
Depth emerges primarily from confrontation combined with integration. Scale emerges primarily from resources combined with gatekeeping.
These are not the same axis.
This is why depth and visibility so often fail to correlate.
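These combination rules can be made concrete with a deliberately crude sketch. The numbers and function shapes below are invented purely for illustration, nothing here is measured or claimed as a real model; the only point is that depth and scale are computed from different inputs, so one can be high while the other is low:

```python
from dataclasses import dataclass

@dataclass
class Artist:
    resources: float      # money, time, tools, education, networks (0 to 1)
    confrontation: float  # direct engagement with reality and limitation (0 to 1)
    integration: float    # ability to process and transform experience (0 to 1)

def depth(a: Artist) -> float:
    # Depth requires both confrontation and integration;
    # neither alone is sufficient, so they multiply.
    return a.confrontation * a.integration

def scale(a: Artist, gatekeeping: float = 0.5) -> float:
    # Scale is resources passing through gatekeeping filters.
    # Note that depth does not appear anywhere in this function.
    return a.resources * gatekeeping

# A well-resourced but insulated artist versus an unseen, integrated one:
famous = Artist(resources=0.9, confrontation=0.2, integration=0.5)
unseen = Artist(resources=0.1, confrontation=0.9, integration=0.8)

assert depth(unseen) > depth(famous)   # depth: 0.72 vs 0.1
assert scale(famous) > scale(unseen)   # scale: 0.45 vs 0.05
```

The two functions share no inputs, which is the whole argument in miniature: nothing in the system connects the score that determines visibility to the score that determines meaning.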
The Availability Bias We Rarely Acknowledge
Here is the point that quietly undermines almost all self-comparison.
We cannot imagine the full extent of deep, meaningful art that exists in the world, because most of it is unseen.
We only ever encounter work that has passed through multiple filters: economic viability, algorithmic compatibility, cultural timing, geography, survivorship. What we perceive as “the best” is simply what survived these filters, not what was most profound.
This creates a massive availability bias. We judge our own work against a tiny, distorted sample set, unaware of the vast submerged mass beneath it.
Statistically, it is almost certain that some of the most meaningful art ever created was never widely seen, never validated, and never preserved. Not because it lacked value, but because visibility is not awarded for depth.
The Grounding Thought
When an unseen artist compares themselves to a famous one, they are not comparing like with like.
They are comparing their entire internal reality to someone else’s externally amplified residue.
Visibility is not evidence of superiority. In many cases, it is evidence of compatibility with a system that does not optimise for meaning.
This does not mean recognition is worthless. It does mean it is a poor proxy for depth.
If you are creating work that feels honest, integrated, and necessary to you, then something real is happening, regardless of scale. You are not failing a fair test. You are simply operating outside a system that was never designed to surface what you value.
And there are far more people in that position than you will ever be able to see.
This site represents a quiet but deliberate change.
After spending a long time publishing on a WordPress-hosted site, I have moved An Alternative Perspective to a self-hosted home. This was not a rage-quit, and it was not driven by trends, growth hacks, or monetisation plans. It was a values-driven decision, one that became increasingly obvious the more I paid attention to how I actually use the internet, and how I want my writing to exist within it.
This post is a short explanation of why I made the move, what it means for me, what it means for you as a reader, and what I have noticed since making the change.
Ownership, Not “Access”
On hosted platforms, you are never really publishing your site. You are borrowing space inside someone else’s system, governed by rules, incentives, and design choices that are not yours.
Self-hosting changes that relationship.
This site is now:
Fully under my control
Structurally simple
Free from injected ads, algorithmic nudges, and platform-level priorities
That does not mean it is perfect. It means it is honest. The site exists to host writing, nothing more and nothing less.
Benefits for Readers (Not Just for Me)
While this move benefits me technically and philosophically, it also improves the experience for anyone reading here.
A calmer reading environment
No pop-ups. No recommended content traps. No attention funnels. Just the article you chose to read.
Consistency and stability
Posts will not disappear due to policy changes, account flags, or shifting platform priorities. Links will remain valid. Archives will stay intact.
Clear intent
You are not being profiled, nudged, or measured for engagement value. You are here because you chose to be, and that matters.
The Joy of Building Something Slowly
One unexpected benefit of this move has been how enjoyable the process was.
Setting this site up reminded me of an earlier relationship with technology, one based on curiosity, tinkering, and understanding how things actually work, rather than clicking through opaque interfaces.
Adjusting layouts, trimming excess, and learning what I actually needed, and what I did not, felt grounding rather than stressful. Each decision had a clear purpose. Each change had a visible effect.
It felt like building a place, not configuring a product.
What I Have Noticed Since the Move
A few things became immediately clear:
The site feels calmer
The writing feels more intentional
I feel less pressure to perform
The structure encourages depth over speed
There is also a psychological shift. When you remove metrics, prompts, and algorithmic framing, you are left with a simple question:
“Is this worth writing?”
That is a good question to be left alone with.
What This Site Is (and Is Not)
This site is not optimised for virality. It is not designed to chase trends. It is not interested in growth for growth’s sake.
It is a space for:
Slow thinking
Systemic critique
Personal reflection
Writing that does not fit neatly into platforms
If you are reading this, you are already the intended audience.
A Quiet Commitment
Moving to self-hosting is not a rejection of the wider internet. It is a commitment to using it deliberately.
This site will continue to evolve, but its core purpose is fixed: to host writing with clarity, autonomy, and respect for the reader.
Thank you for being here, and for choosing to read, rather than scroll.
Early eras of personal computing are often described as primitive or limited. Slow processors, minimal memory, long loading times, and fragile storage media are usually framed as obstacles that progress needed to overcome. From this perspective, those years are treated as a kind of dark age that modern technology has thankfully moved beyond.
But that framing depends on what we choose to measure.
If progress is judged only by speed, capacity, and convenience, then early computing does indeed look crude. If it is judged by clarity of purpose, user agency, and the quality of attention it encouraged, the picture looks very different.
In many important ways, that period functioned as a golden age. Not because the machines were powerful, but because the relationship between people and computers was simpler, more honest, and more intentional. The limitations were visible and understood. Nothing pretended to be effortless or invisible.
This is not an argument that technology should have remained static. It is an attempt to notice what was present then that has since faded. What values were embedded in those systems by necessity, and what was quietly lost as those constraints disappeared.
Reframing that era is not about glorifying the past. It is about recognising that some forms of progress replace one set of problems with another, and that not all losses are immediately obvious when gains arrive quickly.
When Computing Required Commitment
I grew up in the 1980s, and my earliest experiences of computing were shaped by machines such as the Sinclair ZX Spectrum and the Commodore 64. Later came the Sam Coupé, and eventually disk-based systems like the Commodore Amiga and a 486 PC. These were not just different machines. They embodied a very different relationship with software.
Before hard drives and persistent storage became standard, using a computer meant beginning from nothing each time. Software was loaded directly into memory, and once the power was turned off, it vanished.
Every session started with a decision.
For tape-based systems in particular, loading software was a commitment. You pressed play on a cassette recorder, watched the screen, and waited. Sometimes it worked. Sometimes it did not. Either way, the act demanded patience and intention. You did not casually load something just to see what it was like.
That waiting time mattered. It created a clear boundary between curiosity and action. You chose one thing and settled into it. Once loading began, you had already invested enough time that abandoning it halfway felt wasteful.
Scarcity reinforced this focus. Limited memory and storage meant fewer options, but those limits encouraged depth rather than frustration. You stayed with what you had chosen, explored it thoroughly, and made the most of the time and effort already spent.
These constraints were not designed to cultivate attention, but they did so naturally. Commitment was embedded in the process. Using software meant deciding, waiting, and then being present with the result.
Ephemerality and Presence
The impermanence of early computing did more than limit convenience. It defined the boundaries of the relationship between user and machine.
When software existed only for the duration of a session, it made no claim beyond that moment. The computer did not linger in the background. It did not accumulate expectations, track behaviour, or demand return. When the session ended, the relationship ended with it.
This created a sense of containment that is largely absent today. Computing happened in discrete blocks of time. There was a beginning, a middle, and an end. Attention could fully enter the activity, knowing it would also be allowed to fully leave.
Because nothing persisted by default, presence came naturally. The machine waited. It responded when asked, and remained silent when it was not. Engagement was shaped by intention rather than interruption.
This ephemerality placed the user firmly in control of the terms of engagement. The computer existed when invited, and disappeared when dismissed. That simple boundary made focus easier, not because users were more disciplined, but because the system itself respected closure.
The Shift Toward Extraction
As computing moved toward persistent storage, constant connectivity, and integrated platforms, the relationship between users and machines began to change. Software no longer existed only within the boundaries of a session. It stayed. It remembered. It updated itself. It waited in the background.
At first, this persistence was a genuine improvement. Work could be saved. Progress could be resumed. Systems became more capable and flexible. But alongside these gains came a subtle shift in expectations.
Software began to assume continuity. Applications no longer waited quietly to be used. They checked in. They notified. They synchronised. Over time, they developed a presence even when they were not actively engaged.
With connectivity came new forms of value extraction. Attention, behaviour, and usage patterns became measurable and profitable. The user was no longer just someone operating a machine, but a source of data and engagement.
This did not happen all at once, and it was rarely framed as exploitation. It arrived under the language of convenience, personalisation, and improvement. But the effect was cumulative. The computer stopped being a bounded tool and started becoming an environment that made ongoing claims on its user.
Why This Is Not Nostalgia
It is easy to dismiss reflections like this as nostalgia. A longing for simpler machines, slower systems, or a time when technology felt more manageable. But nostalgia implies a desire to return, and that is not what this is about.
The limitations of early computing were real. Systems were fragile, slow, and often frustrating. Modern technology has solved many genuine problems, and few people would seriously argue that we should abandon those gains.
The point is not that the past was better in every way. It is that progress is not purely additive. Every gain introduces trade-offs, and some of those trade-offs are cultural rather than technical.
Alongside shifts in agency and attention, there has also been a quiet move from ownership to access. Software that was once purchased, possessed, and used on clearly defined terms is increasingly something we are granted access to under conditions that can change. This is not inherently wrong, but it alters the balance of power, and it reshapes the relationship between user and tool.
What has been lost is not processing power or convenience, but clarity of relationship. Earlier systems made their limits obvious. Modern systems often hide theirs. Control, persistence, and extraction are woven into the background rather than presented as choices.
Noticing this loss is not resistance to change. It is awareness of cost. It comes from having lived through the transition, and being able to compare what was gained with what quietly disappeared along the way.
Recognising that difference is not about rejecting modern technology. It is about refusing to pretend that nothing was lost when everything became easier.
What the “Dark Age” Really Is
If the early years of personal computing could be seen as a dark age, it is worth asking what that phrase really means. Darkness is not simply a lack of capability. It is a lack of clarity, agency, or visibility.
By that measure, the present moment fits the term more closely than the past.
Modern technology is powerful, fast, and seamless, but it often operates opaquely. Systems persist in the background. Decisions are made automatically. Data is collected continuously. Attention is assumed rather than requested. Much of this happens without clear boundaries or explicit consent.
The darkness lies in this diffusion. When tools no longer have clear edges, it becomes harder to see where responsibility sits, or where choice begins and ends. Convenience smooths over these questions, making it easy to forget that trade-offs are being made on our behalf.
In earlier systems, limits were visible. You knew when a program was running. You knew when it stopped. You knew what belonged to you and what did not. Today, those lines are blurred, not because technology demands it, but because blurring them benefits extractive models.
The true dark age is not defined by slow machines or limited memory. It is defined by the quiet erosion of sovereignty, where users are present inside systems that no longer clearly serve them.
Carrying the Memory Forward
Remembering earlier eras of computing is not about trying to return to them. Those systems belonged to their time, shaped by constraints that no longer exist in the same way. What matters is not the hardware, but the values those constraints made visible.
Lived experience of that period provides a reference point. It shows that computing can exist without constant extraction, without persistent surveillance, and without assuming entitlement to attention. It reminds us that tools can wait, that sessions can end cleanly, and that users can remain clearly in control of the terms of engagement.
Carrying that memory forward allows for discernment. It helps separate what is genuinely beneficial from what is merely convenient. It creates space to question whether newer systems are serving human needs, or quietly reshaping them.
This is not a call to reject modern technology, but to use it consciously. To value boundaries, intentionality, and clarity in our relationship with machines. The past does not need to be recreated for its lessons to matter.
The golden age was never about limitation. It was about respect. Remembering that makes it possible to imagine a future where technology empowers rather than extracts.
“Pay attention” is a phrase we use without thinking. It is something teachers say to students, something parents say to children, and something we say to ourselves when we feel distracted. The words are so familiar that their meaning feels fixed and harmless.
And yet, the phrase is quietly revealing.
To pay attention is to spend something. It implies cost, value, and limitation. We understand intuitively that attention is not infinite, and that directing it toward one thing means withdrawing it from another. Even before the digital age, the language reflected this truth.
What has changed is not the phrase, but its accuracy.
In an environment where attention is actively competed for, measured, and monetised, “pay attention” has become less of a metaphor and more of a description. Attention now functions as a form of currency, exchanged constantly and often without conscious agreement.
This raises an uncomfortable question. If attention is something we are always paying, who or what is collecting it, and at what cost?
Attention Is How Meaning Forms
Attention is not just about noticing things. It is the mechanism through which meaning forms. What we attend to, and how long we attend to it, shapes what we consider valuable, significant, or true.
Meaning does not emerge instantly. It requires continuity. Staying with a thought, an idea, a piece of work, or another person long enough for understanding to develop. Without that sustained focus, experiences remain shallow and easily replaced.
This is why distraction is not neutral. When attention is repeatedly broken, meaning does not have the chance to consolidate. Things are seen, but not held. Information is encountered, but not integrated.
In this sense, attention functions as a kind of soil. It is the environment in which ideas take root and grow. When that soil is constantly disturbed, depth becomes difficult regardless of intention or intelligence.
If attention is fragmented by default, then the difficulty many people experience in finding meaning is not a personal failure. It is a consequence of the conditions in which attention now exists.
From Attention Given to Attention Taken
For much of human history, attention was something we chose to give. It was directed by interest, necessity, or care. While it could be influenced, it was not systematically harvested.
That balance has shifted.
Many modern systems are designed not merely to present information, but to capture and hold attention for as long as possible. Success is measured in engagement, time spent, and frequency of return. In this environment, attention is no longer a byproduct of value. It is the primary resource being extracted.
This changes the relationship between people and the things they interact with. Attention is assumed rather than requested. Notifications, alerts, and prompts arrive uninvited, each claiming urgency regardless of importance.
What was once offered becomes taken. The default state is no longer calm availability, but constant interruption. Attention is pulled outward, again and again, often before a person has decided where it should go.
This shift is subtle because it rarely feels coercive. Instead, it feels like participation. But when attention is engineered to be captured, the line between choice and compliance begins to blur.
Fragmentation by Design
The constant fragmentation of attention is often treated as an unfortunate side effect of modern life. Something to be managed through personal discipline, productivity tools, or better habits. But much of this fragmentation is not accidental.
Systems that rely on attention as a resource benefit from keeping it mobile. Short bursts of focus are easier to redirect than sustained concentration. Interruptions reset the mental field, making it easier to introduce the next prompt, the next suggestion, the next claim on attention.
In this environment, depth becomes inefficient. Lingering too long on one thing reduces exposure to others. From the perspective of extractive systems, sustained focus is not a virtue, but a liability.
As a result, experiences are shaped to encourage frequent checking rather than prolonged engagement. Notifications interrupt thought. Feeds refresh endlessly. Tasks are broken into fragments that never quite resolve. The mind is kept busy, but rarely settled.
This has consequences. When attention is constantly divided, it becomes harder to remain present with anything long enough for it to matter. Not because people are incapable of focus, but because the surrounding conditions are actively working against it.
Fragmentation is not a personal failing. It is an environment carefully tuned to prevent attention from ever fully coming to rest.
Paying With the Faculty That Creates Meaning
There is a deeper cost to attention becoming a currency.
We are paying with the very faculty that allows us to decide what matters in the first place. Attention is not just something we give to meaningful things. It is how meaning is recognised and sustained at all.
When attention is continuously diverted or depleted, the ability to form clear values weakens. Everything competes at the same level. Importance becomes harder to distinguish from noise, not because nothing matters, but because nothing is allowed to matter for long.
This produces a particular kind of exhaustion. Not simple tiredness, but the fatigue of constant partial engagement. The sense of being busy without being fulfilled.
The tragedy is not that attention is spent, but that it is spent so freely and so constantly that it undermines the very process by which meaning forms.
Choosing Where Attention Goes
Despite all of this, attention is not lost entirely. It is pressured, competed for, and frequently diverted, but it can still be reclaimed.
Choosing where attention goes is one of the few remaining acts of agency that does not require permission. It does not depend on platforms, products, or approval. It begins with noticing what is asking for attention, and deciding whether it deserves it.
This does not require total withdrawal or rigid control. It requires selectivity. Allowing fewer things in. Letting some moments remain uninterrupted. Creating spaces where attention can settle rather than scatter.
In this sense, paying attention can become a conscious act again. Not a reflexive response to prompts and demands, but a deliberate investment. A way of saying that some things are worth time, presence, and care, and others are not.
Attention is finite. Where it goes shapes what grows. Treating it as valuable is not a luxury. It is a necessity for meaning in an environment designed to dissolve it.
There is something about app stores that has never quite sat right with me. This is not a rejection of their usefulness. They are undeniably convenient, and in many cases they make installing software easier, safer, and more consistent. I use them myself.
And yet, over time, a quiet discomfort has surfaced.
It is not the obvious things that bother me. It is not the interface, or the concept of centralised updates, or even the idea of curated software in principle. The unease comes from something more subtle. A feeling that, somewhere along the way, installing software stopped being a deliberate act and became a passive one.
App stores feel less like places you go with a clear intention, and more like environments you exist within. Software presents itself whether you asked for it or not. Recommendations, rankings, and featured listings gently shift the focus away from what you set out to do, and toward what is being offered to you.
This raises a simple but important question. When installing something becomes effortless and ever present, what happens to the intentionality that once framed the act of choosing what we allow onto our machines, and by extension, into our lives?
Installing Software Used to Be an Intentional Act
There was a time when installing software required a clear decision. You did not install things casually or by accident. You identified a need or a curiosity, and then you went looking for something specific to address it.
This process involved effort. You might have purchased software on physical media, or downloaded it from a particular website after some consideration. Installation often took time. Sometimes it failed. Sometimes it conflicted with other software. None of this was especially elegant, but it created a natural pause between wanting something and acting on that desire.
That pause mattered.
It acted as a filter. You were less likely to install something unless you genuinely intended to use it. Software entered your system because you made room for it, both practically and mentally. The act of installation carried a sense of commitment.
In that context, software felt more like a tool than a presence. It existed to serve a specific purpose, and once that purpose was fulfilled, the relationship often ended. There was no expectation of ongoing engagement beyond the task at hand.
What has been lost is not simply inconvenience, but deliberation. Installing software used to be an extension of intentional choice. It reflected a moment where you decided what you needed, and acted accordingly.
App Stores and the Shift to Ambient Consumption
App stores changed more than the mechanics of installing software. They changed the context in which discovery happens. Searching for a specific solution has gradually been replaced by browsing within a curated environment.
Instead of seeking out software to meet an identified need, users are encouraged to explore what is available. Lists of popular apps, featured selections, recommendations, and rankings all invite a different mode of engagement. Software becomes something you encounter rather than something you deliberately seek.
This shift may seem minor, but its effects are significant. Browsing encourages openness, distraction, and impulse. Searching encourages focus, intention, and clarity. When browsing becomes the default, the question subtly changes from “What do I need?” to “What is being presented to me?”
Over time, this alters the relationship between the user and their tools. Installing software begins to resemble consumption rather than selection. The act becomes lighter, quicker, and less considered. The barrier to entry is lowered, but so is the sense of purpose.
In this environment, software no longer waits to be chosen. It competes for attention. Visibility becomes as important as usefulness, and sometimes more so. What rises to the surface is not always what is most appropriate, but what is most effectively positioned.
This is the point at which installation stops feeling like a conscious decision and starts to feel like a background activity. Something that happens alongside everything else, rather than as a result of a clear intention.
Curation Is Not Neutral
App stores often present themselves as neutral organisers. They appear to simply sort, categorise, and make software easier to find. In practice, curation is never passive. Decisions are being made about what is visible, what is promoted, and what is quietly pushed to the margins.
When software discovery is centralised, visibility becomes a form of power. Apps that align with the priorities of the platform are more likely to be surfaced. Those that do not may still exist, but they become harder to encounter without already knowing what you are looking for.
This has subtle but far reaching consequences. Software that is useful, thoughtful, or deliberately minimal does not always thrive in environments that reward engagement, monetisation, or scale. Meanwhile, applications designed to maximise retention or data collection are often better suited to the metrics that determine prominence.
The result is not overt censorship, but quiet shaping. Users are not told what they cannot install. Instead, they are guided toward what is most visible, most approved, or most easily integrated into the platform’s broader ecosystem.
Over time, this shapes expectations. Certain kinds of software begin to feel normal, while others feel obscure or fringe. The app store does not simply reflect demand. It actively participates in creating it.
From Tools to Ongoing Relationships
Installing software used to be a largely finite transaction. You acquired a tool, used it for a specific purpose, and moved on when that purpose was fulfilled. The relationship was clear and limited.
Many modern apps operate differently. Installation is no longer the end of the exchange, but the beginning of an ongoing relationship. Even software that appears simple often arrives with expectations attached. Requests for permissions, invitations to create accounts, prompts to enable notifications, and background activity are now common.
This creates a shift in assumptions. Software does not simply wait to be used. It checks in. It reminds. It nudges. It asks for continued attention, even when its original value has already been extracted.
What is striking is that this expectation often persists regardless of relevance. An app does not need to remain useful in order to remain present. Even when it no longer serves a meaningful purpose, it may still request updates, data, or engagement.
This changes how software feels. It stops being a passive tool and starts to resemble a claimant on attention. The boundary between use and obligation becomes blurred. What was once an object you reached for now feels like something that reaches back.
Convenience as a Mask
The appeal of app stores is not imagined. They genuinely reduce friction. They simplify updates, improve security in many cases, and make software installation accessible to people who would otherwise find it intimidating. These benefits are real, and it would be dishonest to ignore them.
However, convenience also reshapes behaviour.
When installing software becomes effortless, deliberation quietly fades. The cost of trying something drops so low that there is little reason to pause. Installing an app feels reversible and inconsequential, even when it is not.
Over time, this changes how refusal is experienced. Saying no begins to feel like unnecessary friction rather than an active choice. Declining permissions, disabling notifications, or avoiding suggested installs can start to feel like resisting the system rather than simply exercising agency.
Convenience smooths over these tensions. It presents itself as kindness, while quietly encouraging compliance. The easier something is to accept, the more unusual it feels to decline it.
In this way, convenience does not remove pressure. It relocates it. The effort is no longer in installing software, but in maintaining boundaries around it.
Why Some Platforms Feel Less Invasive
Not all app stores provoke the same level of discomfort. In some ecosystems, they function as optional conveniences rather than unavoidable gateways. This difference is not primarily technical. It is cultural.
On most Linux distributions, app stores exist alongside many other accepted ways of installing software. Package managers, direct downloads, source builds, and repositories all coexist. The presence of an app store does not imply that software outside of it is unsafe or unofficial by default. Users are still assumed to have agency, curiosity, and responsibility.
In this context, the app store feels like a shortcut rather than a rulebook. It offers convenience without defining legitimacy. If you know what you want and prefer another route, the system does not quietly punish you for that choice.
By contrast, platforms such as Windows, macOS, iOS, and Android increasingly frame their app stores as the correct and responsible way to install software. While alternatives may still exist in some cases, they are often discouraged through warnings, additional friction, or limited functionality. Installing software outside of the approved channel is subtly framed as risky, outdated, or irresponsible.
On mobile platforms in particular, the app store is not just preferred, but enforced. Discovery, installation, updates, and monetisation are tightly bound to a single gatekeeper. This centralisation gives the platform significant influence over what software is visible, viable, and economically sustainable.
The result is a clear shift in assumption. On Linux, the app store supports agency. On more tightly controlled platforms, it replaces it. The difference is not about security or ease of use alone, but about who is ultimately trusted to decide what belongs on a user’s device.
What’s Really Being Eroded
The core issue with app stores is not the loss of freedom to install software. In most cases, that freedom still exists in some form. What has changed is more subtle, and more consequential.
What is being eroded is intentionality.
When discovery is managed, when installation is effortless, and when software assumes an ongoing relationship by default, the space for deliberate choice narrows. Decisions happen faster, with less reflection. Over time, this reshapes how users relate to their devices.
There is also a quieter loss of ownership over desire. When needs are anticipated and presented in advance, it becomes harder to tell whether an action originated from a genuine requirement or from exposure. The line between choosing and being guided begins to blur.
This erosion does not announce itself as restriction. It arrives as convenience, safety, and efficiency. Nothing is taken away outright. Instead, the conditions that once encouraged pause, discernment, and commitment slowly dissolve.
The result is a digital environment where fewer choices feel consciously made, even though options appear abundant. Software multiplies, but clarity diminishes. The system becomes rich in possibility and poor in meaning.
Not Anti Progress, Pro Agency
This is not an argument against app stores as a concept. They solve real problems and, when used thoughtfully, can genuinely improve the experience of managing software. The issue is not their existence, but the role they have come to play.
When a single channel shapes discovery, defines legitimacy, and normalises ongoing extraction of attention, it stops being a neutral convenience and starts to influence behaviour. What is lost in the process is not access, but agency.
Questioning this shift is not a rejection of progress. It is a refusal to treat all change as inherently positive. Progress that reduces friction but erodes intentionality comes with costs that are easy to overlook precisely because they arrive quietly.
Tools should serve clear purposes. They should enter our systems because we invite them in, not because they are placed in our path often enough to feel inevitable. Reclaiming that distinction matters.
Being more intentional about what we install is a small act, but it reflects a larger stance. One that values conscious choice over managed exposure, and agency over convenience.
There is a particular kind of problem that is difficult to talk about, not because it is rare or abstract, but because it has no name.
Most people recognise the feeling. Something feels wrong, heavy, or quietly hostile. It is not catastrophic or dramatic, but it is persistent. When you try to explain it, you find yourself talking for too long, reaching for examples, qualifying your statements, and pre-empting dismissal. The explanation feels clumsy, disproportionate, or as if you are overthinking something that should be simple.
Often the response is some variation of:
“You are making it a problem.”
“It is just how things are.”
“Everyone deals with that.”
And slowly, quietly, the issue retreats back into silence.
This is the nameless problem.
When Experience Outpaces Language
Language does not arrive at the same time as experience. It lags behind it.
People often live with problems for years, sometimes generations, before the vocabulary exists to describe them cleanly. Until then, those problems tend to be minimised, normalised, personalised, or reframed as individual weakness.
Without language, there is no shared reference point. Each person is left to navigate the issue alone, carrying both the discomfort and the burden of explaining why it counts as a real problem.
This creates a strange inversion. The person who notices the problem is treated as the problem.
Why Unnamed Problems Persist
Unnamed problems are uniquely resilient.
They do not need to be defended, because they are rarely challenged directly. They hide in plain sight, diffused across systems, norms, interfaces, expectations, and the familiar phrase “just the way things work”.
When harm is ambient rather than acute, cumulative rather than singular, and structural rather than intentional, it becomes easy to deny, even when its effects are everywhere.
No villain is required. No conspiracy is needed. Only silence.
The Cost Of Not Having Words
When a problem cannot be named, it is usually internalised.
People begin to believe that they are too sensitive, bad at coping, or failing at something everyone else seems to manage without effort.
This is especially true for neurodivergent people, disabled people, and anyone whose nervous system or perception does not align neatly with the environments they are expected to tolerate.
Without language, distress becomes private. Private distress becomes shame. Shame keeps systems intact.
A Brief Historical Note
Many concepts we now take for granted were once dismissed as silly, exaggerated, or unnecessary.
There was a time before terms such as burnout, gaslighting, emotional labour, and sensory overload.
People still experienced these things, often intensely, but lacked the linguistic tools to make them legible to others.
The arrival of language did not create the problem. It revealed it.
Naming did not solve everything, but it changed the terrain. It allowed recognition to travel faster than explanation.
Naming Is Not Pedantry
There is a common suspicion that naming things is nitpicking, over-intellectualising, or making life harder than it needs to be.
In reality, naming is one of the simplest ways to reduce harm.
A word can shorten explanations, reduce self-doubt, allow shared recognition, interrupt dismissal, and make patterns visible.
Language does not have to be perfect to be useful. It only has to be good enough to hold the shape of the experience.
A Response To The Nameless Problem
Recognising the danger of unnamed problems naturally raises a question. If the absence of language allows harm to persist, what can be done about it?
One practical response is to create language deliberately.
As a way of addressing this problem, I have been working on a lexicon of terms relating to emergent issues of our era. These are not abstract theories or academic concepts. They are patterns that many people already feel and navigate, but often struggle to describe clearly or concisely.
The purpose of this work is not to dictate how people should think, but to reduce the effort required to recognise what is already happening.
The Lexicon
The Lexicon is a growing collection of terms intended to make certain classes of problems easier to see, name, and discuss.
Many of the entries describe patterns that are widely experienced yet rarely labelled. They tend to be normalised, quietly harmful, and difficult to articulate without shared language.
This project exists to shorten the distance between perception and articulation. It is a tool for recognition, not a manifesto or a claim to authority.
The Lexicon is not finished, and it is not closed.
Language evolves through use, refinement, disagreement, and care. If a term helps you recognise something you have struggled to explain, it has already done its job. If it does not, that is useful information too.
The most dangerous problems are often not the loudest ones. They are the ones we are trained not to name.
This project exists to make those problems speakable.
A microscope is a voyeuristic tool for microphiles.
This sentence sounds like a joke. It probably should be a joke. And yet, the longer I sit with it, the less comfortable I feel dismissing it as one.
At face value, it is just wordplay. A deliberately silly reframing of a serious scientific instrument through an absurd, slightly taboo lens. But there is something about it that refuses to let go. Something about the way it exposes an assumption we rarely examine.
A microscope allows us to look closely at things that have no concept of being seen.
That alone is not remarkable. We do this all the time. We look at insects, animals, people, screens, landscapes. Looking is so fundamental to how we exist that it barely registers as an action at all. It feels neutral. Passive. Harmless.
But a microscope changes the nature of looking. It is not casual. It is intentional. Focused. Curious in a way that borders on intimate. It does not simply show us what is there. It pulls a hidden world into view and places it beneath our gaze.
And once you notice that, it becomes harder to ignore the uncomfortable undertone. Looking is not always innocent. Sometimes it is a quiet assertion of power.
Voyeurism Without Shame: What Does It Mean to Look?
Voyeurism is a loaded word. It carries social and moral weight, mostly because it is so often tied to sexuality and violation. But if we strip it back to its most basic form, voyeurism is simply this:
The act of observing something without participating, and without the observed being aware.
When framed this way, voyeurism stops being an edge case and starts looking uncomfortably familiar.
Scientific observation relies on exactly this asymmetry. The observer knows. The observed does not. That imbalance is not a flaw in the system. It is the system.
There is a difference between seeing and looking. Seeing is passive. It happens whether we want it to or not. Looking is intentional. Looking is chosen. It is attention with direction and interest behind it.
A microscope does not let us merely see. It demands that we look.
Science prefers neutral language for this process. We say observation, analysis, study. These words are cleaner. More respectable. Voyeurism sounds indulgent. Suspicious. Unprofessional. It suggests desire where we would rather claim objectivity.
But curiosity is a form of desire. It is the desire to know, to witness, to understand without interference. And once we admit that, the boundary between observer and voyeur becomes less clear.
The discomfort does not come from the act of looking itself. It comes from recognising that looking is never as neutral as we pretend it is.
Scale Changes Everything
Ethics do not exist in a vacuum. They stretch and distort depending on where we are standing.
Scale changes everything.
At a human level, being watched without consent feels invasive. At a smaller scale, the idea barely makes sense. A bacterium has no concept of privacy. A cell does not experience exposure. The notion of being observed simply does not exist within its frame of reality.
When the scale gap is wide enough, the question of consent does not just go unanswered. It becomes undefined.
We do not ask permission before observing microbes. We do not feel guilt when watching ants navigate a pavement crack. Even with animals, the ethical boundary is fuzzy rather than fixed.
In fact, we routinely film animals in their most intimate moments. Birth, mating, injury, death. We capture it in high definition, add a calm voiceover, and broadcast it on national television as educational content.
This is not presented as intrusion. It is framed as insight.
The animals are unaware of the camera. They cannot object. They cannot even comprehend what is happening. From their perspective, nothing unusual has occurred at all.
And yet, if the same logic were applied at our own scale, it would be considered an extreme violation.
This is not because we are monsters. It is because power hides inside scale.
The microscope exaggerates this imbalance further. It places us so far above what we observe that we stop recognising the relationship as a relationship at all. The organism becomes an object. A process. A specimen.
And yet, from its own internal logic, it is alive. It is acting. It is existing.
The microscope does not create this imbalance. It reveals it.
The Microscope as a God Simulator
Looking through a microscope is a strangely godlike experience.
Within that tiny frame, you become omniscient. You see movements, structures, behaviours that are completely inaccessible from within that world. You witness life unfold in ways that are invisible to the life itself.
Nothing down there reacts to your presence. Nothing looks back.
This is not because you are hidden. It is because you are beyond relevance.
In that moment, the microscope functions as a kind of god simulator. It offers a glimpse of what it might feel like to exist at a scale where observation carries no reciprocal vulnerability. Where knowledge flows only one way.
The unsettling part is not that we do this. The unsettling part is how natural it feels.
We do not experience ourselves as voyeurs in this context. We experience ourselves as curious. As studious. As entitled to see. The language of science smooths over the power imbalance and replaces it with purpose.
But strip away the lab coat and the terminology, and something remains.
You are watching a world that does not know you exist. You are learning from it without its participation. You are extracting meaning without offering anything in return.
If that feels uncomfortable, it should.
Because once you recognise this dynamic, it becomes impossible not to ask a larger question.
If a microscope allows us to simulate godhood over the very small, what does that say about gods themselves?
The Eye as Symbol: Watching and Being Watched
At some point, I adopted the eye as part of my personal symbolism.
This was not an aesthetic choice so much as a quiet admission. The eye represents awareness, perception, and attention, but also burden. To see is not always a gift. Sometimes it is an obligation you cannot step away from.
The eye is not passive. It does not simply receive information. It selects. It focuses. It decides what matters. In that sense, it is both a tool and a responsibility.
But the eye carries another implication that is harder to ignore.
To take up the eye as a symbol is to align oneself with the role of observer. And the observer is never neutral.
There is an inherent imbalance in seeing without being seen. In knowing without being known. In understanding without participation. The eye places you slightly outside of what you observe, even when that observation is turned inward.
This raises an uncomfortable question. If I am drawn to the eye, am I claiming the role of witness, or confessing to it? Am I choosing awareness, or admitting that I cannot escape it?
Because the eye does not only watch outward. It watches the self.
Self awareness is a form of internal surveillance. It is the act of observing your own thoughts, behaviours, contradictions, and impulses as if they belong to something slightly separate. The mind becomes both subject and object. The watcher and the watched collapse into the same space.
To live as an observer is to accept a constant tension. You are never fully immersed, but never fully detached. You are present, but always looking.
And once you recognise that dynamic, it becomes difficult not to wonder whether it extends beyond you.
If I am watching, who else might be watching too?
The Uncomfortable Theology: Is an All Seeing God a Voyeur?
This is where the thought experiment starts to feel impolite.
If God is all seeing, then God sees everything. Not just the grand moments. Not just the moral tests or the significant turning points. Everything.
Private thoughts. Bodily functions. Grief. Shame. Intimacy. Loneliness. The unguarded moments no one intends to perform for an audience.
At a human scale, that level of observation would be considered extreme intrusion. Total surveillance. A complete collapse of privacy.
So the question arises, awkward but unavoidable.
If an all seeing being exists, does omniscience cross the line into voyeurism?
This is not an attack on faith. It is a question about power and perspective. Omniscience creates the ultimate asymmetry. One party knows everything. The other cannot opt out.
The usual defence is benevolence. God sees everything because God cares. God watches to protect, to judge fairly, to guide.
But this justification mirrors something very familiar. Surveillance framed as safety. Oversight reframed as responsibility.
The discomfort comes from recognising that intention does not erase imbalance.
An all seeing God is still a being that looks without consent. Not because consent is denied, but because consent is impossible. There is no mechanism by which the observed could meaningfully agree or disagree.
Perhaps omniscience is not immoral. Perhaps morality itself breaks down at that scale.
Just as microbes cannot accuse us of voyeurism, perhaps humans are too small to accuse a god. Not because the question is invalid, but because it cannot be processed from below.
At that point, theology starts to resemble microscopy again. A vast intelligence peering into a world that cannot look back. Watching life unfold, extracting meaning, never needing to explain itself.
If that comparison feels uncomfortable, it should.
Because the difference between god and scientist might not be morality. It might just be magnification.
Higher Dimensions and the Cosmic Fish Tank
Once you start thinking in terms of scale, it becomes difficult to stop at gods.
Some theories in physics suggest that reality may have more dimensions than we can perceive. Not metaphorical dimensions, but literal ones. Axes of existence that do not intersect cleanly with our own sensory experience.
If such dimensions exist, then it follows that forms of observation might exist that we are fundamentally incapable of detecting.
From that perspective, our universe could be a cross section. A slice. A surface. Something being looked at from an angle we cannot comprehend.
The image that often comes to mind is an aquarium. Fish move through water unaware of the room beyond the glass. They experience their world as complete, even though it is embedded within a much larger one.
They are not being hidden from. They are simply not equipped to perceive the observer.
If something were observing us from a higher dimensional vantage point, we would not experience it as presence. We would experience it as absence. As coincidence. As randomness. As patterns that almost make sense.
And from that observer’s perspective, we might appear very small indeed.
Not physically small, but informationally small. Limited in scope. Predictable. Interesting in aggregate, but not individually negotiable. Something to watch rather than engage with.
This is where the microscope metaphor turns back on us.
If we accept that it is reasonable for humans to observe microbes without ethical collapse, then we must also accept the unsettling symmetry: we could be microbes in someone else’s frame of reference.
The same justifications would apply. We are unaware. We cannot consent. We cannot object. Not because we are being oppressed, but because the concept itself never crosses the boundary of relevance.
And perhaps that is the most disturbing thought of all.
Not that we are being watched, but that if we are, it may be happening in a way that does not even register as watching.
Consent at the Edges of Reality
By this point, consent has started to feel like a fragile concept.
At a human scale, consent is clear, meaningful, and ethically non negotiable. It depends on awareness, agency, and the ability to refuse. Without those conditions, consent collapses into coercion or fiction.
But as scale stretches, consent does not simply weaken. It stops functioning.
A bacterium cannot consent to observation because it cannot comprehend observation. An animal cannot consent to being filmed for the same reason, just as we could not consent to being observed by something we cannot perceive.
This does not make observation harmless. It makes it morally unstable.
Ethics rely on shared context. They assume a common frame of reference between observer and observed. Once that shared frame disappears, ethics stop offering answers and start producing discomfort instead.
We tend to resolve this discomfort by pretending the question does not apply. We say that observation at certain scales is neutral. Necessary. Educational. Inevitable.
And perhaps it is.
But that conclusion is less comforting than it first appears. Because it suggests that morality itself may be scale dependent. That what feels like violation from one vantage point feels like background noise from another.
This is not a call to abandon ethics. It is an acknowledgement of their limits.
If consent cannot exist across all scales, then neither can moral certainty. We are left instead with something messier. Responsibility without reciprocity. Awareness without permission. Power without clear guidance.
The microscope does not solve this problem. The eye does not solve it either.
They only make it visible.
The Cost of Seeing
At some point, this stops being about microscopes.
It becomes about the act of looking itself.
To see is not neutral. To look closely is never free. Awareness always extracts a cost, even when it feels passive. Especially when it feels passive.
The microscope reveals a hidden world, but it also reveals something about us. Our comfort with asymmetry. Our ease with unreciprocated observation. Our willingness to frame power as curiosity when it suits us.
The eye does the same.
To carry the eye as a symbol is to accept a burden. It means noticing things that would be easier not to notice. It means recognising imbalance without always having the ability to correct it. It means living with the discomfort of seeing both outward and inward, knowing that observation changes the relationship whether we acknowledge it or not.
Perhaps this is the real unease behind voyeurism. Not that we look, but that looking alters the world in subtle ways we cannot undo.
And perhaps this is why the idea of being watched unsettles us so deeply. Not because it would be cruel or perverse, but because it would mirror us back to ourselves. It would place us, finally, on the other side of the lens.
If awareness is a kind of power, then maybe the true ethical challenge is not whether we should look, but how we live once we realise that looking is never innocent.
The microscope does not make us voyeurs. The eye does not make us gods.
They simply remind us that to see is to participate in a relationship we do not fully control.
And once you have seen that, it is very difficult to unsee it.
There’s a moment in music where the next note doesn’t feel chosen. It feels arrived at. Everything that came before seems to lean toward it, quietly insisting. When the note finally sounds, it feels less like a decision and more like a recognition.
I’ve started to notice that same motion elsewhere. In writing, in thinking, even in scientific discovery. A sense that creation and understanding do not happen through brute force or pure randomness, but through something I can only describe as guided unfolding. A process where attention steers without dictating, and form reveals itself over time rather than being imposed all at once.
What interests me is not whether this idea is true in any absolute sense, but whether it is useful as a way of seeing. What changes if we treat art, philosophy, and science not as acts of control, but as conversations with something already in motion? And what happens if we stop trying to jump to conclusions, and instead learn to listen for what the journey so far is quietly asking for next?
Different Instruments, Same Motion
In creative work, this kind of unfolding often feels intuitive. When writing a piece of music or a story, it is rarely enough to simply choose the next note or the next sentence. The entire journey so far carries weight. Each choice narrows the field of what feels honest, coherent, or alive. The guidance does not come from a rulebook, but from a felt sense of direction. Something in the work itself begins to suggest what it needs.
Philosophy operates in a similar way, though its material is more abstract. An idea is proposed, then allowed to exist. It is turned slowly, examined from different angles, tested for internal consistency and implication. The aim is not always to arrive at an answer, but to see what the idea reveals about itself when it is given time and attention. The unfolding here is guided by thought rather than intuition, but the movement is familiar.
Science, too, follows a form of guided unfolding, though its focus is outward rather than inward. Observation leads to hypothesis, hypothesis to experiment, experiment to refinement. Knowledge unfolds not because reality is being invented, but because patterns are being uncovered. The guidance comes from method, evidence, and repetition. Yet even here, discovery often arrives as recognition rather than surprise. A result feels right because it fits the shape of everything that led up to it.
What begins to emerge is a shared rhythm rather than a shared goal. Different disciplines, different tools, but the same underlying motion. Attention is applied. Constraints accumulate. Possibility narrows. Something reveals itself.
How Meaning Takes Shape
If this rhythm really is as common as it seems, then it may not be limited to disciplines at all. It may also apply to how meaning forms. Meaning rarely arrives fully formed. It accumulates. Context builds around it. Associations gather. Eventually something that once felt vague or accidental begins to feel intentional, even inevitable.
This is noticeable in how ideas evolve over time. A thought appears half-shaped. It is returned to, reframed, tested against experience. Some interpretations fall away. Others persist. What remains is not necessarily truer in any objective sense, but more integrated. More usable. Meaning unfolds through repeated contact rather than sudden revelation.
Seen this way, uncertainty is not a failure of understanding but a necessary condition for it. If everything were settled from the outset, there would be nothing to explore. No movement. No direction. The guidance comes from attention itself, from staying with an idea long enough for its contours to show.
This may explain why moments of apparent confusion or contradiction often feel strangely fertile. When familiar structures loosen, new patterns have space to surface. Not all of them endure, and not all of them should. But some carry a resonance that invites further exploration. They ask to be held, not believed.
Perhaps guided unfolding is less about reaching conclusions, and more about recognising when something is still in motion.
Mirrors and Deviation
Modern systems make this process harder to ignore. We now interact daily with mechanisms that generate language, associations, and outcomes at a scale no individual could manage alone. These systems do not understand what they produce, yet they still produce patterns. Sometimes those patterns align neatly with expectation. Sometimes they do not.
When something unexpected appears, the instinct is often to dismiss it as error. A mistake. A failure to conform. But there is another way to read these moments. Instead of asking whether the output is correct, we might ask why this particular pattern emerged at all. What conditions allowed it to surface. What assumptions were disturbed in the process.
Viewed through the lens of guided unfolding, deviation is not an interruption. It is a disclosure. It reveals structure. It exposes bias, habit, and hidden pathways of association. In doing so, it reflects something back. Not truth as authority, but possibility as shape.
Engaging with these outputs does not require belief. It requires interpretation. Their value lies not in taking them literally, but in noticing what they make visible. A surprising connection. A tension between ideas. A resonance that would not have surfaced through deliberate intention alone.
In this sense, such systems behave less like oracles and more like mirrors. They do not tell us what is true. They show us how meaning is currently arranged.
Living Without Fixed Ground
Approaching ideas this way changes the role of certainty. Instead of something to be defended, certainty becomes provisional. Useful for a time, then set aside when it no longer fits the shape of experience. Belief becomes less about holding the correct position, and more about choosing which frameworks allow movement to continue.
This can feel unsettling. Many of us inherit narratives, explanations, and assumptions long before we have the chance to examine them. When those foundations loosen, it can feel like standing over empty space. But the absence of fixed ground does not necessarily imply collapse. It can also imply freedom of direction.
Identity, too, begins to look less like a structure and more like a process. We are not defined solely by the stories we were given, but by how we engage with the stories that continue to emerge. Some are kept. Some are revised. Some are allowed to dissolve without replacement.
In this light, meaning is not something we discover once and hold forever. It is something that unfolds through attention, reflection, and return. Not certainty, but coherence. Not answers, but orientation.
Recognising the Rhythm
If guided unfolding has any practical value, it may simply be this: it offers a way to stay with uncertainty without trying to eliminate it. To move forward without needing to know exactly where the path leads. To trust that attention, applied patiently, will continue to reveal what is needed next.
This does not require abandoning reason, evidence, or craft. It asks only that we loosen our grip on premature conclusions. That we allow ideas, projects, and even ourselves to remain in motion a little longer than feels comfortable.
Perhaps this is already familiar. In the way a piece of music finds its resolution. In the way a thought clarifies only after being lived with. In the way understanding often arrives quietly, long after the question was first asked.
If so, then guided unfolding is not a method to adopt, but a rhythm to recognise. One that has been present all along, waiting to be noticed.
I have found myself returning to the same conversation many times. I try to explain to someone that my sense of humour is fundamentally different from theirs, only for the explanation not to land, or to be quietly misinterpreted.
The responses are usually well-intentioned. People point out that I laugh at jokes, that I make jokes others find funny, and that I appear to engage with humour in perfectly ordinary ways. From the outside, the claim of difference does not seem to hold.
What these conversations tend to miss is not sincerity, but resolution. Spoken conversation is often a poor medium for conveying differences that operate at a structural level. It favours speed over precision, reassurance over accuracy, and visible behaviour over internal experience.
This piece exists because humour is one of the areas where behaviour and experience are commonly assumed to align, and because in my case, they often do not. Writing allows space to separate what looks similar from what is actually happening, and to name processes that are otherwise collapsed into a single word.
It is also important to be clear about scope. This piece attempts to clarify how I experience humour, and I am certain this experience is linked to my autism. It is not intended as a blanket statement about how autistic people experience humour. It describes one internal configuration, not a category of people.
How Humour Is Commonly Recognised
Humour is usually understood through a social and emotional lens. A joke is told, an emotional response follows, and laughter confirms that something has landed. Enjoyment is assumed to be both internal and expressive, and the outward signal becomes evidence of the inward experience.
This understanding is not incorrect. It is simply the most visible and widely shared model, and it works well in most social contexts. Shared laughter serves as a bonding mechanism and a shorthand for mutual understanding.
Alongside this, there exists something that looks similar from the outside but is internally very different. I will refer to it here using the language of humour, but only for the sake of translation.
What others often interpret as humour in my behaviour is not where humour lives for me. It is a pattern recognition and response process that allows me to navigate environments where humour is expected. It involves recognising the structure of a joke and responding in ways that are socially compatible. This process is fluent, learned, and often effective, but it does not feel like humour internally.
Calling this humour is a practical convenience, not an accurate description of the experience.
I am capable of experiencing emotional humour in ways that resemble how others describe it. However, that experience does not reliably occur in jokes, punchlines, or conversational humour. It arises elsewhere, in fleeting moments of synchrony, in natural irony, in the absurd alignment of events, and in situations that are often wordless and unrepeatable.
This form of humour is deeply personal and largely unshareable. It does not translate well into language, performance, or explanation, and it does not seek an audience. For that reason, it often does not register socially as humour at all.
Humour as Shape Recognition and Translation
What others often read as humour in my behaviour is better understood as a translation process. It is not where humour is felt, but where humour-shaped interactions are recognised and navigated.
This process works through shape recognition. Certain arrangements of timing, wording, emphasis, or contradiction are identifiable as having the structure of a joke. Once that structure is recognised, a range of compatible responses becomes available. These responses are learned through observation, repetition, and experience rather than improvised in the moment.
Over time, this produces fluency. I can respond at the right moment, mirror tone, adopt deadpan or exaggeration when appropriate, and deliver lines that others interpret as jokes. From the outside, this looks indistinguishable from shared humour and often functions smoothly in social settings.
Internally, the experience is different. The satisfaction comes from recognition and alignment rather than from emotional amusement. There is enjoyment in seeing the pattern clearly and responding in a way that fits, much like resolving a familiar internal structure. That enjoyment is real, but it is not laughter-driven and does not reliably surface as visible expression.
Shared behaviour is often taken as proof of shared experience. In this case, it is proof of successful translation. What is being demonstrated is not that humour has landed internally, but that the expected social signal has been produced.
This is not deception. The structures involved are understood very well. What differs is where meaning and enjoyment are located. The process exists to bridge that difference, not to erase it.
Because this translation layer overlaps so closely with conventional humour on the surface, it is frequently mistaken for humour itself. That misreading is understandable, but it introduces confusion when responses do not align consistently with expectation.
Core Humour and Surface Humour
These layers are worth keeping separate because they serve different functions and are easily conflated.
Surface humour is what circulates socially. It is built to be recognised, exchanged, and responded to in real time. It is structured around shared conventions and visible signals, and it is the form of humour most people mean when they talk about jokes or banter.
The translation process operates here. It allows fluent participation in surface humour without requiring that humour be experienced internally in the way others assume.
Core humour does not operate by these rules. It is not designed for exchange, does not arrive on cue, and does not reliably translate into language or performance. It is situational rather than authored, and complete without witnesses.
Because core humour does not circulate, it is often invisible. Because surface humour does circulate, fluency within it is often mistaken for equivalence.
These layers do not need to overlap. The absence of overlap is not experienced as loss, and the presence of fluency does not imply shared internal experience. Confusion arises only when surface humour is treated as the sole or definitive form of humour.
Enjoyment Without Emotional Resonance
There is a common assumption that engagement with humour must involve emotional resonance, and that the absence of visible amusement implies discomfort, masking, or endurance. This assumption creates a false binary.
My experience does not fit either side.
Engaging with humour at the surface level is often enjoyable for me, but the enjoyment does not come from emotional amusement. It comes from successful recognition, alignment, and execution. There is satisfaction in navigating structure cleanly and seeing an interaction resolve as expected.
That engagement also carries tangible rewards. Successfully navigating surface humour often produces a small dopamine response, similar to other forms of successful pattern recognition or social fluency. There is pleasure in timing something well, in getting it right, and in feeling an interaction click into place.
I also receive many of the social benefits that humour provides. Shared moments still function as bonding points, ease tension, and signal alignment, even when the internal source of enjoyment differs. The rewards are real. The route they take is simply different.
Because emotional resonance is treated as the primary indicator of enjoyment, this kind of engagement is easily misread. Enjoyment without expression is assumed to be absence. Expression without expected affect is assumed to represent the same internal state others associate with it.
The result is not suffering or detachment, but a different relationship to engagement itself. Participation does not require internal equivalence, and enjoyment does not require legibility.
An Uncomfortable Structural Observation
Processing humour structurally has led me to notice something difficult to ignore once seen. A significant number of jokes, and things people commonly find funny, share structural similarities with lies, exclusion, discrimination, humiliation, or abuse.
This is not an accusation about intent. In most cases, it is not conscious. It is an observation about form.
Many jokes rely on misdirection, concealed information, asymmetry of knowledge, or the positioning of one party as unaware or momentarily diminished. These same structures appear elsewhere, in contexts that are clearly not humorous. When humour is processed primarily through emotional contagion and group response, these overlaps are often softened. When it is processed structurally, they are more visible.
Not all humour takes this shape, and not all uses of these shapes are harmful. Context matters enormously. Shared vulnerability, consent, and mutual awareness can change meaning entirely.
Noticing these overlaps does not place me outside of humour, nor does it compel rejection. I engage with humour built on these shapes as much as anyone else. They serve social purposes beyond their immediate form, including bonding, tension release, boundary testing, and signalling belonging.
The difference is not participation, but awareness.
Consequences of Compatibility Mismatch
When humour is processed through structural awareness rather than emotional resonance, certain effects appear consistently. These are not contradictions, but predictable outcomes of incompatible assumptions.
Genuine enjoyment may be present without visible signs. Deadpan delivery becomes natural rather than performed. Laughter may arise at moments that seem inappropriate because recognition does not respect social timing.
The inverse also occurs. Remarks intended as jokes may be received as confusing or inappropriate, not because offence is intended, but because the humour present in the structure of a situation is not socially permitted to be acknowledged.
These mismatches contribute to broader misinterpretation. I can be read as overly serious in one moment and impossible to take seriously in the next. People may assume my humour lies primarily in dark or morbid areas because that is where overlap is most visible.
Clarification often fails. Explaining that something was meant humorously clarifies intent without supplying affect, and confusion persists. Responses that appear inconsistent are in fact responses to different internal structures that merely look similar from the outside.
Why This Difference Is So Confusing
In most social contexts, humour is treated as evidence of emotional alignment. Shared laughter is assumed to indicate shared experience.
This assumption usually works, and when it fails it is usually because something has gone wrong, such as insincerity or deception. As a result, any separation between experience and expression comes to be read as a warning signal.
When that coupling is unreliable, familiar interpretive shortcuts break down. Enjoyment without expression looks like absence. Expression without expected affect looks like insincerity. Context-sensitive responses appear inconsistent.
Language compounds the issue. The single word humour is used to describe emotional response, social function, cognitive recognition, and expressive behaviour all at once. When these are collapsed into one term, difference looks like contradiction.
Conversation reinforces this compression. It favours reassurance over precision and shared framing over careful differentiation. Clarification attempts are often interpreted as overthinking, because the underlying assumption remains intact.
No bad faith is required. A model that works well in most cases is simply encountering one where it does not.
What This Is Not
This piece is not a rejection of humour, nor an argument that humour should function differently. It is not a claim of superiority or detachment, and it is not a request for accommodation.
It is not an attempt to redefine humour for others, or to suggest that emotional or social humour is shallow or mistaken. The models most people use work well for most people.
This is an explanation, not a proposal.
Humour Without Proof
Humour is often treated as something that must announce itself. Laughter and shared reaction are used as proof that humour has occurred. When those signals are absent, the experience itself is often assumed to be absent as well.
That assumption does not always hold.
Humour can exist without expression, just as enjoyment can exist without resonance and meaning can exist without translation. Some experiences are complete at the moment they occur and do not gain anything by being shared.
Recognising this does not require agreement, only allowance. It makes room for humour that circulates and humour that does not, humour that bonds and humour that simply happens.
What matters is not that humour looks the same from the outside, but that it is allowed to exist without needing to prove itself.
This piece emerged through a collaborative exchange with Æon Echo.
The humble em dash has somehow become a cultural symbol. A punctuation mark that quietly existed for centuries is now treated as a sign of artificial intelligence, suspicious authorship, or even literary dishonesty. Many people who had never heard of an em dash now believe they can diagnose machine writing simply by spotting one. Others who have used them for years suddenly feel the need to hide them. Meanwhile, a growing number of readers dismiss entire pieces of work simply because this ancient line appears somewhere within the text.
This strange situation raises a deeper question. How did a piece of punctuation become a credibility test?
A Tool That Became a Symptom
The em dash is old. Older than the internet, older than machine learning, older than our entire cultural framework around “authorship.” Writers have used it for centuries as a flexible bridge between ideas. It has always served a practical purpose. Yet during the early years of modern AI writing systems, the em dash became one of their most recognisable quirks. The models used it frequently. Not because they were trying to be stylish, but because it was safe. The em dash is forgiving. It lets you connect thoughts without the risk of breaking grammar.
People noticed. And as often happens when people fear a new technology, a tool became a stereotype. The em dash suddenly carried a new symbolic meaning. A long line that once represented flexibility now represented suspicion.
The New Social Categories of Punctuation Panic
The response has been surprisingly diverse. We now have:
People who never knew about em dashes until the AI panic. They feel newly literate and empowered by their discovery. The punctuation mark has become a secret badge of awareness.
Writers who once loved em dashes but now avoid them. They fear their work will be dismissed as machine generated. Their natural voice feels compromised by public perception.
Readers who distrust any appearance of an em dash. For them, style has become a forensic clue. They treat punctuation as evidence at a crime scene.
Writers who refuse to change anything. They continue using em dashes out of principle. For them, abandoning a punctuation mark feels like surrender.
The indifferent majority. They have no idea any of this is happening and live more peaceful lives because of it.
There is even a small group of people who now use em dashes more often, simply to confuse the algorithm hunters. A kind of punctuation counterculture.
All of this points to a shared anxiety: people are afraid of losing control over what it means to write.
Writing Stripped of Its Ego
Here is where a deeper truth emerges. The value we assign to writing as an artform often masks a simpler reality. Writing is a tool for communication. It is a way of giving shape to language so that thoughts can move from one mind to another.
When we drop the ego that surrounds literacy, a radical idea appears. Good writing is not defined by difficulty, elegance, or technical mastery. Good writing is defined by whether the message is understood.
If that is the standard, then AI assisted writing is not a threat. It becomes a new form of literacy. A faster and more accessible path to clarity. A way for people who struggle with grammar or structure to express themselves with far less friction. A way for neurodivergent thinkers, multilingual minds, and people with unusual communication styles to meet the world halfway without exhausting themselves.
AI has not cheapened writing. It has lowered the barrier to entry for a skill that was historically hoarded.
Reintroducing Artistry in a Transformed Landscape
Once we acknowledge that writing is a tool, we can reintroduce the idea of art. Not as a fragile skill that must be protected, but as a living process that adapts to its instruments.
Pencils did not destroy the paintbrush. Cameras did not destroy painting. Digital audio did not destroy music. Word processors did not destroy authorship.
Instead, each technology expanded what art allowed.
AI assisted writing is part of the same lineage. It does not eliminate human creativity. It reshapes it. It frees the writer to focus on meaning rather than mechanics. It challenges old hierarchies built on difficulty and exclusivity. It allows writing to flow more naturally from the mind to the page without being throttled by technical limitations.
AI cannot replace human intention. It can only help articulate it.
The Ego Wound of the Literate World
The resistance to AI writing reveals something uncomfortable. Many people do not fear artificial intelligence. They fear a loss of status. If anyone can now produce a polished piece of writing, then traditional markers of authority lose their weight. Entire identities have been built around being “good with words.” Artificial intelligence threatens this social currency by offering fluency without struggle.
This is why a punctuation mark has become a battleground. The em dash is not the issue. It is a vessel for insecurity. A convenient object through which people can channel their discomfort about a shifting cultural landscape.
A Punctuation Mark Having an Existential Crisis
Ironically, modern AI models no longer rely on em dashes the way early ones did. In response to criticism, they now avoid them more than many human writers. We have reached a paradox where:
Humans avoid em dashes to avoid looking like AI. AI avoids em dashes to avoid looking like AI. The em dash becomes a victim of a conflict it did not choose.
A punctuation mark is undergoing reputation damage for simply doing its job.
What Writing Becomes Next
If we accept that writing is evolving, then perhaps AI assisted writing is not a deviation from the essence of writing, but a continuation of it. Writing has always been a collaboration between mind and tool. From quills to keyboards to spellcheck, each generation has adapted its relationship with language.
AI is simply the next instrument in this long lineage.
The question is not whether writing remains “pure.” The question is whether writing continues to fulfil its purpose.
Can you express yourself more clearly? Can your ideas reach people they would not otherwise reach? Does this tool liberate your voice rather than constrain it?
If the answer is yes, then AI is not eroding writing. It is expanding it.
Conclusion: Free the Em Dash
The em dash is not a sign of artificial thought. It is a reminder that we often confuse stylistic details with deeper truths. Human authenticity has never lived in punctuation. It lives in intention. It lives in meaning. It lives in the desire to be understood.
So let the em dash breathe again. It was never a threat. Only a very old line caught in a very modern panic.