It is Christmas time. The season of giving, peace, goodwill, and apparently, weaponised pop-ups.
This morning, I opened my computer with the pure intention of doing something wholesome. I made a coffee and prepared to write this article. Instead, I was greeted by a full-screen demand from my ad blocker. The very tool I rely on to protect me from digital harassment proudly informed me that it had blocked 7,085 ads, and asked whether I would like to purchase premium.
There is something almost poetic about being pressured by the software that is supposed to protect me from pressure.
It is like hiring a bodyguard who immediately holds out a hand and says, I saved your life. Pay up or next time, who knows.
And that was before I even opened a browser.
Welcome to the Pop-Upocalypse.
A Landscape of Interruption
If you have attempted Christmas shopping online in recent years, you already know the terrain.
You click onto a site. It begins innocently enough. And then:
SIGN UP FOR 10 PERCENT OFF
WAIT, DO NOT LEAVE
HAVE YOU ACCEPTED OUR COOKIES
CHOOSE BETWEEN FIFTY TRACKING PREFERENCES
LIMITED TIME OFFER JUST FOR YOU
ALLOW NOTIFICATIONS
It is like being assaulted by a chorus of overexcited salespeople bursting out of broom cupboards every fifteen seconds.
Most neurotypical people hate it. Neurodivergent people find it worse. It is a sensory gauntlet, a cognitive assault, a hostile environment built to override autonomy.
The question is: why do we tolerate it? And more importantly, why does it exist at all?
Why Pop-Ups Exist: The Gory Truth
Pop-ups, overlays, cookie walls, and forced signups do not exist by accident. They are not examples of bad design. They are intentional psychological manipulation, backed by data and defended by money.
Pop-ups work.
Not on everyone. Not even on most people. But on enough people.
If a pop-up annoys ninety-five percent of visitors and successfully pressures two percent into acting, marketers celebrate. Investors approve. Designers are told to do more of that.
This is because the modern internet does not care whether you feel respected, informed, or at ease.
It cares about conversions. A beautifully dystopian word that refers to the process of transforming a human being into a measurable event.
Click. Signup. Purchase. Obedience.
That is the true currency of the online Christmas shopping season.
Not joy. Not generosity. Not the spirit of giving.
Conversions.
Hostile Architecture, Digital Edition
We talk about hostile architecture in public spaces. Anti-homeless spikes, benches that prevent rest, gates that quietly funnel people in profitable directions.
Online shopping is built the same way.
• Dark patterns
• Time pressure tactics
• Interruptive overlays
• Intentionally confusing cookie settings
• Limited stock claims that magically reset
• Buttons designed to look like one thing but act like another
Even the fonts and colours are chosen to trigger specific instinctive responses.
This is not a marketplace. It is a behavioural laboratory, and we are test subjects.
The Neurodivergent Problem
For neurodivergent people (autistic, ADHD, sensory-sensitive, or cognitively overloaded), these interruptions are not slightly annoying.
They are disorienting. They are overwhelming. They are stressful. They can be genuinely painful.
They disrupt the flow of thought. They derail working memory. They force unexpected decisions at high frequency. They punish focus and reward impulsivity.
Yet it is our reactions that are treated as atypical. Not the manipulative design itself.
The truth is that the design is hostile to everyone. Neurodivergent people are simply more honest about their discomfort.
The Bold Conclusion: This Is Not Normal, and It Is Not Benign
Somewhere along the line, the internet shifted from a tool we use to a machine that uses us.
Christmas shopping should be peaceful and even joyful. Instead, we are treated as prey, nudged and pressured and interrupted until the system gets what it wants.
I am sickened by it. I think we should all be.
The more we accept this digital coercion as normal, the more it becomes the baseline from which future manipulations will escalate.
How To Protect Yourself, or at Least Defend Your Sanity
A few practical strategies:
Use an aggressive ad blocker, for example uBlock Origin rather than lightweight imitators
Enable cosmetic filtering to remove non-ad pop-ups
Shop via product search rather than homepages
Use reader mode wherever possible
Leave sites that treat you like a conversion target
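For readers who want a concrete starting point on the cosmetic filtering point, uBlock Origin accepts custom rules under Dashboard, My filters. A cosmetic rule hides a page element rather than blocking a network request, which is what removes newsletter overlays and "do not leave" boxes. The hostnames and class names below are illustrative; any real site will use different ones:

```
! Hide a newsletter overlay and its backdrop on a hypothetical shop
shop.example##.newsletter-modal
shop.example##.modal-backdrop
! Generic rule, applied on every site (broad, may occasionally break a page)
##.popup-overlay
```

In uBlock's filter syntax, `##` marks a cosmetic (element-hiding) rule using an ordinary CSS selector, and lines beginning with `!` are comments.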
Nothing terrifies a manipulative company more than being ignored.
Above all, recognise manipulation when you feel it.
Your disgust is not an overreaction. It is your sovereignty speaking.
During a season that is supposed to celebrate humanity, generosity, and connection, perhaps the most radical act is to reclaim your own mind from a system that keeps trying to pop up over it.
“We have made changes to how you collect and spend your Nectar points.”
There is something about a sentence like that, calm on the surface but quietly signalling that the rules have shifted, that immediately puts me on alert. Changes to how you collect and spend your points is not a minor technical tweak. It is a foundational adjustment to how the entire system works.
Just a few lines into the email, beneath a short justification about “maintaining the security of your points,” came the statement that confirmed my unease:
“From 27 October you will need to use the QR code in the Nectar app to collect and spend points at Sainsbury’s.”
No explanation. No alternatives. No acknowledgement of how significant that instruction really is. It was presented as if it were the most natural thing in the world.
For me, this was an immediate alarm bell. It did not read like a harmless update. It read like the quiet conversion of a long-standing physical system into a digital one. A shift from loyalty card to loyalty app, framed as security rather than as a fundamental change in customer interaction.
WHEN LOYALTY SCHEMES BECOME DIGITAL GATEWAYS
Loyalty schemes used to be simple. You carried a physical card, you scanned it, you collected points, and you occasionally exchanged those points for something modest. There were no hidden conditions and no digital obligations. A card was a card, nothing more.
Today the loyalty card is becoming something else entirely. More companies are shifting these schemes into smartphone apps, and with that shift comes a completely different relationship between customers and the business.
On the surface, an app looks like a modern convenience. In reality, it introduces several changes that are rarely acknowledged.
First, an app becomes a data harvesting vessel. Every interaction can be logged and analysed. This includes what you buy, when you buy it, the patterns in your purchases, the frequency of visits, the times you tend to shop, and even the products you pause to consider. That data is used to predict and influence behaviour. It becomes the foundation for targeted marketing, personalised nudges and subtle shaping of buying habits.
Second, an app creates a direct marketing channel through notifications. These can be promotional messages, reminders, alerts about offers or time sensitive prompts designed to draw you into the store more frequently. Notifications bypass the customer’s conscious choice to engage. They appear on your locked phone and rely on the psychological pull of visual prompts.
Third, apps allow companies to make significant changes without asking for consent. Updates are often automatic. Terms can shift. Features can be added or removed without warning. A tool that begins as a simple way to check your points can gradually evolve into something more controlling. By installing the app, customers open themselves up to potential bait and switch tactics where the purpose and behaviour of the app can change over time.
None of these concerns exist with a physical card. A card does not track behaviour. A card does not send notifications. A card cannot silently update itself.
This is why the wording in the Nectar email did not feel like a minor update. It felt like another step in a wider transformation. Optional apps are becoming expected apps. Expected apps are becoming required apps. What was once a convenient extra is becoming the main path, while everything outside the app becomes more limited or more awkward.
With this context in mind, the announcement that customers “will need to use the QR code in the Nectar app” did not feel like progress. It felt like the opening of a different kind of relationship, one built on increasing digital reliance rather than genuine customer choice.
MY INITIAL CONCERNS
My immediate reaction was concern for accessibility and fairness.
Many people do not use smartphones. Many do, but keep them intentionally minimal. Many avoid unnecessary apps for privacy, storage or mental health reasons. Many have disabilities that make smartphone use difficult. Some people, like me, prefer communication that is simple and text based and do not rely on apps unless necessary.
These customers deserve the same level of access as everyone else, and the Nectar update did not explain how they would be supported. The all-or-nothing tone of the customer email felt like a push toward a system that may not suit everyone.
I wanted clarity. I wanted to know whether the change was genuinely necessary. I wanted to know whether it had a real security basis. I wanted to know how it affected non app users. And I wanted someone at Sainsbury’s to explain the contradiction between their language of flexibility and the instruction that customers “will need” to use the app.
So I wrote to them.
THE EMAIL I SENT
My message was polite and straightforward. I raised four simple points.
First, I asked why the QR system was needed and what problem it solved. Second, I asked if customers who do not use the app would be able to continue collecting and spending points. Third, I asked what alternatives actually exist in practice. Finally, I asked how Sainsbury's reconciled the firm wording of the customer email with their stated ongoing commitment to fairness and accessibility.
It felt like a reasonable approach.
THEIR FIRST REPLY
The response from the Executive Office sounded reassuring at first. It spoke about improved security and improved efficiency. It claimed that QR codes allow for encrypted data transfer and that this reduces the risk of misuse. It also insisted that the Nectar app was not mandatory and that customers could still use their physical Nectar card via the magnetic strip.
Under closer inspection, the reassurance did not hold up.
There was no explanation of what encryption actually meant in this context. QR codes and barcodes both present visible identifiers, so the claim did not make technical sense without further detail. None was provided.
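To be fair to the underlying technology, there is one way a QR code could genuinely improve security: not because the code itself is encrypted, but because an app can regenerate it with a short-lived token, so a photographed code soon expires. A static QR, like a barcode, is just a picture of a fixed identifier and can be copied freely. The sketch below shows the rotating-token idea, TOTP-style, using only Python's standard library; the card number and secret are invented for illustration, and nothing here describes how Nectar actually works:

```python
import hashlib
import hmac
import struct
import time

def rotating_token(card_id, secret, period=30, now=None):
    """Derive a short-lived token from a static card ID, TOTP-style.

    A QR code encoding a token like this expires every `period` seconds,
    so a screenshot quickly becomes useless. A QR code encoding only the
    static card_id is exactly as copyable as a barcode.
    """
    if now is None:
        now = time.time()
    counter = int(now // period)  # which time window we are in
    msg = card_id.encode() + struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha256).hexdigest()
    return digest[:12]  # truncated for display in a QR code

# Hypothetical values, not real Nectar internals
secret = b"server-side-secret"
t1 = rotating_token("29930012345678", secret, now=1000.0)
t2 = rotating_token("29930012345678", secret, now=1010.0)  # same 30s window
t3 = rotating_token("29930012345678", secret, now=1031.0)  # next window
print(t1 == t2, t1 == t3)  # True False
```

Had the Executive Office described something like this, the security claim would have made technical sense. They did not.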
There was no clarification of what security issue the change was addressing. There was no mention of any misuse linked to barcodes.
Most importantly, there was a clear contradiction. The customer email said that shoppers “will need to use the QR code in the Nectar app.” The Executive Office said the app was not mandatory.
The two positions could not both be correct.
I decided to ask for more detail.
MY FOLLOW UP QUESTIONS
I asked what encryption they were referring to and at what stage it is applied. I asked how QR codes are less vulnerable to misuse than barcodes. I asked whether there were any documented security incidents involving barcodes. I asked how the customer email and the executive reassurance could both be true. I asked whether Sainsbury’s had any intention to move toward mandatory app usage in the future or to limit functionality for those who do not use the app.
Every question was clear and reasonable.
THEIR FINAL RESPONSE
Their final reply was brief:
Thank you for your email and raising further concerns. Regrettably, I’m unable to provide any further comments to what I have already shared. I’m so sorry for any disappointment this may cause.
No clarification. No explanation. No evidence. No answers.
The conversation ended there.
When a company is unable or unwilling to explain its own decisions, that silence becomes part of the story. In this case, it was very revealing.
WHAT THEIR SILENCE REVEALS
The refusal to answer the key questions suggested several things.
If QR codes offered real security benefits, Sainsbury’s would have been able to explain them. If barcodes had been misused or cloned, they would have been able to provide examples. If the app was genuinely optional, they would have been able to clarify the contradiction between the two messages.
None of this happened.
It is difficult to avoid the conclusion that the language of security was used as a convenient justification rather than as a genuine explanation.
The unwillingness to discuss future intentions also stood out. If there were no plans to increase app dependency, it would have been very easy to say so. The fact that the question went unanswered speaks for itself.
This pattern is becoming common across modern systems. Optional digital tools gradually replace physical ones. Convenience slowly becomes expectation. Expectation becomes requirement. By the time customers realise what has happened, the change is already complete.
WHO GETS LEFT BEHIND
Digital only systems do not affect all customers equally.
Those without smartphones are excluded. Those who avoid unnecessary apps are pressured. Those with disabilities face new barriers. Those with mental health conditions that make digital engagement difficult are sidelined. Those who value privacy lose options. Those who prefer predictable, low friction systems are made to feel out of place.
These experiences are rarely acknowledged in corporate messaging. The narrative focuses on convenience and modernisation, while those who cannot or do not participate digitally are treated as acceptable losses.
The Nectar update may seem small, but it reflects a growing cultural shift: the smoothest path is reserved for those who comply with digital expectations. Everyone else is given slow lanes, workarounds or reduced functionality.
CLOSING REFLECTION
My exchange with Sainsbury’s will not change the direction of a major corporation, but it still mattered to me. I asked questions that deserved answers. I pointed out contradictions. I raised concerns about accessibility. I approached the issue calmly and respectfully.
They chose not to engage with the substance of those questions.
The refusal became part of the story. It revealed how easily convenience becomes compulsion, and how quickly the language of security is used to mask deeper changes in customer control.
Small acts of resistance matter. They expose patterns that are otherwise silent. They help others recognise similar pressures in their own lives. They remind us that opting out is not unreasonable. And they show that asking for clarity is a valid response to vague or contradictory messaging.
A loyalty scheme should make life easier. It should not require loyalty to an app. And if a company chooses to head in that direction, the least it can offer is an honest explanation.
I grew up in the 1980s, when television advertising still had a kind of charm. I remember the jingles, the mascots, the catchy slogans that managed to lodge themselves in your head for weeks. Even as a child, I knew they were trying to sell me something, but at least they did it with some flair. They felt like part of the entertainment itself.
Something has changed since then. Advertising is no longer something that interrupts culture; it has become the culture. Every space, every platform, and every idle moment now feels colonised by a hidden intention to sell. To understand how we arrived here, it is worth tracing how advertising has evolved from a loud street-side performance to an invisible system of persuasion that shapes our sense of self.
The Loud Salesmen
The earliest form of advertising was brutally honest. Ancient merchants shouted in markets, painted signs on walls, or hung banners above their stalls. When mass printing emerged in the 1800s, advertising became more widespread but no less direct. Newspapers were filled with promises of miracle tonics, soap that made you beautiful, and pills that cured everything from toothache to heartbreak. These were primitive, manipulative, and often fraudulent, but at least you knew what you were looking at. Someone was selling, and you were free to walk away.
The Mad Men Era
The 20th century transformed advertising into an art form. With the rise of radio and television, storytelling became the new language of persuasion. Campaigns no longer sold only a product; they sold an identity, a dream, a way of life. The Coca-Cola Santa Claus, the Marlboro Man, and the perfect suburban family all came from the same creative laboratories.
This was the era of the “ad man,” immortalised in cultural artefacts like Bewitched or later Mad Men. Advertising was portrayed as a glamorous profession. These were the people who didn’t just reflect society; they helped build it. The line between commerce and culture began to blur.
The 80s and 90s: Ads as Entertainment
By the 1980s and 1990s, advertising had taken on a theatrical quality. It was playful, colourful, and memorable. Mascots like Tony the Tiger, slogans like “Just Do It,” and tunes you could hum all day made adverts feel like short pieces of performance art. They were still manipulative, of course, but they wore their intentions openly.
Looking back, perhaps this is why many people from my generation recall old ads with a strange fondness. They were transparent. They worked hard to win your attention rather than simply steal it.
The Weird and Annoying Years
Somewhere in the late 1990s and early 2000s, advertising lost its balance. It became surreal, loud, and deliberately irritating. Think of Crazy Frog, the Budweiser frogs, or the unnerving Burger King mascot. Annoyance became a marketing tool. If something got stuck in your head, even out of frustration, the job was done.
This was the period when “going viral” became a goal before social media even existed. The absurdity was the message.
The Internet Disruption
When the internet arrived, advertising was clumsy but eager. Early banner ads were brightly coloured, flashing boxes that you could easily ignore. But the industry adapted quickly. As data collection improved, advertising became personal. It stopped shouting to the crowd and began whispering to the individual.
This marked the rise of surveillance capitalism. Every click, search, and pause became a data point. You were no longer a passive audience member; you were a psychological profile to be targeted. The salesman had followed you home and was now reading your mind.
The Age of Disguise
By the 2010s, advertising learned to hide in plain sight. Sponsored posts, influencer endorsements, and “native” content made it difficult to tell where information ended and manipulation began. Search engines, news sites, and social platforms quietly filled with ads disguised as genuine results.
South Park once parodied this perfectly with its storyline about intelligent ads (Season 19). It was satire, but it was also prophecy. Today, even image searches are littered with sponsored results. The ad no longer wants to be seen; it wants to be believed.
Culture as Commerce
This is the stage we now find ourselves in. Advertising has stopped orbiting culture and instead absorbed it completely. Everything is for sale, including identity itself.
People no longer ask “What do I like?” but “What do I subscribe to?” We define ourselves through brands and platforms: Apple or Android, Nike or Adidas, Netflix or Disney Plus. Even rebellion is commercialised. You can buy “authenticity,” but only if you can afford the price tag.
Advertising has achieved what no political ideology ever could. It has replaced meaning with marketing and turned culture into a series of brand alignments.
Conclusion: From Persuasion to Colonisation
Advertising began as a voice shouting in the marketplace. It evolved into storytelling, then spectacle, then infiltration. Today it is everywhere and nowhere, woven into the fabric of our reality.
The change that occurred over the last century is more than technological. It is philosophical. Advertising no longer sells products; it sells identities. It shapes our desires before we even know we have them.
Perhaps that is why so many of us feel weary. We are not just tired of being sold to; we are tired of living inside the sale itself.
On the bus home, I overheard a parent talking to her children. I did not quite catch the piece of information she had given them, but the kids questioned it, as kids often do. Her reply made me pause: “It’s true, ChatGPT says so!”
That simple sentence carries more weight than it might appear. It was not said as a joke. It was said with the tone of final authority. Not “I read it somewhere,” not “I think that is the case,” but “ChatGPT says so,” therefore unquestionable.
The problem with treating AI as truth
I use ChatGPT casually and often. I find it useful, I find it stimulating, I even find it creative. But it is not infallible. I have seen it throw out confident answers that are less than accurate. Sometimes the error is small, sometimes it is glaring. That is because at its core, ChatGPT is not a library or a fact checker. It is a probabilistic language model that predicts likely answers. It sounds authoritative, but sounding right is not the same thing as being right.
Most of the errors are not malicious. They come from the quirks of how AI is built: training data full of human errors, the tendency to fill in gaps with plausible-sounding fiction, and the limits of knowledge cut-off dates. In the end, a wrong answer is still a wrong answer.
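To make "predicts likely answers" concrete, here is a deliberately tiny sketch of the core idea, nowhere near the scale or sophistication of a real model, but the same principle: count which word tends to follow which, then emit the most probable continuation. The corpus and names are invented for illustration:

```python
from collections import Counter, defaultdict

# A toy "language model": bigram counts from a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word):
    """Return the statistically most likely next word.

    Note what this does NOT do: check whether the answer is true.
    It only reports what most often came next in the training data.
    """
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # "cat" - the most frequent follower of "the"
```

Scale that principle up by billions of parameters and you get fluent, confident prose. What you do not get, at any scale, is a guarantee of truth.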
The deeper worry
The everyday mistakes are one thing. The bigger concern is what happens when society decides to place absolute trust in this technology. What happens when “ChatGPT says so” becomes the modern equivalent of “the newspaper says so,” or “the priest says so”?
Who controls the voice of AI? Already, the way models are tuned and filtered reflects the biases and priorities of those who own them. Today, that mostly means corporations trying to avoid lawsuits or public backlash. Tomorrow, it could mean governments steering the flow of truth itself.
A quiet dystopia in the making
It is not hard to imagine where this road leads:
Manipulation by design: If AI becomes our main gateway to knowledge, its answers could be quietly weighted towards selling us certain products, services, or lifestyles. Imagine if every “neutral” recommendation subtly nudged us toward a sponsor’s brand.
Steering public opinion: If authorities lean on AI providers to promote certain narratives, inconvenient truths could simply disappear. Instead of burning books, it may take only a few lines of code.
Illusion of neutrality: Because AI sounds impartial, many will not notice the framing. “The algorithm says so” could become more persuasive than “the news says so.”
Feedback loops of control: As people rely more on AI, its outputs shape popular thinking. Then the next model is trained partly on that shaped thinking, reinforcing the bias.
This would not look like a science fiction dictatorship with jackboots in the streets. It would feel comfortable, easy, polite. A velvet cage where questions stop being asked because the answers are always ready to hand.
What we need instead
AI can be a tool. It can be helpful, creative, and even liberating. But it must never be treated as an unquestionable authority. To prevent that slide, we need:
Decentralisation: open source models that anyone can run and check.
Transparency: clarity about how these systems are trained and filtered.
Critical thinking: a culture where people are encouraged to question AI, not bow to it.
Diversity of sources: books, journalism, lived experience, and human reasoning must remain part of the conversation.
AI is here to stay, and it will almost certainly become a central part of how we live and learn. But whether it becomes a tool of empowerment or a velvet cage of manipulation depends not only on the companies that build it, but on us: on how much we insist on questioning, cross checking, and keeping the human spirit of doubt alive.
We have all heard the popular idea that it takes 10,000 hours of practice to master a skill. Play your guitar for that long and you will be a virtuoso. Paint for that long and you will know the brush like your own fingers. Write for that long and you will dance fluently with language.
Here is the uncomfortable question that is rarely asked in motivational seminars: What if you have been putting in your hours, but into becoming something you never intended to be?
The Brain Does Not Care What You Practice
Your brain is a pattern-making machine that rewards repetition. It does not stop to ask whether the habit you are building is good for you, whether it aligns with your values, or whether it is slowly strangling your spirit.
If you have spent years submitting to systems, you are not just surviving. You are learning to submit. You are becoming fluent in self-silencing, pleasing authority, and clock-watching.
This is why “I have been doing this for years” is not always a badge of honour. Sometimes it means you have spent years perfecting a cage.
Work as a Covert Training Ground
The workplace can be a breeding ground for this kind of unintentional mastery. A dead-end job does not only give you a payslip. It gives you muscle memory for compliance.
You get good at the customer service smile. You get good at keeping your head down when things are not right. You get good at swallowing the words you actually want to say.
Clocking in and zoning out is not neutral. It is conditioning. It is training you to keep existing inside a box, even when the lid is wide open.
When Mastery Becomes Entrapment
There is a cruel irony in becoming excellent at something you never wanted in the first place.
“They say I am great at my job,” you tell yourself. But is it a job you truly chose? Or is it a job you got trapped in because you became too good at surviving it?
Once you have invested thousands of hours into a coping strategy, it can become harder to leave it behind. You have built identity around it. You have mastered the art of endurance in a place that does not deserve your loyalty.
The Sword Cuts Both Ways
Mastery is not inherently good. It is simply focus repeated over time. The sword cuts both ways.
You can become a master of freedom, creativity, and self-direction. You can also become a master of obedience, self-erasure, and learned helplessness.
You are always becoming something. The question is: is it something you would choose?
Redemption Through Repatterning
The good news is that mastery can be rewired. Every skill you have mastered in the service of survival can be repurposed for something better.
The adaptability you learned under pressure can fuel your creativity. The patience you built in monotonous routines can become the discipline that drives your art. The diplomacy you honed with unreasonable bosses can become a superpower for navigating your own projects and relationships.
Awareness is the first cut that breaks the loop. From that moment, every hour you spend becomes an act of reclamation.
Do not just chase mastery. Ask yourself, mastery of what? And in service of whom?
Your 10,000 hours are precious. Spend them like they matter.
An exploration of value, manipulation, and the silent industry built on who we are.
Most people know their data is being harvested. Fewer understand why. Even fewer understand how the money is made. And far too many have simply accepted it — like digital rent we pay to exist online.
So let’s break it down. No jargon. Just truth.
Why is ‘data’ so valuable?
Because data is the closest thing to knowing you without asking you. It’s a digital mirror, built piece by piece: your clicks, your searches, your pauses, your swipes, your hesitations. What you want. What you fear. What you’ll do next.
To corporations, that’s not just information, it’s predictive power. And predictive power is profitable.
Data lets systems:
Predict behaviour
Shape desire
Optimise systems
Automate decisions
And, in some cases, control outcomes
It’s not just metadata. It’s meta-you. And in an economy obsessed with efficiency and influence, there’s nothing more valuable.
Why is there a culture of data being harvested for profit?
Because the internet changed business models forever.
Once upon a time, you paid for software. Then came "free". Free email. Free social networks. Free AI chatbots. Free games. Free news. Free everything. Except it was never really free.
You became the product.
Advertising evolved into surveillance. Terms of service bloated into digital contracts you’ll never read. Every app you download is a tiny spy, and every cookie is a crumb leading somewhere profitable.
It’s not a conspiracy. It’s worse. It’s design.
Behind every “personalised experience” is an unspoken rule:
“If we can learn something about you, we will. If we can monetise it, we must.”
How exactly is profit made from data?
Here’s the quiet truth: most of the web runs on one industry: behavioural targeting.
Advertising: Your data builds a profile. That profile is auctioned off to advertisers. You get ads tailored to your weaknesses. Every click is income. The more they know, the more they can charge.
Data brokerage: Shadow companies buy and sell your data like a commodity. Health data. Location data. Shopping habits. They don't need your name, just your pattern.
Manipulation: Platforms don't just predict your behaviour. They shape it. Algorithms steer your feed toward content that keeps you engaged, enraged, or primed to spend.
AI training: Your voice, your photos, your words are used to train models. These models are sold back to businesses or used to automate services. You become unpaid labour.
Pricing power: Ever notice different prices for the same thing? That's data-driven pricing. If your profile says "desperate", you'll be charged more. Welcome to dynamic capitalism.
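The "auctioned off" step above is literal. Real-time bidding systems commonly run a sealed-bid auction in the milliseconds before a page loads, often a second-price variant where the winner pays just above the runner-up's bid. The sketch below shows that pricing rule under those assumptions; the advertiser names, bids, and profile are all invented for illustration:

```python
def run_second_price_auction(bids):
    """Sealed-bid second-price auction: the highest bidder wins the ad
    impression but pays the second-highest bid (a common RTB pricing rule)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Hypothetical advertisers valuing one impression for one user profile.
# Bidders who were shown more of the profile can bid more precisely.
bids = {
    "ShoeBrandA": 2.40,  # profile matches their target audience exactly
    "ShoeBrandB": 1.90,
    "GenericAds": 0.30,  # knows nothing about this user, bids low
}
winner, price = run_second_price_auction(bids)
print(winner, price)  # ShoeBrandA 1.9
```

The detail worth noticing is that each bid depends on how much of your profile the bidder was shown, which is exactly why richer data commands higher prices at auction.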
What now?
Maybe we shrug and accept it. Maybe we don’t. But at the very least, let’s stop pretending we’re not involved.
Data isn’t some passive trail we leave behind. It’s a living, breathing version of us, digitised and repackaged. And while we’re busy being human, our shadows are being sold.
So next time someone says, “I’ve got nothing to hide,” maybe ask them: