How TikTok, Infinite Scroll, and Algorithmic Design Are Quietly Rewiring Our Sense of Reality - And Driving a Surge in Derealisation

Derealisation is rising, and today's feeds may be part of the cause. This deep investigation reveals how short-form video, infinite scroll, and algorithmic design overwhelm perception, distort time, and quietly reshape our inner lives.


At 3 a.m., the glow of a phone screen cuts through the dark. On TikTok, the algorithm serves a seamless reel of close-up faces, sped-up storytimes, and whisper-tone voiceovers. The next clip shifts perspective again - now a shaky first-person view, a flash of light, a jump-cut to something else.

In Reddit threads such as r/derealization and r/Depersonalization, young people describe what happens next: the sense that their own surroundings suddenly flatten, as though the world has slipped one frame out of sync.

  • "After scrolling for hours, I got this feeling like everything was behind glass," one user wrote.
  • "Even my hands didn't feel real."
  • "TikTok puts me in a trance. When I stop, the world feels fake," another added.

These are not isolated anecdotes. Posts tagged #derealization have collectively drawn tens of millions of views on TikTok and Instagram, where creators narrate the same eerie sensation - disconnection from body or environment after hours of short-form scrolling.

Researchers haven't yet established a direct causal link between algorithmic content and dissociative symptoms. But a growing number of clinicians and journalists have begun to notice the overlap between feed design and derealisation-like experiences.


MindSite News reported on how TikTok's algorithm amplified videos about Dissociative Identity Disorder (DID), sometimes promoting traumatic-experience content to vulnerable users and "blurring the line between education and contagion."

The phenomenon has begun to attract informal terminology: algorithm-induced derealisation - a shorthand for when the interface itself seems to erode one's sense of presence. But there’s a bigger theme here. 

Something I keep finding myself circling back to in my reporting is how fast we are changing. For years I've been documenting what happens at the collision point between human psychology and digital design, and the same pattern keeps appearing: the human nervous system is updating far slower than the tools we're placing in front of it. Experiences are evolving in months; the language to describe them takes years; formal diagnosis takes decades.

This gap matters. Because when you're sitting in the therapy room with someone who can tell you, in vivid detail, what it feels like to "wake up from the feed" or "drop back into reality after three hours of scrolling," you quickly realise how useless the old frameworks can feel.

Your clients, and maybe your loved ones, are already living in conditions that the textbooks haven't even named yet. The mental health profession is accustomed to slow-moving knowledge: revisions every 10-15 years, steady consensus, cautious language. But the digital environment is mutating weekly.

It means clinicians, parents, educators - all of us - can no longer rely on inherited language to understand what's happening to people right now. We have to build a vocabulary in real time, often by listening to the people who feel these shifts before any institution has caught up.

So here's the tension I want to explore: What happens when an entire generation begins reporting symptoms that don't map neatly onto anything psychiatry has a name for - yet feel undeniably real?

And if derealisation is being shaped not just by trauma or anxiety, but by the architecture of our digital environments... what else is already changing inside us, quietly, without a proper name?

What Is Derealisation? Naming the Feeling

Clinically, derealisation describes that eerie, disorienting moment when the world feels... off. Not unreal exactly, but thinner, flatter, as if the emotional colour has drained out of the room. Its close relative, depersonalisation, turns the lens inward: the sense of watching yourself move and speak from a slight distance, as though your life has slipped into a third-person view.

The DSM-5 groups these phenomena under Depersonalisation/Derealisation Disorder (DPDR), defining it as "persistent or recurrent experiences of unreality or detachment" typically linked to trauma, severe anxiety, or overwhelming sensory input. Though classified as a dissociative disorder, these symptoms also show up across depression, panic disorder, and PTSD.

Decades of work - from researchers like Mauricio Sierra and German Berrios - have mapped out the territory. Episodes can last anywhere from minutes to years, and roughly 2% of people experience it as a chronic condition. MRI studies show reduced activity during derealisation states in brain regions involved in emotional processing, such as the insula and limbic system. So the core experience isn't new. What's changing is the environment in which it emerges.

Where older case studies linked derealisation to extreme stress, psychedelics, or sensory deprivation, younger people are increasingly pointing to something far more banal - and far more ubiquitous: prolonged digital immersion. Hours spent inside perfectly tuned, high-intensity feeds. Days shaped by algorithmic pacing and the emotional whiplash of infinite scroll.


On TikTok, and now Instagram, a whole visual grammar has formed around this feeling: the "DPDR aesthetic." Washed-out filters. Slowed-down ambient audio. Captions like "I feel like I'm dreaming but I'm awake." These videos aren't diagnoses; they're distress signals - attempts to render a dissociative state visible in the only language a platform affords.

And clinicians are beginning to notice the consequences of this new vocabulary. A 2025 paper in Human Psychopharmacology: Clinical and Experimental reviewed the rise of self-diagnosed dissociative disorders online and observed how many users describe their symptoms through TikTok's popular aesthetics and linguistic conventions rather than clinical criteria. The authors warn that while these communities can offer validation, they can also intensify symptoms through social mirroring and the algorithm's tendency to feed back whatever someone lingers on.

For mental-health practitioners, this cultural shift poses a quiet but profound challenge: What happens when language born inside the feed becomes the primary vocabulary clients use to make sense of their distress outside it? Because if we're relying on clinical categories that move at the pace of committees - while dissociative experiences are evolving at the pace of platforms - we risk talking about two different realities entirely.

Anatomy of the Algorithmic Trance

Modern feeds aren't media; they're perceptual environments - engineered, tuned, and continuously optimised to shape how the brain feels from one second to the next. We talk about scrolling as if it's passive, but the system is doing far more than showing us videos. It's manipulating arousal, novelty, and emotional volatility with algorithmic precision.

TikTok is the clearest example. The average video is now around 21 seconds long, and users swipe through hundreds of clips in a single session. Each swipe carries the tiny thrill of maybe: maybe the next clip will shock, soothe, amuse, disturb. Psychologists call this the variable reward loop - the same reinforcement structure that underpins gambling addiction. It's the uncertainty, not the payoff, that keeps the nervous system hooked. And that nervous system pays a price.
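The variable reward loop is easiest to feel in a toy simulation. Everything below is an assumption for illustration - the `swipe_session` function, the fixed per-swipe "jackpot" probability, and the numbers are stand-ins, not TikTok's actual mechanics. The point is only that rewards arrive at unpredictable intervals, the signature of a variable-ratio schedule:

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def swipe_session(p_hit=0.08, n_swipes=200):
    """Simulate a feed where each swipe has a small, fixed chance of
    surfacing a highly rewarding clip. Returns the gaps (in swipes)
    between rewarding clips - an unpredictable schedule of payoffs."""
    gaps, since_last = [], 0
    for _ in range(n_swipes):
        since_last += 1
        if random.random() < p_hit:
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = swipe_session()
print(gaps)  # short runs and long droughts, with no pattern to learn
```

Because the next hit could always be one swipe away, there is never a natural moment where stopping feels "finished" - which is exactly what makes the schedule so sticky.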

Rapid camera shifts, sudden audio jumps, and tight, face-dominant framing strip away spatial context. Sensory integration becomes harder. A recent study found that overstimulation and loss of attentional control are reliable triggers for transient dissociation. In essence: what the platform calls "engagement," the body registers as perceptual overload.


Then there's the infinite scroll - a design decision that erases the natural cues that normally tell us to pause, reflect, or reset. A 2021 study found that people underestimate time spent in infinite-scroll interfaces by as much as 50%. This distortion maps closely onto one of derealisation's core features: time flattening, the sense that continuity has thinned out or slipped away.

Design critic Kyle Chayka has a name for the feeling this produces: algorithmic anxiety - the subtle unease of navigating systems that are too fast, too personalised, too eerily attuned to our impulses. Our nervous systems, shaped over millennia for slow rhythms and predictable patterns, are suddenly forced into rapid-fire novelty-seeking punctuated by micro-moments of shock, humour, disgust, awe. The emotional palette becomes jump-cut.

When sensory input arrives faster than emotional processing can keep up, the mind does what it's evolved to do under strain: it protects itself. It dims, creating that familiar derealised fog - the sense of watching life rather than inhabiting it. And that's the uncomfortable truth sitting beneath our daily scroll: the feed inadvertently becomes a training ground for dissociation.

Not because people want to detach, but because the system rewards the very attentional fragmentation it relies on. Feeds don't want deep, thoughtful exploration; they want your attention fragmented and flattened, so you can glide across 500 videos without much memory of what you've just seen.

The Aesthetic of Dissociation

On TikTok, derealisation isn't just discussed - it's performed. Scroll the tags #derealisation and #dpdr, and a distinctive visual grammar emerges: desaturated colour palettes, soft-focus filters, slowed-down ambient soundtracks, captions like "I don't feel real anymore" or "Everything looks like a dream."

What began as symptom-sharing has evolved into an aesthetic language of unreality. The "DP/DR aesthetic" trades in the visual codes of trance - blurred edges, looping motion, echoing reverb - reproducing both the experience and the emotion of detachment. For many, this is therapeutic shorthand; a way to translate an internal state into something visible.

But the same system that helps users find solidarity also risks deepening the sensation it portrays. TikTok's algorithm doesn't distinguish between confession and contagion; it amplifies whatever keeps people watching.


According to MindSite News' investigation into TikTok's coverage of Dissociative Identity Disorder, the platform's recommendation system "actively surfaces content that reinforces identification with dissociative symptoms," creating micro-communities that blend awareness, validation, and self-pathologising.

The effect is a feedback loop of dissociation: content about derealisation begets more of it, and repeated exposure to those sensory patterns - hazy visuals, slowed audio, second-person narration - may mimic the perceptual qualities of the state itself. 
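The mechanics of that feedback loop can be sketched in a few lines. This is purely illustrative - the topic names, the `rebalance` function, and the linear watch-time boost are assumptions, not any platform's real recommender. It shows only the core dynamic: whatever a user lingers on takes up a growing share of what they're shown next:

```python
def rebalance(weights, watch_time, rate=0.5):
    """Shift feed weights toward the topics with the most watch time,
    then renormalise so the shares still sum to 1."""
    boosted = {t: w * (1 + rate * watch_time.get(t, 0.0))
               for t, w in weights.items()}
    total = sum(boosted.values())
    return {t: w / total for t, w in boosted.items()}

feed = {"comedy": 0.45, "dissociation": 0.10, "news": 0.45}

# The user pauses longest on dissociation-themed clips; the loop
# compounds that signal over five recommendation passes.
for _ in range(5):
    feed = rebalance(feed, {"dissociation": 1.0, "comedy": 0.2})

print(feed)  # dissociation's share of the feed climbs with each pass
```

A topic that starts as a sliver of the feed can dominate it within a handful of passes - no intent required, just compounding attention.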

Within online subcultures, this phenomenon isn't framed as pathology but as community. On Reddit, users speak of "DP/DR cores" - personalised feeds built entirely around dissociation content - while others post scripts to "escape the void" by flooding their For You Page with comedy or "high-texture" videos. It's a form of self-directed exposure therapy, albeit one orchestrated by an algorithm rather than a clinician.

This is the paradox of algorithmic empathy: the same network that validates isolation can also aestheticise and recycle it. What begins as naming the pain can become a digital echo chamber of unreality - an ambient environment where feeling "not real" is both symptom and style.

The Human Cost

For many people, derealisation isn't a concept - it's a daily impairment. School becomes hard to navigate when the world looks "two-dimensional." Relationships strain when emotions feel muted. And beneath it all sits a quiet fear: What if this never ends?

What's shifting - both in clinics and online - is where people say they learned the language for these experiences. Increasingly, young people arrive naming symptoms they first encountered on TikTok. MindSite News' investigation showed how the recommendation system pushes dissociation-themed content toward users who linger even briefly, blurring education, identification, and misinformation.

DID and DP/DR are different conditions, but the mechanism matters: repeated exposure trains attention toward dissociative frameworks, and some viewers begin to interpret their own distress through that lens.

There's empirical backing for this. Research on socially transmitted nocebo effects (when negative expectations picked up from others worsen symptoms) shows that simply watching people talk about their experience can produce those same symptoms in the viewer. I wrote about mental health's contagious qualities a while back.


We've seen this before. During the Covid pandemic, something strange started appearing: "TikTok tics", functional tic-like behaviours that proliferated among teens who binge-watched tic content. Studies are starting to show that symptoms can spread when vulnerable or suggestible people are repeatedly exposed to those symptoms, even from afar.

Now pair this with other studies linking problematic smartphone use to anxiety, depression, sleep disruption, and dissociative experiences, and a pattern starts to emerge. These devices don't directly cause these things, but they can help create the conditions for them to emerge.

What Needs to Happen Next

If derealisation is being shaped - even partially - by the architecture of our digital environments, then treating it can't stop at grounding exercises and reassurance. The problem isn't only internal; it's ecological. And ecological problems require systemic responses.

Here's what needs to change.

1. Clinicians Need a New Intake Model

For decades, clinicians have asked about sleep, substances, trauma, panic, and medication. Very few ask: What does your feed look like? How fast is it? What themes does it push you toward?

We need digital-exposure screening that is as routine as screening for caffeine or alcohol:

  • What platforms are you using?
  • How often does the content skew toward dissociation, trauma, or unreality?
  • Do you feel different before versus after scrolling?
  • What is your time distortion like?

Not because the feed "causes" symptoms, but because it shapes how they appear, how people interpret them, and how long they persist.

2. Psychoeducation Must Evolve

Clients deserve to know what researchers already understand:

  • Symptoms can spread through social modelling.
  • Negative expectations can amplify distress via nocebo mechanisms.
  • Algorithmic feeds intensify whatever holds attention - including dissociation themes.

This knowledge isn't stigmatising; it's stabilising. It reframes derealisation from "I'm losing my mind" to "My nervous system is responding predictably to overload, expectation, and repetition."


3. Platforms Must Measure What They Influence

None of the major platforms publicly track metrics for:

  • post-feed perceptual fatigue
  • time distortion
  • dissociative-like states
  • compulsive re-entry despite discomfort

They track everything except the experiences most relevant to mental health.

Regulators have begun nudging in this direction - the UK Online Safety Act, the EU Digital Services Act, KOSA in the US - but none explicitly address dissociation or perceptual harm. This is a blind spot big enough to drive the attention economy through.

Platforms should be required to:

  • conduct mental-health impact assessments (MHIAs)
  • release aggregate data on perceptual and emotional side effects
  • test new features for cognitive strain and dissociative risk
  • allow independent researchers access to anonymised behavioural data

This isn't punitive. It's the minimum standard for products that function as daily sensory environments.

4. Design Ethics Must Catch Up to Neuroscience

We can no longer pretend infinite scroll, autoplay, and variable reinforcement are neutral. These features pull on the same cognitive systems implicated in derealisation: attention, sensory integration, emotional salience, and time perception.

A design ethics fit for the nervous system would include:

  • Friction by default: natural stopping points, optional scroll limits.
  • Transparent recommender logic: clearer signals about why certain content clusters appear.
  • Slower sensory pacing for younger or vulnerable users.
  • User-controlled intensity settings - an accessibility feature we already accept in gaming and VR.

If a platform can adapt content to preference, it can adapt pacing to wellbeing.

5. Researchers Need Funding to Study the Actual Question

Right now, derealisation research is forced to borrow from adjacent fields - nocebo effects, functional neurological disorders, problematic smartphone use - because we lack direct studies on post-scroll dissociation.

We need:

  • lab studies measuring dissociation before and after short-form feed exposure
  • longitudinal studies tracking symptom development in high-exposure cohorts
  • qualitative interviews with young people who experience algorithm-linked unreality
  • cross-disciplinary collaborations between HCI, psychiatry, cognitive neuroscience, and design ethics

Without this, clinicians are forced to rely on pattern recognition and triangulation rather than evidence - an uncomfortable position for a field built on empiricism.


6. We Need a Shared Vocabulary

Both clinicians and clients are stuck between languages:

  • Clinical terms that move too slowly
  • Platform-born terms that are often metaphorical
  • Symptoms that sit somewhere between dissociation, overload, immersion, and expectation

Building that shared vocabulary means taking platform-born terms seriously as data, then testing them against what clinical research already knows.

When Reality Becomes a Moving Target

At some point we have to admit the obvious: reality itself is becoming harder to hold onto.

Not because people are weaker. Not because phones are evil. But because we've built a sensory environment that moves faster than the mind can metabolise - and then told people that the disorientation they feel is a personal failing rather than an ambient condition of modern life.

Derealisation used to be something that arrived after trauma or panic. Now it shows up quietly, after a scroll that went on too long, or a night spent sinking into streams built to collapse the boundaries between attention, emotion, and time. Young people describe stepping back into the physical world the way divers surface from deep water - blinking, unsteady, unsure which reality is the original.

And we have no diagnostic language for this. No shared map. No framework that can hold the strangeness of symptoms shaped partly by the nervous system and partly by the machinery of the feed. So we tell people to breathe. To ground. To rest. But what they really need is recognition: this isn't just you - the environment is shifting beneath all of us.


The task ahead isn't to moralise or catastrophise. It's simpler and harder: to slow down the places where reality frays, to ask better questions about the worlds we're building, and to create spaces sturdy enough for people to feel real again.

Because if platforms continue to rewrite perception at industrial speed, the work of mental health will increasingly become the work of reminding people that they are here, in bodies, in time, in the thick texture of the world - not just passing through someone else's algorithmic dream.

When I first started researching this topic, I thought it was just an unusual corner of the world I explore every week: how technology shapes, changes, and affects us. But the longer I looked, the more I realised that derealisation isn't just a symptom anymore. It's a signal - telling us that something at the intersection of design and humanity has gone off balance.

And if we listen carefully, it's also an invitation: to rebuild the conditions that make reality feel like a place we can safely live.