At the Human Clarity Institute, we’re starting to see early signs of a growing tension: when everything online looks real, the line between truth and simulation blurs — reshaping how humans decide what (and who) to believe.
We expect this tension to sharpen in the coming years. The internet has always required discernment, but the new wave of AI-generated media makes even basic truth-testing emotionally tiring. People begin to scroll with half-belief and half-defence, a quiet shift from confident engagement to ambient scepticism.
These early patterns connect closely with what we’re already seeing in our research on focus and digital energy. When trust thins, attention scatters; when attention scatters, fatigue rises.
What our 2025 data shows
- Low trust in AI stewardship: ~50% say they do not trust big tech companies at all to use AI responsibly; a further 28% trust them only slightly.
- Authenticity matters: 26% say they would be very bothered, and 18% extremely bothered, to discover that content was AI-generated to keep them engaged or to influence their decisions.
- Uncertain detection: only ~23% feel very or extremely confident that they can tell human-made content from AI-generated content; ~35% feel only slightly confident, or not at all confident.
Source: HCI Digital Life 2025 dataset (n = 1,003).
Why this matters
Trust isn’t collapsing in a single moment; it appears to be thinning across everyday interactions. People report feeling annoyed, cheated or deceived when they discover a post, review or video wasn’t human-made. That emotional sting, once occasional, may evolve into a background posture of doubt. As doubt becomes ambient, two downstream effects emerge: first, engagement cools — we invest less attention and empathy; second, verification shifts to the individual — everyone becomes their own fact-checker. Both are cognitively expensive, and both can erode focus and digital energy.
Future signals we’re watching
- Growth in AI voice scams and deepfake incidents driving “verification fatigue”.
- Rising demand for provenance markers and “verified human content”.
- Shifts in attention patterns as people become more cautious, selective and slower to trust.
We expect these signals to intensify over the next year as synthetic media becomes more seamless and more embedded in daily life.
Have you discovered that something online wasn't real, and found it changed how you felt afterwards? We're collecting early experiences that may inform future research. Share your story with HCI.