AI Cognitive Load, Mental Fatigue and Decision Offloading 2025
This page summarises findings from the Human Clarity Institute’s Cognitive Load, Fatigue & Decision Offloading 2025 dataset, examining how people experience mental effort, fatigue, and decision pressure when using AI and digital systems.
The dataset includes responses from 503 adults. In this sample, 82% report using AI-powered tools or features in their work or daily life.
View the Cognitive Load, Fatigue & Decision Offloading 2025 Dataset
Construct tags: Cognitive Load · Decision Dependence · Agency
What the data shows
Short breaks help clear the mind
Stepping away from AI or digital input helps restore mental clarity.
Checking AI accuracy drains focus
Judging whether AI information is accurate or trustworthy creates mental strain and consumes attention.
Thinking feels clearer after a day without AI
Extended time away from AI or digital systems restores clarity and reduces cognitive load.
Worry about becoming dependent on AI
Concern about dependence appears alongside sustained cognitive load and ongoing interaction effort.
Decisions can feel less personally connected when AI is involved
Under cognitive load, decision support can shift how decisions are experienced, making them feel less directly owned.
These findings show a consistent pattern: cognitive load is driven primarily by interaction effort, especially verification and prompt management, while recovery signals are strong. As this load builds, it begins to shape how decisions are experienced and how people relate to AI-supported choices.
What these signals mean
Verification effort is a major source of cognitive load
A large share (43% of the full sample) report that checking AI accuracy drains focus. This suggests cognitive load is often created by monitoring and validating AI output, not only by using AI to produce work.
Recovery signals are strong and observable
Many participants report clearer thinking after breaks, and after a full day without AI. This supports a recovery loop where boundaries and disconnection restore clarity.
Cognitive load can create pressure toward offloading
Concern about dependence appears alongside sustained interaction effort. This suggests that cognitive strain may contribute to a gradual shift toward relying on AI support, even when that reliance is not fully intentional.
Decision experience shifts under load
When cognitive load increases, decisions may feel less personally connected. This indicates that decision support can influence not only outcomes, but also how decisions are experienced.
Patterns associated with this experience
One pattern examined in this dataset is the relationship between cognitive verification effort and the mental load of guiding AI systems.
Across the full sample (n = 503):
- 43% say that checking whether AI information is accurate or trustworthy drains their focus.
- 32% say that guiding or re-phrasing prompts for AI tools feels mentally taxing.
When the two signals are analysed together, a clear concentration effect appears.
Prompt fatigue among those experiencing verification strain
Cognitive strain clusters among those managing both verification effort and prompt interaction.
Prompt fatigue among those without verification strain
Mental effort is markedly lower when verification strain is not present.
This means respondents who experience cognitive strain when verifying AI outputs are roughly three times more likely to also report mental effort when guiding AI prompts.
Rather than appearing evenly across the population, cognitive load signals cluster among individuals who experience both forms of interaction effort — verification work and prompt management — suggesting a subgroup experiencing compounded cognitive friction in AI-mediated tasks.
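The subgroup comparison above can be sketched as a small calculation over respondent-level flags. This is a minimal illustration, not the Institute's analysis code: the field names and the toy records are assumptions, standing in for the two survey items collapsed to agreement bands.

```python
# Hypothetical respondent-level flags. In the real dataset these would be
# 7-point items collapsed to agreement (5-7); the records here are toy data.
respondents = [
    {"verification_drain": True,  "prompt_fatigue": True},
    {"verification_drain": True,  "prompt_fatigue": False},
    {"verification_drain": False, "prompt_fatigue": False},
    {"verification_drain": False, "prompt_fatigue": True},
    {"verification_drain": False, "prompt_fatigue": False},
]

def rate(group, key):
    """Share of a subgroup agreeing with an item (denominator = subgroup size)."""
    return sum(r[key] for r in group) / len(group) if group else 0.0

# Split the sample by the verification-strain item, then compare the
# prompt-fatigue rate within each subgroup.
strained = [r for r in respondents if r["verification_drain"]]
unstrained = [r for r in respondents if not r["verification_drain"]]

prompt_fatigue_strained = rate(strained, "prompt_fatigue")      # 1/2 = 0.5
prompt_fatigue_unstrained = rate(unstrained, "prompt_fatigue")  # 1/3 ≈ 0.33

# The page's "roughly three times more likely" figure is this ratio
# computed on the real responses.
ratio = prompt_fatigue_strained / prompt_fatigue_unstrained
```

Note that the denominator changes with each subgroup, as described in the Data integrity section: the within-group percentages are shares of the strained and unstrained groups respectively, not of the full sample.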
Key signals supporting Cognitive Load
Persistent after-effects are not the dominant pattern for the full sample: 79% report no difficulty mentally switching off, and 82% do not report feeling uneasy or restless without AI access.
Questions this data can answer
Why does using AI sometimes feel mentally tiring?
Cognitive load is often driven by verification effort and prompt management, with many reporting that checking accuracy and guiding AI both require sustained mental effort.
Is the mental effort coming from writing prompts or checking AI answers?
Both contribute. Verification effort and prompt construction create overlapping sources of cognitive strain.
Do people think more clearly when they step away from AI?
Recovery signals show that breaks and time away from AI can restore clarity and reduce mental load.
Is it normal to worry about becoming dependent on AI?
Concern about dependence is common and appears alongside ongoing cognitive effort in AI-mediated environments.
Why do decisions sometimes feel less like my own when I use AI?
Under cognitive load, decision support can shift how decisions are experienced, making them feel less personally connected.
Why Can’t I Focus?
Why Can’t I Focus investigates how digital distraction fragments attention and how clarity about what matters most can help restore sustained focus.
Digital Fatigue and Energy
Digital Fatigue and Energy examines how digital life can deplete energy, intensify mental strain, and shape people’s sense of cognitive capacity.
Methodology
This dataset forms part of the Human Clarity Institute’s Human–AI Experience research programme, examining how AI tools and digital systems shape cognitive load, mental fatigue, and decision offloading in everyday use. The study uses a cross-sectional online survey design and focuses on descriptive patterns in how people experience AI-mediated thinking environments.
Data were collected on 18 November 2025 via the Prolific research platform from adults across English-speaking countries. Participants provided explicit informed consent for anonymised data publication as part of HCI’s open research programme.
Sampling & participants
- Clean dataset: 503 valid responses
- Countries: United Kingdom, United States, Australia, New Zealand, Ireland
- Eligibility: fluent in English
- Recruitment platform: Prolific
Participants were recruited using platform screening filters. The resulting dataset should be interpreted as a non-probability convenience sample and is not intended to represent national populations.
The cleaned dataset, variable dictionary, and reuse terms are publicly available through the HCI dataset repository: Cognitive Load, Fatigue & Decision Offloading 2025 Dataset
Data integrity
All percentages reported on this page are calculated from valid responses in the cleaned dataset (n = 503). Percentages are rounded to the nearest whole number for readability. Unless otherwise stated, summary percentages combine respondents selecting 5–7 on the 7-point agreement scale (slightly agree, moderately agree, or strongly agree).
Where wording refers to disagreement or non-agreement, percentages are calculated from the corresponding lower end of the same response scale. Where figures describe subgroup patterns, the denominator changes to the named subgroup rather than the full sample.
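The banding described above can be expressed as a short recoding step. This is an illustrative sketch under the stated convention (agreement = 5–7 on the 7-point scale); the variable names and the toy responses are assumptions, not drawn from the dataset.

```python
# Collapse a 7-point agreement item into the bands used on this page.
AGREE_BAND = {5, 6, 7}     # slightly / moderately / strongly agree
DISAGREE_BAND = {1, 2, 3}  # mirror band for "non-agreement" wording

responses = [7, 4, 5, 2, 6, 1, 3, 5]  # toy data, not from the dataset

def pct_in_band(values, band):
    """Percentage of valid responses falling in a band, rounded to a whole number."""
    return round(100 * sum(v in band for v in values) / len(values))

agree_pct = pct_in_band(responses, AGREE_BAND)
disagree_pct = pct_in_band(responses, DISAGREE_BAND)
```

The midpoint (4) falls in neither band, so agreement and non-agreement percentages need not sum to 100.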
In the “Patterns associated with this experience” section, the comparison figures are calculated within two defined groups: respondents who report verification-related focus drain and respondents who do not. These figures describe overlap between two survey items and should be interpreted as descriptive co-occurrence within the sample.
This dataset is exploratory and descriptive. It does not support causal inference, and results should be interpreted as observed patterns within the survey sample.
This dataset is released as open research to support transparent analysis of cognitive load and decision experience in AI-enabled environments.
Suggested citation:
Human Clarity Institute. (2025). Cognitive Load, Fatigue & Decision Offloading 2025 (Dataset). Human Clarity Institute.
DOI: https://doi.org/10.5281/zenodo.17636370
Data use and reuse terms are outlined in our Data Use & Disclaimer.
Explore further analysis on Human Clarity Insights, or browse the full collection of HCI research reports.