AI Decision Dependence & Cognitive Caution 2025 (Dataset)
A de-identified open dataset of 201 adults, capturing how AI-assisted decision-making influences confidence, reliance, self-doubt, cognitive caution, and perceived influence of AI on personal judgement.
Measures include:
- AI-decision reliance
- confidence shifts when using AI tools
- second-guessing behaviours
- perceived influence and persuasiveness of AI suggestions
- feelings of independence when deciding without AI
- digital life exposure
- clarity on what matters most
- trust in AI systems
- values participants believe AI should reflect
- demographic variables across six English-speaking countries
Part of the Human Clarity Institute’s AI–Human Experience Data Series.
Framework
HRL domain(s): Agency & Decision Autonomy
Registry Construct Alignment: Agency, Decision dependence
Listed constructs reflect longitudinal, registry-mapped item alignment and do not represent the full thematic scope of this dataset.
DOI and Repository Links
This dataset is archived on GitHub, Zenodo, and Figshare for long-term preservation.
Citation
APA
Human Clarity Institute. (2025). AI Decision Dependence & Cognitive Caution 2025 (Dataset). Human Clarity Institute. https://doi.org/10.5281/zenodo.17772518
BibTeX
@dataset{hci_ai_decision_dependence_cognitive_caution_2025,
  author  = {Human Clarity Institute},
  title   = {AI Decision Dependence \& Cognitive Caution 2025 (Dataset)},
  year    = {2025},
  doi     = {10.5281/zenodo.17772518},
  url     = {https://humanclarityinstitute.com/datasets/ai-decision-dependence-cognitive-caution-2025/},
  license = {CC-BY-4.0}
}
Licence
Creative Commons Attribution 4.0 International (CC BY 4.0)
You are free to share, adapt, and build upon this dataset for any purpose, including commercial use, provided appropriate credit is given to the Human Clarity Institute.
Full licence text: https://creativecommons.org/licenses/by/4.0/
Study Methodology
This dataset forms part of the Human Clarity Institute’s Human–AI Experience research programme, examining how people use AI in decision-making, how AI affects confidence and judgement, where independence still matters, and where caution about over-reliance begins to emerge. The study uses a cross-sectional online survey design and focuses on descriptive patterns in AI-assisted decision behaviour, perceived influence on thinking, decision confidence, autonomy, trust in AI systems, and cognitive caution in digitally mediated life.
Data were collected via the Prolific research platform from adults across six English-speaking countries. Participants provided explicit informed consent for anonymous open publication as part of HCI’s open research programme.
Sampling & participants
- Clean dataset: 201 valid responses
- Countries: United Kingdom, United States, Canada, Australia, New Zealand, Ireland
- Eligibility: Adults (18+) in English-speaking countries
- Recruitment platform: Prolific
- Anonymisation: Prolific IDs removed; timestamps stripped
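Because the dataset is descriptive and exploratory, a typical first pass is per-country summary statistics on a single measure. The sketch below assumes a CSV export with a `country` column and per-measure numeric columns; the filename and column names are illustrative assumptions, not the published schema, so check the archived files on Zenodo or Figshare for the actual layout.

```python
import pandas as pd


def summarise_by_country(df: pd.DataFrame, measure: str) -> pd.DataFrame:
    """Return per-country descriptive statistics (count, mean, sd) for one measure."""
    return (
        df.groupby("country")[measure]
        .agg(["count", "mean", "std"])
        .sort_values("mean", ascending=False)
    )


if __name__ == "__main__":
    # Hypothetical filename and column name; substitute the actual ones
    # from the archived dataset files.
    df = pd.read_csv("ai_decision_dependence_2025.csv")
    print(summarise_by_country(df, "ai_reliance"))
```

As the study limitations note, such summaries describe the sample only; the non-probability design does not support population-level or causal claims.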
Study limitations
- The survey uses a non-probability convenience sample and is not nationally representative.
- Results are based on self-reported responses and reflect perceived experiences of AI-assisted decision-making, confidence, independence, and over-reliance.
- The study uses a cross-sectional design, capturing responses at a single point in time.
- The dataset is descriptive and exploratory and does not support causal inference.
The Agency Gap
The Agency Gap explores how AI consultation may begin shaping human decision-making while people still experience responsibility and judgement as their own.
Why Can’t I Focus?
Why Can’t I Focus? investigates how digital distraction fragments attention and how clarity about what matters most can help restore sustained focus.
Data use and reuse terms are outlined in our Data Use & Disclaimer.