How People Experience AI Companionship and Human Connection in Digital Life — 2025 Data
This page summarises findings from the Human Clarity Institute’s AI Companionship & Human Connection 2025 dataset, based on 501 valid responses across six English-speaking countries. The research examines whether people experience AI tools as companion-like or emotionally supportive, how meaningful AI interaction feels relative to human connection, how clearly people maintain personal boundaries with AI, and whether AI interaction ever begins to substitute for real-world social contact.
View the AI Companionship & Human Connection 2025 Dataset
What the data shows
Four signals stand out most clearly in this dataset: most people do not experience AI as companion-like, most do not describe AI as a meaningful source of emotional support, human connection is usually seen as more meaningful than AI interaction, and most still report clear personal boundaries when using AI. At the same time, a notable minority feel comfortable sharing personal thoughts with AI, showing that personal disclosure is emerging even where companionship remains uncommon.
14% experience AI as companion-like
Base: valid responses to the companionship perception item. A clear majority score low, with 83% rating AI companionship 1–3 on the 7-point scale.
12% receive meaningful emotional support or comfort from AI
Base: valid responses to the emotional support item. Most report very low emotional support, with 80% scoring 1–3.
85% say AI interaction feels less meaningful than human interaction
Base: valid responses to the meaning comparison item. Only 7% rate AI interaction as more meaningful than human interaction (5–7).
69% report clear personal boundaries with AI
Base: valid responses to the boundary clarity item. Only 15% report unclear boundaries (1–3), suggesting most maintain a clear psychological distinction between themselves and AI tools.
Overall, the data point to a pattern where meaning remains anchored in human relationships, while AI is more often experienced as a tool or private interaction space than as a true substitute companion. The dataset also shows that identity boundaries remain mostly intact, even though a sizeable minority are willing to disclose personal thoughts to AI.
By the numbers (from HCI data)
Emotional safety with AI is polarised
Base: valid responses to the emotional safety comparison item. 41% feel less emotionally safe with AI than humans (1–3), while 41% feel more emotionally safe with AI than humans (5–7).
39% feel comfortable sharing personal thoughts or feelings with AI
Base: valid responses to the disclosure comfort item. Meanwhile, 48% report low comfort (1–3), pointing to a meaningful divide in how people relate to AI personally.
69% say AI rarely or never replaces talking to people
Base: valid responses to the conversational substitution item. 31% report at least some replacement (sometimes or more often), including 5% who report very often or every day.
99% use AI-powered tools at least occasionally
Base: valid responses to the AI use frequency item. 27% report everyday use.
Spend 5+ hours online per day
Base: valid responses to the daily time online item. This includes 26% who estimate 9+ hours online per day.
Use AI for information and research
Base: valid responses to the AI use case multi-select item. Other common use cases include work tasks (50%), learning and studying (46%), and decision-making (39%).
30% report high loneliness frequency
Base: valid responses to the loneliness frequency item. 58% score loneliness low (1–3).
61% report high satisfaction with relationship quality
Base: valid responses to the relationship satisfaction item. 26% report low relationship satisfaction (1–3).
Patterns observed in the data
AI is not widely experienced as companionship, but personal disclosure is emerging
The clearest topline pattern is that AI companionship and emotional support remain uncommon experiences in this sample. However, comfort sharing personal thoughts with AI is noticeably higher than companionship perception itself. That suggests many people do not see AI as a companion, yet some do use it as a low-friction space for expression, processing, or private reflection.
Meaning remains anchored in human connection
Most respondents say AI interaction feels less meaningful than human interaction. This indicates that for most people, even frequent AI use does not erase the qualitative difference between interacting with a system and connecting with another person. The dataset therefore points to a strong meaning gap between AI contact and human relationship experience.
Identity boundaries mostly remain clear
A large majority report clear personal boundaries with AI, which is one of the most important protective signals in this dataset. Even where disclosure comfort is present, many respondents still appear to maintain a clear sense that AI is separate from the self and separate from human relationship space.
Emotional safety is divided, not uniform
Responses comparing emotional safety with AI and humans are evenly split across the positive and negative ends. This suggests AI does not feel emotionally safe or unsafe in any single uniform way. For some people, AI may feel easier to open up to because of lower fear of judgement; for others, the lack of human understanding or trust makes it feel less safe.
AI rarely replaces human connection, but substitution exists
Most say AI rarely or never replaces talking to other people, but nearly one in three report at least some level of substitution. That does not support a “replacement for everyone” narrative, but it does identify a meaningful minority where AI interaction is beginning to intersect with real-world conversational behaviour.
What these findings suggest
This dataset suggests that AI companionship is not the dominant lived experience, even among a highly AI-exposed sample. Most people still place greater meaning in human interaction and retain clear personal boundaries when using AI. That is the central pattern.
At the same time, the data also show that some people are becoming comfortable with AI as a space for personal disclosure, emotional expression, or occasional conversational substitution. This means the most important emerging signal is not full companionship replacement, but a more gradual shift in how AI enters emotionally adjacent parts of daily life.
Questions this data can answer
These questions reflect common real-world queries about AI companionship, emotional support, meaning, disclosure, boundaries, and conversational substitution. Each answer below is supported directly by this dataset.
Do people use AI as a companion?
14% rate AI tools as companion-like (5–7), while 83% rate companionship low (1–3).
Do people get emotional support from AI tools like ChatGPT?
12% report meaningful emotional support or comfort from AI (5–7), while 80% report low emotional support (1–3).
Is talking to AI as meaningful as talking to a person?
85% rate AI interaction as less meaningful than human interaction (1–3). Only 7% rate AI as more meaningful (5–7).
Do people feel emotionally safe opening up to AI?
Responses are polarised: 41% feel less emotionally safe with AI than humans (1–3) and 41% feel more emotionally safe with AI than humans (5–7).
Are people comfortable sharing personal thoughts or feelings with AI?
39% report high comfort sharing personal thoughts with AI (5–7), while 48% report low comfort (1–3).
Do people have clear boundaries when using AI?
69% report clear personal boundaries when interacting with AI (5–7). 15% report unclear boundaries (1–3).
Does AI replace talking to people?
69% report that AI never or rarely replaces talking to people. 31% report at least some replacement, including 5% who say very often or every day.
How common is everyday AI use?
27% report using AI tools every day. Overall, 99% report using AI at least occasionally.
How many people feel lonely?
30% rate loneliness frequency as high (5–7), while 58% rate loneliness low (1–3).
Are people satisfied with their relationships?
61% rate satisfaction with relationship quality as high (5–7), while 26% rate relationship satisfaction as low (1–3).
Feeling Connected in an AI World
Explore evidence-based questions about AI companionship, loneliness, emotional support, conversational substitution, personal disclosure, and how people relate to AI in socially sensitive contexts.
Trust, Reality & Uncertainty in the AI Era
Related evidence on emotional safety, trust boundaries, disclosure comfort, perceived risk, and how people judge what feels safe or appropriate in AI-mediated interaction.
Methodology
This dataset forms part of the Human Clarity Institute’s Human–AI Experience research programme, examining how people experience AI companionship, emotional support, meaning, boundary clarity, disclosure comfort, loneliness, and conversational substitution in digitally mediated life. The study uses a cross-sectional online survey design and focuses on descriptive patterns rather than causal inference.
Data were collected on 19/12/2025 via Prolific. Participants provided explicit consent for anonymised open publication as part of HCI’s open research programme.
Sampling & quality controls
- Final n: 501
- Countries: New Zealand, United States, United Kingdom, Ireland, Canada, Australia
- Eligibility: Fluent English speakers
- Recruitment platform: Prolific
- Attention checks: One explicit attention check; failing responses excluded
- AI-deception traps: None
- Anonymisation: Prolific IDs and timestamps removed before publication
The resulting dataset is a non-probability convenience sample and is not intended to represent national populations.
The cleaned dataset, variable dictionary, and reuse terms are publicly available through the HCI dataset repository: AI Companionship & Human Connection 2025 Dataset →
Data integrity
All percentages reported on this page are calculated from valid responses in the cleaned dataset. Percentages are rounded to the nearest whole number for readability. Unless otherwise stated, summary percentages refer to the stated grouping for that item, such as 5–7, 1–3, or a named response category.
Item bases vary slightly across the page due to missing responses. Where questions use comparison items, grouped categories, or named response categories such as never / rarely or sometimes or more often, the wording on the page makes that explicit.
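The grouping convention described above can be sketched in code: percentages are computed over valid responses only, bucketed into low (1–3) and high (5–7) bands on the 7-point scale, then rounded to whole numbers. This is a minimal illustration of the convention, not the Institute's published analysis code, and the sample scores are invented for the example.

```python
# Minimal sketch of this page's percentage convention (assumed, not the
# published analysis code): percentages are computed over valid responses
# only, grouped into 1-3 ("low") and 5-7 ("high") bands, then rounded.

def band_percentages(scores):
    """Return rounded low/mid/high shares over valid 7-point responses."""
    # Missing responses (None) and out-of-range values are excluded,
    # so each item's base is the count of valid responses only.
    valid = [s for s in scores if s is not None and 1 <= s <= 7]
    n = len(valid)

    def share(lo, hi):
        # Percentage of valid responses falling in [lo, hi], rounded
        # to the nearest whole number for readability.
        return round(100 * sum(lo <= s <= hi for s in valid) / n)

    return {
        "low (1-3)": share(1, 3),
        "mid (4)": share(4, 4),
        "high (5-7)": share(5, 7),
        "valid n": n,
    }

# Hypothetical item with ten responses, one of them missing:
example = [1, 2, 2, 3, 4, 5, 6, 1, 3, None]
print(band_percentages(example))
# -> {'low (1-3)': 67, 'mid (4)': 11, 'high (5-7)': 22, 'valid n': 9}
```

Because each band is rounded independently and the midpoint (4) is reported separately, low and high shares on this page will not generally sum to 100, which is why figures such as 83% low and 14% high can coexist on one item.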
This dataset is exploratory and descriptive in nature. It does not support causal inference and results should be interpreted as observed patterns within the survey sample.
This dataset is released as open research to support transparent analysis of meaning, companionship perception, personal disclosure, boundary clarity, loneliness, and human connection in the age of AI.
Data use and reuse terms are outlined in our Data Use & Disclaimer.
Explore further analysis on Human Clarity Insights, or browse the full collection of HCI research reports.