How People Experience AI Companionship and Human Connection in Digital Life — 2025 Data

This page summarises findings from the Human Clarity Institute’s AI Companionship & Human Connection 2025 dataset, based on 501 valid responses across six English-speaking countries. The research examines whether people experience AI tools as companion-like or emotionally supportive, how meaningful AI interaction feels relative to human connection, how clearly people maintain personal boundaries with AI, and whether AI interaction ever begins to substitute for real-world social contact.

View the AI Companionship & Human Connection 2025 Dataset

Construct tags: Meaning Coherence · Identity Stability · Behavioural Alignment

What the data shows

Four signals stand out most clearly in this dataset: most people do not experience AI as companion-like, most do not describe AI as a meaningful source of emotional support, human connection is usually seen as more meaningful than AI interaction, and most still report clear personal boundaries when using AI. At the same time, a notable minority feel comfortable sharing personal thoughts with AI, showing that personal disclosure is emerging even where companionship remains uncommon.

How percentages are defined on this page: unless otherwise stated, summary figures on this page report the share of respondents who selected 5–7 (the top three points) on the 7-point scale for the named experience or judgement. Where a figure refers to a low score grouping (1–3), a comparison item, or a named response category, that is stated explicitly in the wording. Item bases vary slightly due to missing responses.
14%

Experience AI as companion-like

Base: valid responses to the companionship perception item. A clear majority score low instead, with 83% rating AI companionship 1–3 on the 7-point scale.

12%

Receive meaningful emotional support or comfort from AI

Base: valid responses to the emotional support item. Most report very low emotional support, with 80% scoring 1–3.

85%

Say AI interaction feels less meaningful than human interaction

Base: valid responses to the meaning comparison item. Only 7% rate AI interaction as more meaningful than human interaction (5–7).

69%

Report clear personal boundaries with AI

Base: valid responses to the boundary clarity item. Only 15% report unclear boundaries (1–3), suggesting most maintain a clear psychological distinction between themselves and AI tools.

Overall, the data point to a pattern where meaning remains anchored in human relationships, while AI is more often experienced as a tool or private interaction space than as a true substitute companion. The dataset also shows that identity boundaries remain mostly intact, even though a sizeable minority are willing to disclose personal thoughts to AI.

By the numbers (from HCI data)

41% / 41%

Emotional safety with AI is polarised

Base: valid responses to the emotional safety comparison item. 41% feel less emotionally safe with AI than with humans (1–3), while 41% feel more emotionally safe with AI than with humans (5–7).

39%

Feel comfortable sharing personal thoughts or feelings with AI

Base: valid responses to the disclosure comfort item. At the same time, 48% report low comfort (1–3), showing a meaningful divide in how people relate to AI personally.

69%

Say AI rarely or never replaces talking to people

Base: valid responses to the conversational substitution item. 31% report at least some replacement (sometimes or more often), including 5% who report very often or every day.

99%

Use AI-powered tools at least occasionally

Base: valid responses to the AI use frequency item. 27% report using AI every day.

68%

Spend 5+ hours online per day

Base: valid responses to the daily time online item. This includes 26% who estimate 9+ hours online per day.

82%

Use AI for information and research

Base: valid responses to the AI use case multi-select item. Other common use cases include work tasks (50%), learning and studying (46%), and decision-making (39%).

30%

Report high loneliness frequency

Base: valid responses to the loneliness frequency item. 58% score loneliness low (1–3).

61%

Report high satisfaction with relationship quality

Base: valid responses to the relationship satisfaction item. 26% report low relationship satisfaction (1–3).
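The multi-select figures above (such as the AI use case item, where one respondent can tick several options) can be tallied from the open dataset. As a hedged sketch only: the delimiter and the raw-answer format below are assumptions for illustration, not the published variable dictionary, and the sample values are toy data rather than HCI responses.

```python
from collections import Counter

def multiselect_shares(responses, delimiter=","):
    """Whole-number percentage of valid respondents selecting each
    option in a multi-select item. Blank or None answers are treated
    as missing and excluded from the item base, matching the page's
    'valid responses' convention."""
    valid = [r for r in responses if r and r.strip()]
    counts = Counter()
    for r in valid:
        # De-duplicate within one response so an option counts once per person.
        for option in {part.strip() for part in r.split(delimiter)}:
            counts[option] += 1
    base = len(valid)
    return {opt: round(100 * n / base) for opt, n in counts.items()}

# Toy data only (not HCI figures):
sample = [
    "Information and research, Work tasks",
    "Information and research",
    "Learning and studying",
    None,  # missing response, excluded from the base
]
print(multiselect_shares(sample))
```

Because the base is valid respondents rather than total selections, the shares can sum to more than 100%, which is why the use-case figures on this page (82%, 50%, 46%, 39%) exceed 100% in total.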

Patterns observed in the data

AI is not widely experienced as companionship, but personal disclosure is emerging

The clearest topline pattern is that AI companionship and emotional support remain uncommon experiences in this sample. However, comfort sharing personal thoughts with AI is noticeably higher than companionship perception itself. That suggests many people do not see AI as a companion, yet some do use it as a low-friction space for expression, processing, or private reflection.

Meaning remains anchored in human connection

Most respondents say AI interaction feels less meaningful than human interaction. This indicates that for most people, even frequent AI use does not erase the qualitative difference between interacting with a system and connecting with another person. The dataset therefore points to a strong meaning gap between AI contact and human relationship experience.

Identity boundaries mostly remain clear

A large majority report clear personal boundaries with AI, which is one of the most important protective signals in this dataset. Even where disclosure comfort is present, many respondents still appear to maintain a clear sense that AI is separate from the self and separate from human relationship space.

Emotional safety is divided, not uniform

Responses comparing emotional safety with AI and humans are evenly split across the positive and negative ends. This suggests AI does not feel emotionally safe or unsafe in any single uniform way. For some people, AI may feel easier to open up to because of lower fear of judgement; for others, the lack of human understanding or trust makes it feel less safe.

AI rarely replaces human connection, but substitution exists

Most say AI rarely or never replaces talking to other people, but nearly one in three report at least some level of substitution. That does not support a “replacement for everyone” narrative, but it does identify a meaningful minority where AI interaction is beginning to intersect with real-world conversational behaviour.

What these findings suggest

This dataset suggests that AI companionship is not the dominant lived experience, even among a highly AI-exposed sample. Most people still place greater meaning in human interaction and retain clear personal boundaries when using AI. That is the central pattern.

At the same time, the data also show that some people are becoming comfortable with AI as a space for personal disclosure, emotional expression, or occasional conversational substitution. This means the most important emerging signal is not full companionship replacement, but a more gradual shift in how AI enters emotionally adjacent parts of daily life.

Methodology

This dataset forms part of the Human Clarity Institute’s Human–AI Experience research programme, examining how people experience AI companionship, emotional support, meaning, boundary clarity, disclosure comfort, loneliness, and conversational substitution in digitally mediated life. The study uses a cross-sectional online survey design and focuses on descriptive patterns rather than causal inference.

Data were collected on 19/12/2025 via Prolific. Participants provided explicit consent for anonymised open publication as part of HCI’s open research programme.

Sampling & quality controls

  • Final n: 501
  • Countries: New Zealand, United States, United Kingdom, Ireland, Canada, Australia
  • Eligibility: Fluent English speakers
  • Recruitment platform: Prolific
  • Attention checks: One explicit attention check; failing responses excluded
  • AI-deception traps: None
  • Anonymisation: Prolific IDs and timestamps removed before publication

The resulting dataset should be interpreted as a non-probability convenience sample and is not intended to represent national populations.

The cleaned dataset, variable dictionary, and reuse terms are publicly available through the HCI dataset repository: AI Companionship & Human Connection 2025 Dataset →

Data integrity

All percentages reported on this page are calculated from valid responses in the cleaned dataset. Percentages are rounded to the nearest whole number for readability. Unless otherwise stated, summary percentages refer to the stated grouping for that item, such as 5–7, 1–3, or a named response category.

Item bases vary slightly across the page due to missing responses. Where questions use comparison items, grouped categories, or named response categories such as never / rarely or sometimes or more often, the wording on the page makes that explicit.
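As a sketch of how those grouping conventions translate into the page's headline figures, a grouped share over valid responses can be computed as follows. The toy scores below are illustrative, not HCI data, and the function is an assumption about the convention described above rather than the institute's actual analysis code.

```python
def grouped_share(scores, low, high):
    """Share (rounded to the nearest whole percent) of valid responses
    falling in the inclusive band [low, high] on the 7-point scale.
    Missing values (None) are excluded from the item base, matching
    the page's 'valid responses' convention."""
    valid = [s for s in scores if s is not None]
    in_band = sum(low <= s <= high for s in valid)
    return round(100 * in_band / len(valid))

# Toy example: 10 valid scores on a 1-7 scale plus one missing value.
scores = [1, 2, 2, 3, 3, 4, 5, 5, 6, 7, None]
high_share = grouped_share(scores, 5, 7)  # the 5-7 "high" grouping
low_share = grouped_share(scores, 1, 3)   # the 1-3 "low" grouping
print(high_share, low_share)
```

Note that the two groupings need not sum to 100%, because scores of 4 (the scale midpoint) fall in neither band; this is why paired figures on this page, such as 14% high versus 83% low for companionship, leave a small remainder.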

This dataset is exploratory and descriptive in nature. It does not support causal inference and results should be interpreted as observed patterns within the survey sample.

This dataset is released as open research to support transparent analysis of meaning, companionship perception, personal disclosure, boundary clarity, loneliness, and human connection in the age of AI.

Data use and reuse terms are outlined in our Data Use & Disclaimer.

Explore further analysis on Human Clarity Insights, or browse the full collection of HCI research reports.