People use AI for companionship much less than we’re led to believe

By IQ TIMES MEDIA · June 26, 2025 · 3 min read

The flood of attention paid to people turning to AI chatbots for emotional support, and sometimes even striking up relationships with them, can make such behavior seem commonplace.

A new report from Anthropic, the maker of the popular AI chatbot Claude, reveals a different reality: people rarely seek companionship from Claude, turning to the bot for emotional support and personal advice only 2.9% of the time.

“Companionship and roleplay combined comprise less than 0.5% of conversations,” the company highlighted in its report.

Anthropic says its study sought to unearth insights into the use of AI for “affective conversations,” which it defines as personal exchanges in which people talked to Claude for coaching, counseling, companionship, roleplay, or advice on relationships. Analyzing 4.5 million conversations that users had on the Claude Free and Pro tiers, the company said the vast majority of Claude usage is related to work or productivity, with people mostly using the chatbot for content creation.
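
To make the arithmetic behind those shares concrete, here is a minimal, purely illustrative sketch of how category percentages can be tallied once each conversation has been assigned a label. This is not Anthropic's actual pipeline; the conversation records and category names below are hypothetical.

```python
# Illustrative sketch only -- the data and category labels are hypothetical,
# and this is not Anthropic's actual analysis pipeline.
from collections import Counter

# Assume each conversation has already been assigned one top-level category.
labeled_conversations = [
    {"id": 1, "category": "content_creation"},
    {"id": 2, "category": "coaching"},
    {"id": 3, "category": "content_creation"},
    {"id": 4, "category": "companionship"},
]

counts = Counter(c["category"] for c in labeled_conversations)
total = sum(counts.values())

# Shares like the report's 2.9% (emotional support and advice) or under 0.5%
# (companionship and roleplay) are simply category counts over the total.
for category, n in counts.most_common():
    print(f"{category}: {100 * n / total:.1f}% of conversations")
```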

Image credits: Anthropic

That said, Anthropic found that people do use Claude more often for interpersonal advice, coaching, and counseling, with users most often asking for advice on improving mental health, personal and professional development, and studying communication and interpersonal skills.

However, the company notes that help-seeking conversations can sometimes turn into companionship-seeking in cases where the user is facing emotional or personal distress, such as existential dread or loneliness, or when they find it hard to make meaningful connections in their real life.

“We also noticed that in longer conversations, counseling or coaching conversations occasionally morph into companionship — despite that not being the original reason someone reached out,” Anthropic wrote, noting that extensive conversations (more than 50 human messages) were not the norm.

Anthropic also highlighted other insights, such as how Claude itself rarely resists users’ requests, except when its programming prevents it from crossing safety boundaries, for example by providing dangerous advice or supporting self-harm. Conversations also tend to become more positive over time when people seek coaching or advice from the bot, the company said.

The report is certainly interesting — it does a good job of reminding us yet again of just how much and how often AI tools are being used for purposes beyond work. Still, it’s important to remember that AI chatbots, across the board, are still very much a work in progress: They hallucinate, are known to readily provide wrong information or dangerous advice, and as Anthropic itself has acknowledged, may even resort to blackmail.


