IQ Times Media – Smart News for a Smarter You
AI

Sam Altman warns there’s no legal confidentiality when using ChatGPT as a therapist

By IQ TIMES MEDIA · July 25, 2025 · 3 Mins Read


ChatGPT users may want to think twice before turning to their AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn’t yet figured out how to protect user privacy when it comes to these more sensitive conversations, because there’s no doctor-patient confidentiality when your doc is an AI.

The exec made these comments on a recent episode of Theo Von’s podcast, This Past Weekend w/ Theo Von.

In response to a question about how AI works with today’s legal system, Altman said one of the problems of not yet having a legal or policy framework for AI is that there’s no legal confidentiality for users’ conversations.

“People talk about the most personal sh** in their lives to ChatGPT,” Altman said. “People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”

This could create a privacy concern for users in the case of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.

“I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago,” Altman said.

The company understands that this lack of privacy could be a blocker to broader user adoption. Beyond AI's demand for vast amounts of online data during training, the technology is also being asked to produce data from users' chats in some legal contexts. OpenAI is already fighting a court order in its lawsuit with The New York Times that would require it to save the chats of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.


In a statement on its website, OpenAI said it's appealing this order, which it called "an overreach." If courts can override OpenAI's own decisions around data privacy, the company could be exposed to further demands for legal discovery or law enforcement access. Tech companies are already regularly subpoenaed for user data to aid criminal prosecutions. But in recent years, concerns about digital data have grown as laws began limiting previously established freedoms, like a woman's right to choose.

When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health, which encrypted their records.

Altman asked the podcast host about his own ChatGPT usage as well, since Von said he didn't talk to the AI chatbot much due to his own privacy concerns.

“I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity,” Altman said.



