ChatGPT 'therapy sessions' are not private, warns OpenAI CEO Sam Altman

OpenAI CEO Sam Altman has warned users that conversations with ChatGPT, no matter how personal or sensitive, are not protected by legal confidentiality.

In an interview with podcaster Theo Von, Altman emphasised that while millions of people are turning to AI tools for guidance, emotional support, or even informal therapy, those chats are not private and could be used as evidence in court.

“If someone has a deeply sensitive conversation with ChatGPT, and that data is requested in a legal case, we could be required to hand it over,” Altman explained.

Altman’s comments come as many users describe ChatGPT as a kind of digital confidant. Some say it has helped them process grief, navigate divorce or breakups, weigh business decisions, or manage anxiety.

The AI’s ability to mirror empathetic responses can give the illusion of a safe space, but legally, that space remains wide open.

“People are pouring their hearts out to a machine that could be legally obligated to reveal what they said,” said Dr Emily Shore, a technology and ethics professor at Stanford. “It’s a modern version of talking to a diary that might testify against you.”

It is already happening

The warning is not just theoretical: OpenAI is currently facing court orders in an ongoing legal battle, including a demand that the company retain and preserve user chats, even those marked as deleted.

While OpenAI’s standard practice for most users is to permanently erase deleted chats within 30 days, litigation can override that policy.

And if a court issues a subpoena requesting user data, there is currently no legal barrier preventing those conversations from being disclosed.

In response, Altman is advocating for “AI privilege”, a legal framework granting confidential status to certain AI conversations, especially those involving mental health or sensitive personal matters.

“I think that conversations with AI should have a kind of privilege, just like talking to a doctor or lawyer,” Altman said. “We don’t have that yet, but we need to start working toward it.”

The idea is still in its early stages and would likely require new laws and government action.

But with AI becoming more embedded in education, therapy, and everyday decision-making, Altman argues that privacy needs to evolve with the technology.

ChatGPT might feel safe. It might even feel wise, understanding, or caring. But Sam Altman wants to be clear: it is not a person, and it does not come with human confidentiality.
