Sam Altman, CEO of OpenAI, the company behind ChatGPT, has issued a warning to users: be cautious with the information you share with the AI chatbot. The reason? According to Altman, conversations with ChatGPT could be used as legal evidence, since no regulation currently protects the content of these interactions, even those that feel like private or “therapeutic” sessions.
Conversations with AI Are Not Protected Like Therapy Sessions
In a recent interview on the podcast This Past Weekend with Theo Von, Altman explained that, unlike discussions with a human therapist or psychologist, conversations with ChatGPT are not legally confidential. This means that anything users disclose to the chatbot could be requested and presented as evidence in legal proceedings.
“People talk about the most personal things in their lives with ChatGPT,” Altman said. “Young people especially use it as a life coach or therapist, but these conversations are not protected by any confidentiality laws — unlike those with a doctor, lawyer, or psychologist.”
A Legal and Ethical Grey Zone
Altman acknowledged that artificial intelligence tools like ChatGPT currently operate in a “grey area” when it comes to regulation and data privacy. While OpenAI maintains its own security protocols, no specific law prevents user data from being subpoenaed by a court.
“If you speak to ChatGPT about your most sensitive matters and then there’s a lawsuit or something similar, we could be forced to hand over that information. And I think that’s a disaster,” Altman remarked.
This admission raises serious questions about the privacy and legal standing of interactions with AI systems. Despite their conversational style and seemingly confidential tone, these digital tools are not bound by professional secrecy the way human therapists or attorneys are.
Urgency for Legal Reform
Altman also stressed the urgency for new legislation to address these emerging issues. “Nobody had to think about this even a year ago. And now, I believe it’s a major problem. How are we going to deal with laws around this?” he asked.
His statements underscore a critical point: while AI technology advances rapidly and becomes more integrated into daily life, the legal framework around it is struggling to catch up. As more people turn to ChatGPT and similar tools for guidance, emotional support, or simply to vent, the lack of protections leaves them exposed.
Sam Altman’s comments serve as a wake-up call for users and policymakers alike. While ChatGPT may feel like a safe and private space to open up, the truth is more complicated. Until specific regulations are enacted, users should treat their conversations with AI as public, not private. In an era where data is power, discretion remains a user’s best defense.