
Using ChatGPT for everyday tasks has become the norm for many people. Whether they use the AI app to build a weekly grocery list or treat it as a companion for endless advice, many have found it easy to confide in ChatGPT, even though the responses come from something far from human. So when OpenAI CEO Sam Altman said on a podcast that chat logs could one day be used against users, it came as a bit of a shock.
When the Global Eye News outlet shared as much on X (formerly Twitter), users were shaken. Some commented on the thread with screenshots showing them urging their own ChatGPT companions not to share any private information. Others asked how long chat logs are saved. OpenAI, which owns ChatGPT, apparently has nothing in place that prevents law enforcement from accessing certain chat logs, should it be deemed legally necessary.
ChatGPT logs don’t just go away.
According to PC Mag, Altman shared details about what ChatGPT can and cannot keep private when he appeared on the podcast This Past Weekend w/ Theo Von. While Altman agreed that it is “very messed up” that legal cases could be granted access to otherwise seemingly private chat logs, there is currently nothing stopping that from happening.
“If you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, like we could be required to produce that,” Altman said on the podcast. He added, “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.”
The hope, according to Altman, is that OpenAI can figure out a way to extend that same confidentiality to ChatGPT. There is no denying that some people use the AI buddy as a sort of therapist from time to time. Users ask ChatGPT far more intimate questions than they would ever ask Google, and for some, that is exactly the appeal.
ChatGPT users aren’t sure it’s a good thing that their information is stored.
On the Global Eye News thread on X, users raised questions about this alarming detail. One user commented, “So asking ChatGPT the best place to hide the body is a bad idea?” It was a joke, of course, but it led others to wonder how much data actually disappears when they manually delete their chats. The idea of that information sticking around is a little worrying.
“So you’re telling me I trauma dumped to an undercover cop?” another ChatGPT user commented on the thread.
Someone else more seriously wrote: “That’s actually terrifying. People probably tell ChatGPT way more personal stuff than they’d ever put in writing anywhere else, thinking it’s just a conversation with an AI. But if all that data can be subpoenaed and used against you in court? That’s a huge privacy nightmare.”