You might use ChatGPT for anything from drafting emails to asking for life advice. But be aware that those personal chats lack legal protection.
OpenAI's CEO, Sam Altman, has confirmed that the company could be required to hand over your conversations in a lawsuit.
Speaking on the This Past Weekend podcast, Altman explained the precarious legal ground on which AI conversations stand.
"People talk about the most personal sh** in their lives to ChatGPT,” he said. He noted that many users, especially younger people, turn to the AI as a "therapist, a life coach," asking for guidance on relationship problems and other personal matters.
The issue, Altman points out, is the absence of legal privilege. When you talk to a doctor, lawyer, or therapist, those conversations are protected by confidentiality laws.
That same protection does not currently apply to your interactions with an AI. “If you talk to ChatGPT about your most sensitive stuff, and then there is a lawsuit, we could be required to produce that,” Altman stated.
This creates what Altman calls a “huge problem.” You may think your chats are private, but they can be accessed through legal requests.
“I think that’s very screwed up,” Altman admitted. “I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago.”
The rapid adoption of AI for sensitive topics like financial and mental health advice makes this a pressing issue.
Altman says he has discussed the problem with policymakers who agree that the legal gap needs to be closed, but as of now, no laws are in place to fix it.
The lack of legal clarity is already affecting how people use AI. Podcast host Theo Von told Altman he hesitates to use ChatGPT extensively due to these privacy concerns. Altman acknowledged this is a reasonable stance: “I think it makes sense… to really want the privacy clarity before you use [ChatGPT] a lot — like the legal clarity.”
Altman also warned that the growth of AI could lead to increased government surveillance. Governments might push for greater access to AI data to monitor for criminal activities like terrorism or fraud.
While acknowledging the need for some compromise on privacy for public safety, Altman expressed his nervousness about potential overreach.
“History is that the government takes that way too far, and I’m really nervous about that,” he said. He believes any compromise must be carefully balanced with user rights.
Until the law catches up with technology, the key takeaway is to be mindful of what you share. There is currently no legal guarantee that your private chats with an AI will remain private.