Speaking on the This Past Weekend podcast (episode no. 599, July 30, 2025), Sam Altman, CEO of OpenAI, recently expressed his concerns about the potential impact of future AI models. During the conversation with host Theo Von, Altman spoke openly about the risks he believes could accompany the next versions of ChatGPT. He stressed that the capabilities of AI systems are evolving rapidly, and with them the risk of misuse.
"The thing I lose the most sleep over is the misinformation problems with future models. Not this current one, but the one after this and the one after that." (Altman on the This Past Weekend podcast)
ChatGPT 5.0 could take persuasion to a new level
According to Altman, the biggest concern isn't ChatGPT 4.0, but rather the next generation, which is likely to be more powerful, more persuasive, and potentially more manipulative. The OpenAI CEO is particularly worried about the potential for political or social influence through deceptively realistic AI-generated content.
"They’re going to be so good at persuasion, so good at deception, so good at... you know, just like, being able to kind of... manipulate people, if you want them to." (Altman on the This Past Weekend podcast)
Altman warns that development is progressing so rapidly that the boundary between truth and fiction is becoming increasingly blurred. He goes on to warn of a world in which "we no longer know what is real and what is not".
Civic responsibility and political support required
Altman sees the societal debate on generative AI as a key task for the coming years. Companies like OpenAI must develop technical safeguards, he argues, but without a clear legal framework and public debate, misuse is difficult to combat effectively.
Altman's comments come against the backdrop of a growing discussion about so-called "frontier models": high-performance AI systems with far-reaching social impact. OpenAI is already working internally on GPT-5, although a release date remains a matter of speculation at this point.