
OpenAI expresses concern about users developing feelings for its chatbot

OpenAI's early tests show some concerning results (Image source: OpenAI)
The recently introduced GPT-4o model for ChatGPT has been praised for its human-like interactions. However, that same quality has OpenAI concerned that users may form emotional connections with the chatbot. The company says it will now monitor for this behavior and adjust the model to discourage it.

GPT-4o was introduced as a vastly improved model for ChatGPT, and since its debut people have praised it for its human-like interactions. While this sounds great, OpenAI has noticed a problem: some users are starting to treat the chatbot like a human being and form emotional bonds with it.

OpenAI has observed people using language that "might indicate forming connections," including instances where users expressed "shared bonds" with the model. The company describes this as a concern for two main reasons.

First, when ChatGPT-4o appears human-like, users may be more inclined to trust its hallucinations. For context, an AI hallucination is incorrect or misleading output generated by the model, often the result of flawed or insufficient training data.

Second, human-like interactions with the chatbot could reduce users' real social interactions. OpenAI says chatbot conversations could potentially benefit "lonely individuals," but they could also affect healthy relationships. The company further notes that people may even begin to carry chatbot conversational habits into conversations with other humans.

That would be a problem, since GPT-4o is designed to stop talking whenever the user talks over it, a norm that does not translate well to human conversation. Given these concerns, the company says it will now monitor how users develop emotional bonds with ChatGPT-4o and make adjustments to the model where necessary.



Abid Ahsan Shanto, 2024-08-11 (Update: 2024-08-15)