
OpenAI fears humans will become ’emotionally reliant’ on ChatGPT’s human voice

OpenAI, the maker of ChatGPT, has revealed concerns that users may develop an emotional dependency on the chatbot’s forthcoming voice mode.

The ChatGPT-4o voice mode is currently being analysed for safety ahead of its rollout to the wider community. It enables users, to a certain extent, to converse naturally with the assistant as if it were a real person.

With that comes the risk of emotional reliance and “increasingly miscalibrated trust” in an AI model, both of which would be exacerbated by interactions with an uncannily human-like voice that can also pick up on the user’s emotions through their tone.

The findings of the safety review (via Wired), published this week, expressed concerns about language that reflected a sense of shared bonds between the human and the AI.

“While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time,” the review reads. It also says dependence on the AI might affect users’ relationships with other humans.

“Human-like socialization with an AI model may produce externalities impacting human-to-human interactions. For instance, users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships. Extended interaction with the model might influence social norms. For example, our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the document adds.

Furthermore, the review pointed out the possibility of over-reliance and dependence.

“The ability to complete tasks for the user, while also storing and ‘remembering’ key details and using those in the conversation, creates both a compelling product experience and the potential for over-reliance and dependence.”

The team said there’ll be further study on the potential for emotional reliance on the voice-based version of ChatGPT. The feature drew mainstream attention earlier this summer due to the voice’s startling resemblance to the actress Scarlett Johansson. The Hollywood star, who played an AI that its user falls in love with in the film Her, had refused the offer to voice OpenAI’s assistant.

However, the end result sounded suspiciously like her anyway, despite CEO Sam Altman’s insistence that the voice wasn’t cloned.

Chris Smith is a freelance technology journalist for a host of UK tech publications, including Trusted Reviews. He’s based in South Florida, USA.


Why trust our journalism?

Founded in 2003, Trusted Reviews exists to give our readers thorough, unbiased and independent advice on what to buy.

Today, we have millions of users a month from around the world, and assess more than 1,000 products a year.

Editorial independence means being able to give an unbiased verdict about a product or company, with the avoidance of conflicts of interest. To ensure this is possible, every member of the editorial staff follows a clear code of conduct.

We also expect our journalists to follow clear ethical standards in their work. Our staff members must strive for honesty and accuracy in everything they do. We follow the IPSO Editors’ code of practice to underpin these standards.