ChatGPT’s refusal to recognize “David Mayer” was due to a mistake, says OpenAI

Last weekend the name was everywhere on the internet – except on ChatGPT.

David Mayer became momentarily famous on social media because the popular chatbot apparently wanted nothing to do with him.

Legions of chatbot wranglers spent days trying – and failing – to get ChatGPT to write the words “David Mayer.” The chatbot refused to comply, with responses ranging from “something seems to have gone wrong” to “I can’t answer”, or simply cutting off at “David”.

This sparked a flurry of online speculation about Mayer’s identity. It also led to theories that David Mayer, whoever he is, had asked to have his name removed from ChatGPT’s output.

ChatGPT developer OpenAI has now provided some clarity on the situation, explaining that the Mayer issue was due to a system error. “One of our tools incorrectly flagged this name and prevented it from appearing in replies, which should not have happened. We are working on a fix,” said an OpenAI spokesperson.
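OpenAI has not described how the flagging tool works, but a hard block on a specific string in model output would produce exactly the behaviour users reported: replies that cut off mid-sentence or collapse into a generic error. The sketch below is purely illustrative and assumes a hypothetical deny-list check applied to streamed output; the blocked name, function names and error message are assumptions for demonstration, not anything confirmed by OpenAI.

```python
# Purely illustrative sketch: a hypothetical post-generation deny-list filter.
# Nothing here reflects OpenAI's actual implementation; the blocked name and
# the streaming interface are assumptions for demonstration only.

BLOCKED_NAMES = {"David Mayer"}  # hypothetical hard-coded deny list


def stream_with_filter(token_stream):
    """Yield tokens, but abort with an error once a blocked name appears."""
    emitted = ""
    for token in token_stream:
        emitted += token
        if any(name in emitted for name in BLOCKED_NAMES):
            # The reply is cut off and replaced with a generic error,
            # mirroring the "something seems to have gone wrong" behaviour.
            raise RuntimeError("Something seems to have gone wrong.")
        yield token


# Example: the reply stops right after "David", because the full name
# only matches once the following token arrives.
if __name__ == "__main__":
    tokens = ["The ", "name ", "is ", "David", " Mayer", "."]
    try:
        for t in stream_with_filter(tokens):
            print(t, end="", flush=True)
    except RuntimeError as err:
        print(f"\n[{err}]")
```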

Some on social media speculated that the man at the center of the affair was David Mayer de Rothschild, but he told the Guardian it had nothing to do with him, citing the conspiracy theories that have been swirling online around his family’s name.

“No, I did not ask for my name to be removed. I have never had contact with Chat GPT. Unfortunately everything is driven by conspiracy theories,” he told the Guardian.

It is also understood the error had nothing to do with the late academic Prof David Mayer, who was apparently placed on a US security list because his name matched the pseudonym of a Chechen militant, Akhmed Chatayev.

However, the answer may instead lie in the GDPR data protection regulations in force in the UK and EU. OpenAI’s European privacy policy makes it clear that users can ask for their personal data to be deleted from its products, a process also known as the “right to be forgotten”, under which someone can have personal information about them removed from the internet.

OpenAI declined to comment on whether the “Mayer” error stemmed from a right-to-be-forgotten request.

OpenAI has since fixed the “David Mayer” issue and ChatGPT now responds to queries containing that name, although other names that circulated on social media over the weekend still trigger a “Something seems to have gone wrong” response when typed into the chatbot.

Helena Brown, partner and data protection specialist at law firm Addleshaw Goddard, said “right to be forgotten” requests would apply to any organization or individual that processes that person’s data – from the AI tool itself to any organization that uses that tool.

“In the context of the David Mayer question, it is interesting that an entire name can seemingly be removed from the AI tool as a whole,” she said.

However, Brown added that completely removing all information that could identify a specific person is likely to be far more difficult for AI tools.

“The sheer volume of data generated by GenAI and the complexity of the tools present a privacy compliance issue,” she said, adding that deleting all information about a single person is not as simple as removing their name.

“Large amounts of personal data, including from public sources such as the internet, are collected to develop AI models and their outputs. This means it may be virtually impossible to track down and delete all the personal information that could be used to identify a single person.”
