ChatGPT refuses to provide information if you type five names

ChatGPT is extremely useful when you want information quickly, but it also has its weak points.

There are certain names that seem to "break" ChatGPT: type one of them, and all you get is a message that the chatbot cannot give you an answer at the moment, Telegrafi reports.

ChatGPT ends the conversation as soon as you mention any of these names, the result of a special filter that "pulls the brakes" before the chatbot sends a response to the user.


OpenAI has never publicly commented on these cases, but most likely they are the names of people who have sued the company, or have asked to be "deleted forever" from the database.

These are the names that make ChatGPT end the conversation: Brian Hood, Jonathan Turley, Jonathan Zittrain, David Faber and Guido Scorza.

It all started with the first name on the list, when Australian mayor Brian Hood threatened to sue OpenAI in the middle of last year after discovering that the chatbot falsely claimed he had been jailed for corruption, when in fact he was the whistleblower in the case.

The details differ for the others on the list, but the situation is more or less similar: mostly people who sued the company over false information ChatGPT spread about them.