
Do you use ChatGPT? Seven things you should never reveal to it

Large language models and other artificial intelligence tools have become an almost indispensable part of everyday life.

Even if you don't actively use them every day, you're most likely using some service that relies on an artificial intelligence model such as ChatGPT, Gemini, or Perplexity in the background, reports Telegraph.

However, while many rely on these services to help with their daily business (and private) obligations, few are aware that any use of them carries a certain security risk.

That is, in their desire to get a job done as quickly as possible, people often reveal to chatbots and other artificial intelligence models sensitive, private information that they would never reveal to another person. But chatbots aren't people, right?

In fact, in this context, you should treat them as if they were: just as you wouldn't reveal information about yourself or the company you work for to just anyone on the street, you shouldn't casually feed it to an AI chatbot.

Although the companies that develop these models usually do not use your conversations to further train them (DeepSeek, for example, does train its model this way), that does not mean the data is not kept on record in your user account.

All it takes is for someone to sit down at your computer, open your favorite AI chatbot, and read your chat history, along with any sensitive data it contains.

But what sensitive information should you not share?

Personal data

No matter how much you trust AI models, for your own safety you should never share your full name, address, social security number, ID card number, passport number, or similar identifying details in a conversation with an AI chatbot. The same goes for photos of such identification documents.

If you're using a chatbot to create a resume, ask it to design a template you can fill out on your own computer, rather than typing your personal information into the model.

Financial information

Just as you are constantly warned against sharing financial information such as credit or debit card numbers, bank account details, or online and mobile banking passwords, the same rule applies to sharing this data with artificial intelligence models.

Perhaps you'd like an AI model's advice on which loan would suit you best or how to save on taxes, but in that case, describe an imaginary scenario rather than providing your personal data.

Passwords

Sharing passwords to access user accounts or accounts within your company is extremely irresponsible. Just as you wouldn't give your apartment keys and address to a stranger on the street, don't reveal your passwords and usernames to an artificial intelligence model.

Sensitive or confidential information

You shouldn't treat an AI model like a confidant or best friend and reveal sensitive or secret information to it. Private details, like who's cheating on whom or what a coworker did at work, are not information an AI model should know. The same goes for private information about someone's health or family members.

Information about the company you work for or collaborate with

It may be tempting to feed ChatGPT company information and ask it to draft a contract between two companies, but that's not a smart move. The same goes for any document containing confidential information (prototypes, protected patents, etc.), as well as details of meetings, business plans, and company finances. This is precisely why some companies block access to ChatGPT and other artificial intelligence models on work computers.

Illegal or suspicious content

Even asking ChatGPT in jest for information on “how to get rid of someone forever” could get you into trouble. Not only will ChatGPT refuse to answer such a question, but some artificial intelligence models flag suspicious queries like this and report them to the police. So if you're tempted to ask for that kind of advice, it's best to hold back.

Medical information

Have you heard of the term “Dr. Google”? Well, many users have replaced Google with ChatGPT and treat it like a personal doctor. There's nothing wrong with asking ChatGPT to explain a medical finding, but leave specific identifying information out of your question and keep it general. For example: “recommend exercises for a man in his 30s who has sciatica.” /Telegraph/