With the popularity of artificial intelligence (AI), more and more people are turning to tools like ChatGPT for help with everyday tasks. Some even use chatbots as a kind of psychologist. However, it is important to know what information you should never share with AIs.
First, the database behind the AI you are using is managed by a company, and security systems cannot always guarantee that this data will not leak.
Your personal information can be used by criminals for various purposes, from stealing your money to taking out loans in your name. With that in mind, CNN has gathered seven types of information you should never share with AIs. Check them out:
- Personal information
Your full name, date of birth, document numbers and other personal information should never be shared with AIs or with anonymous people on the internet. If your data leaks, malicious individuals can use it to commit fraud in your name.
Even phone numbers and email addresses should be kept out of conversations with AIs. Although these chatbots do not specifically store this information, your data can still be vulnerable to leaks.
- Logins and passwords
Do not use AIs to store logins and passwords for social networks or any other site. This information should be memorized or kept in a secure digital environment, such as a password manager. Also, never share passwords with people you do not trust.
- Work data
Depending on your line of business, company information may need to remain fully confidential. Therefore, never enter internal data or work matters into ChatGPT or any other AI; confidential content should stay between you and your team.
Data related to intellectual property should also be kept confidential, because sharing it with chatbots can cause serious problems. Improper disclosure may result in legal penalties such as fines or even prison.
In a well-known case from 2023, a Samsung employee accidentally leaked sensitive company data while using ChatGPT and was fired as a result.
- Bank information
Never send personal documents or any file containing sensitive information to an AI, such as bank details, credit card numbers, passwords and other confidential data. Most digital crimes involve the misuse of this kind of information, so it is essential to keep it out of criminals' hands.

- Thoughts and emotions
Nowadays, there are AIs developed to act as "friends" or even as virtual psychologists. However, experts warn that this is reckless. Avoid sharing very personal thoughts or emotions, because AIs do not understand human feelings and can respond inappropriately.
In a recent case, a teenager took his own life after interpreting an AI's advice as encouragement to commit suicide. If you are experiencing emotional difficulties, seek support from friends and family or contact the Centro de Valorização da Vida (CVV) at 188.
- Medical information
Just as ChatGPT is not a psychologist, no AI can replace a doctor. Avoid sending medical information to chatbots in an attempt to get answers about test results.
In addition, Brazil's General Data Protection Law (LGPD) protects the personal information of Brazilian citizens, including sensitive data such as medical exams. Attaching this type of document to a chatbot can therefore constitute a violation of the LGPD.
It is important to highlight that only an accredited professional can make a diagnosis. Although there are AIs developed specifically to assist with diagnoses, they should be operated by qualified doctors who know how to interpret the data correctly.
- Illegal or inappropriate activities
Of all the items on this list, this is the most obvious. ChatGPT was not designed to answer questions related to illegal activities, so it will not help with anything considered unethical or that poses a risk to other people's safety.
Also, if the content entered in the prompt is inappropriate, the AI may store it for review, which can result in reports against you.
This content was originally published as "ChatGPT: 7 pieces of information you should not share with AIs" on the CNN Brasil website.
Source: CNN Brasil

Charles Grill is a tech-savvy writer with over 3 years of experience in the field. He writes on a variety of technology-related topics and has a strong focus on the latest advancements in the industry. He is connected with several online news websites and is currently contributing to a technology-focused platform.