Jun 17, Kathmandu - With the rapid advance of technology, the use of artificial intelligence (AI) tools is growing by the day. Among them, ChatGPT, the world's most popular chatbot, has begun to shape how we live and interact in new ways. But experts warn that interacting with it can sometimes put our privacy at risk.
ChatGPT: AI but not completely secure
ChatGPT, developed by OpenAI, is a type of generative language model. It automatically generates answers to user questions. It can provide useful answers on a variety of topics, including general knowledge, articles, stories, songs, plans, legal advice, health tips, etc. However, it is not a real person, nor a certified doctor or lawyer. Therefore, it is always necessary to be cautious when interacting with it.
What information should not be shared with ChatGPT?
1. Personal details: Never enter confidential details like your name, address, mobile number, passport number, citizenship and other identity card details, email address, or date of birth into a chatbot. Such private information can fall into the wrong hands and be used to misuse a person's identity.
2. Passwords or login details: Details like bank account passwords, email or Facebook logins, and two-factor authentication codes entered into AI systems may be retained on the provider's servers, which could be disastrous from a cybersecurity perspective.
3. Official or commercially confidential information: If you work for a private organization, bank, or government agency, do not share documents such as company strategy, customer data, confidential reports, or future project plans with AI chatbots. Doing so risks commercial espionage or leaking the company's information.
4. Sensitive health information: AI can provide general health tips, but you should not share medical reports, psychological conditions, medication history, or detailed descriptions of mental health problems with ChatGPT and rely on the advice it gives. Seek the advice of a real doctor or psychiatrist on such matters.
5. Legal disputes or sensitive matters: ChatGPT can provide general information about the legal process, but sharing sensitive material such as case details, court letters, legal advice, or case files with it may jeopardize your legal privacy and position.
6. Future plans and strategies: Personal, family or business plans (such as new company concepts, innovative projects and investment strategies, etc.) should also be kept confidential. Leaking such ideas on a public platform can lead to a loss of competitive advantage.
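One practical way to follow the advice above is to strip personal details from text before pasting it into a chatbot. The sketch below is a hypothetical illustration in Python: the patterns for emails and 10-digit phone numbers are simplified examples, not an exhaustive list of what counts as private information.

```python
import re

# Hypothetical helper: mask common personal details before pasting text
# into a chatbot. The patterns are illustrative, not exhaustive.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w-]+"),
    "PHONE": re.compile(r"\b\d{10}\b"),  # e.g. a 10-digit mobile number
}

def redact(text: str) -> str:
    """Replace each match with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

message = "Contact me at ram@example.com or 9841234567 about the report."
print(redact(message))
# -> Contact me at [EMAIL] or [PHONE] about the report.
```

A real redaction tool would cover many more identifiers (addresses, ID numbers, dates of birth), but even a simple filter like this reduces what a chatbot ever sees.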
Things to pay special attention to when using the Internet
- Avoid using ChatGPT over public Wi-Fi, whether on a smartwatch, phone, or laptop.
- Always enable two-factor authentication.
- Parents should exercise necessary supervision when letting children use AI.
- If you receive a suspicious response from a chatbot, confirm it with a reliable source.
In fact, artificial intelligence, or AI, is a powerful technology, but its use should be limited and responsible. It can write articles, make calculations, and offer suggestions, yet the important decisions remain a matter of the user's own discretion. ChatGPT is an assistant, not an advisor. Therefore, you should never share confidential or sensitive information with it.