Chatbots may seem like reliable intelligent assistants, but experts are warning against getting too personal with AI-powered agents.
Recent survey data from the Cleveland Clinic shows that one in five Americans has sought health advice from AI, while survey statistics released last year by Tebra found that roughly 25% of Americans are more likely to consult a chatbot than attend therapy sessions.
Experts, however, are warning users against oversharing with AI chatbots, especially when it comes to medical information.
According to USA Today, people should avoid disclosing medical and health data to AI chatbots, which are not covered by the Health Insurance Portability and Accountability Act (HIPAA).
Since chatbots like ChatGPT are not HIPAA compliant, they should not be used in a clinical setting to summarize patient notes, nor should they have access to sensitive data.
That being said, if you’re looking for a quick response, be sure to remove your name or other identifying information that could potentially be exploited, USA Today reported.
The outlet also warned that explicit content, requests for illegal advice, and uploading information about other people are off limits.
“Remember: anything you type into a chatbot can be used against you,” Stan Kaminsky, from cyber security company Kaspersky, previously told The Sun.
Login credentials, financial information, answers to security questions and your name, number and address should also never be shared with AI chatbots. This sensitive data can be used against you by malicious actors.
“No passwords, passport or bank card numbers, addresses, phone numbers, names or other personal data belonging to you, your company or your customers should end up in conversations with an AI,” Kaminsky continued.
“You can replace these with an asterisk or ‘REDACTED’ in your request.”
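Kaminsky's redaction advice can be automated before a prompt ever leaves your machine. The sketch below is a minimal illustration, not a vetted PII scrubber: the `redact` helper and its regex patterns are assumptions for demonstration, and real personal data takes many more forms than these three.

```python
import re

# Illustrative patterns only -- a real scrubber would need far broader coverage.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(text: str) -> str:
    """Replace each matched pattern with a labeled REDACTED placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

prompt = "My email is jane.doe@example.com and my phone is 555-123-4567."
print(redact(prompt))
# → My email is [REDACTED-EMAIL] and my phone is [REDACTED-PHONE].
```

Running the redacted prompt through the chatbot preserves the question while keeping the identifying details out of the conversation log, in line with the advice above.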
Confidential information about your company is also a major privacy risk.
“There can be a strong temptation to upload a working paper to get, say, an executive summary,” Kaminsky said.
“However, by carelessly uploading a multi-page document, you risk revealing confidential data, intellectual property, or a trade secret, such as the release date of a new product or the entire team’s payroll.”
Image Source: nypost.com