5 Things People Regularly Share With ChatGPT That Unknowingly Jeopardize Their Job & Safety, According To An Expert

We are entirely too trusting of AI bots, and it's putting us at direct risk.

Written on Jun 17, 2025


We're all relying more and more on AI tools like ChatGPT to help us get things done, even if we don't necessarily realize it (Siri and Alexa, anyone?). But while they're a wonder at helping us streamline tasks, a security expert says we're not thinking critically enough about what exactly we're sharing with these tools, and especially about what they might be able to do with our information down the road.


5 things people regularly share with ChatGPT that put their jobs and safety at risk:

We tend to think that using tools like Claude or ChatGPT as a sort of digital assistant is fairly innocuous (aside from the fact that chatbots are literally sending people down conspiratorial rabbit holes and even into full-blown psychosis, that is). But experts at digital security firm Indusface say we're being far too presumptuous about the privacy of what we're entering into those chat prompts.

And the statistics on the matter bear this out. Studies and analyses have found that 38% of regular users have admitted to sharing sensitive work data with AI tools without their employer's permission, including things like customer information and sensitive legal or financial data.


Not only are AI tools not, you know, humans you can ask to keep a secret, but they're also hackable. And accordingly, data breaches of this kind of information reportedly increased by 60.4% just between February and April of 2023, which coincides with a period of exponential user growth of ChatGPT following its November 30, 2022, launch.

Hanging in the balance are not just sensitive details from our jobs, but the personal details that are often included in that data along with them. Many of us find it hard to care about data privacy anymore, since it's basically non-existent, but these breaches also put your job at risk if you're found to be at fault for causing one.

Here are the five things Indusface says you should never share with AI tools at work.


RELATED: People Are Developing Delusions & 'Psychosis' From Using ChatGPT As A Spiritual Guide

1. Work files, such as reports and presentations

Studies have found that as many as 80% of Fortune 500 employees use tools like ChatGPT to help with emails, reports, and presentations, all of which are frequently riddled with sensitive data about both the company and the employee, and often with material that is strictly confidential.

So, security experts recommend removing anything sensitive before inputting these kinds of files into tools like ChatGPT, as large language models, or LLMs, can retain what users submit and may surface that information in responses to other users.

2. Passwords and access credentials

We've all had it drilled into our heads for decades now: never, ever share your password for anything with anyone. But people regularly hand passwords over to LLMs while asking for help with tasks, and AI features are now embedded in many password management tools.


"It’s important to remember that [LLMs] are not designed with confidentiality in Mind," Indusface cautioned. "Rather, the purpose is to learn from what users input, the questions they ask, and the information they provide." Proceed with caution.

RELATED: Wife Ends 12-Year Marriage Because ChatGPT Analyzed Her Husband’s Coffee Grounds & Told Her He Cheated

3. Personal details like your full name and address


Security experts say that while sharing this kind of info is second nature if you're using AI tools as a sort of assistant, doing so throws the door wide open to fraud. That warning covers photos of you, too, including ones with other people in them.

Indusface said that if a piece of information could be used by fraudsters to impersonate you or to create deepfakes of you or your associates, it needs to stay off of ChatGPT. Scams like these could ruin not just your finances but your reputation, and reputational damage can spill over to your employer, which Indusface says puts both your employer and you personally in danger of legal action.

4. Financial information

While LLMs like ChatGPT can be very useful for breaking down complex financial topics, security experts say they should never be used for financial analysis or decision-making. This is because, apart from the security concerns, LLMs are designed primarily to process language, not numbers, so their numerical accuracy can be lacking.

Lean on them for financial analysis and you're likely to get inaccurate answers, leading to potentially catastrophic mistakes that could cost you your job.


5. Company code or other intellectual property

Tech companies are notorious for writing incredibly exploitative intellectual property clauses right into their terms of service. Every word of that novel or love note you're drafting in an online word processor is open season for training LLMs, for example.

And in a business environment, this means any sensitive company secrets or IP in the information you share with AI tools are ripe for the picking, too. Coding help is an increasingly popular use of ChatGPT, for example, but Indusface said this means that code is in danger of being "stored, processed, or even used to train future AI models, potentially exposing trade secrets to external entities." Once again, this could potentially cost you your job.

The bottom line is that while these tools feel like tech miracles, they need to be used with abundant precautions, because their creators' entire business models depend on you not taking any.


RELATED: Burned-Out Employee Packs Up & Leaves Work Because ChatGPT Told Her To — ‘For Some Reason It Made Everything Click’

John Sundholm is a writer, editor, and video personality with 20 years of experience in media and entertainment. He covers culture, mental health, and human interest topics.
