GetAIPolicy Team · 2026-03-14 · 4 min read

What Employees Should NEVER Put Into ChatGPT: A Simple Guide

Your employee just pasted your entire client list into ChatGPT. Another one uploaded a confidential contract to get a summary. A third copied and pasted the company's Q4 financial results to "help with a presentation."

This is happening in your company right now. And without clear guidelines, every employee is making their own judgment call about what's safe to share with AI tools.

Here's a simple guide you can share with your team today.

The Red/Yellow/Green System

Think of data like traffic lights. Every piece of information in your company falls into one of three zones.

🔴 RED ZONE: NEVER Enter Into Any AI Tool

This data should never, under any circumstances, be entered into ChatGPT, Gemini, Copilot, Claude, or any other AI tool:

  • Customer PII: Names, email addresses, phone numbers, physical addresses, Social Security numbers, dates of birth
  • Passwords and credentials: API keys, access tokens, database connection strings, encryption keys
  • Financial data: Credit card numbers, bank account details, specific revenue figures, unreleased financial results
  • Health/medical data: Patient records, diagnoses, treatment plans, prescription information (HIPAA-protected)
  • Legal documents: Active contracts, litigation details, attorney-client communications, settlement terms
  • Source code with secrets: Code containing hardcoded API keys, internal URLs, or proprietary algorithms
  • Employee personal data: Salary information, performance reviews with names, disciplinary records

Why? Once data enters an AI tool, you lose control over it. Most AI providers store your inputs (typically for 30 days), may use them for model training (unless you've opted out), and could be compelled to disclose them in legal proceedings.
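As a first line of defense, some teams run a lightweight pre-paste check for the most obvious red-zone patterns. The sketch below is illustrative only: the category names and regexes are our own assumptions, they will miss many red-zone items (names, diagnoses, deal terms), and a real deployment would use a dedicated DLP tool rather than hand-rolled patterns.

```python
import re

# Illustrative red-zone patterns only -- NOT an exhaustive filter.
RED_ZONE_PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "API key-like token": re.compile(r"\b(?:sk|pk|api|key)[-_][A-Za-z0-9_]{16,}\b"),
}

def scan_for_red_zone(text: str) -> list[str]:
    """Return the red-zone categories detected in `text` (empty = no hits)."""
    return [label for label, pattern in RED_ZONE_PATTERNS.items()
            if pattern.search(text)]

# Hypothetical prompt an employee was about to paste:
hits = scan_for_red_zone(
    "Contact jane.doe@clientco.com, key sk-live_a1b2c3d4e5f6g7h8"
)
```

A non-empty result means stop and escalate; an empty result does not mean the text is safe, only that none of these crude patterns fired.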

🟡 YELLOW ZONE: Only With Approval and Anonymization

This data can be used with AI tools, but only after getting manager/IT approval and removing identifying details:

  • Internal strategy documents: Remove client names, specific dates, and financial figures before summarizing
  • Draft contracts and proposals: Anonymize all parties and specific terms
  • Anonymized customer feedback: Aggregate trends are fine; specific customer quotes with attribution are not
  • Salary ranges and compensation data: General ranges are okay; specific employee compensation is not
  • Internal communications: Remove names and identifying context

The rule: If you can't tell which specific person, client, or deal it refers to, it's probably safe.
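To make "anonymize first" concrete, here is a minimal sketch. It assumes a hypothetical, pre-listed set of client names (a real document will contain names you didn't anticipate), so treat it as a starting point that still needs human review before anything is pasted into an AI tool.

```python
import re

# Hypothetical examples -- in practice this list comes from your CRM,
# and a human still reviews the output before it leaves the company.
CLIENT_NAMES = ["Acme Corp", "Jane Doe"]

def anonymize(text: str) -> str:
    """Replace known client names and dollar figures with placeholders."""
    for i, name in enumerate(CLIENT_NAMES, start=1):
        text = text.replace(name, f"[Client {i}]")
    # Strip dollar figures so specific deal terms don't leak.
    text = re.sub(r"\$[\d,]+(?:\.\d+)?[KMB]?", "[amount]", text)
    return text

memo = "Acme Corp renewed at $120,000; Jane Doe approved the discount."
```

After this pass, the memo still conveys the situation ("a client renewed at some amount") without identifying the client or the deal size, which is the yellow-zone test above.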

🟢 GREEN ZONE: Safe to Use

This data is generally safe to enter into AI tools:

  • Publicly available information: Anything already on your website, in press releases, or in public filings
  • General knowledge questions: "What are best practices for onboarding?" or "Summarize the key points of GDPR"
  • Brainstorming and ideation: "Give me 10 headline ideas for a blog post about remote work"
  • Code assistance with non-proprietary code: Generic functions, open-source library usage, standard algorithms
  • Writing assistance: Grammar checking, tone adjustments, and rewording of non-sensitive text
  • Research summaries: Summarizing publicly available reports, articles, and studies

The 5-Question Decision Tree

Before entering anything into an AI tool, ask yourself:

1. Does this contain anyone's personal information? → Yes? RED ZONE. Stop.
2. Would this be a problem if a competitor saw it? → Yes? RED ZONE. Stop.
3. Does this contain non-public financial data? → Yes? RED ZONE. Stop.
4. Is this internal but non-sensitive? → Yes? YELLOW ZONE. Anonymize first, get approval.
5. Is this already public or generic knowledge? → Yes? GREEN ZONE. Proceed.
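For readers who think in code, the five questions above collapse into straight-line logic. The flag names are illustrative; each one stands for an honest yes/no answer to the corresponding question in the checklist.

```python
def classify(contains_pii: bool,
             competitor_sensitive: bool,
             nonpublic_financials: bool,
             internal: bool) -> str:
    """The 5-question decision tree, evaluated in order: any red-zone
    answer wins, then the yellow-zone check, otherwise green."""
    if contains_pii or competitor_sensitive or nonpublic_financials:
        return "RED: stop"
    if internal:
        return "YELLOW: anonymize first, get approval"
    return "GREEN: proceed"
```

Note the ordering matters: a memo that is both internal and contains PII is red, not yellow, because the red-zone questions are asked first.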

Real Scenarios Your Team Will Face

| Scenario | Zone | Why |
| --- | --- | --- |
| "Summarize this client's feedback email" | 🔴 Red | Contains client name and specific details |
| "Help me debug this API endpoint" (with hardcoded keys) | 🔴 Red | Contains credentials |
| "Rewrite this internal memo about Q2 goals" | 🟡 Yellow | Internal strategy; anonymize first |
| "Give me 5 subject lines for our newsletter" | 🟢 Green | Generic creative task |
| "Check my code for syntax errors" (generic code) | 🟢 Green | No proprietary logic or secrets |
| "Summarize the main points of GDPR Article 17" | 🟢 Green | Public legal text |

The Solution: A Data Classification Guide

Every employee in your company should have a clear, one-page data classification guide that tells them exactly what falls into each zone, customized to your specific industry and data types.

Healthcare companies need to explicitly call out HIPAA-protected data. Financial services firms need to address SOX and PCI-DSS. Marketing agencies need client confidentiality rules. Tech companies need source code and IP guidelines.

A generic "don't share sensitive data" policy isn't enough. Your team needs specific, actionable rules they can reference in 10 seconds.

GetAIPolicy generates a customized Data Classification Quick Reference Guide tailored to your industry and the specific data types your company handles. It's one of five documents in our Pro and Team governance packs. Get a customized data classification guide for your company →

This article is for informational purposes only and does not constitute legal advice.

Ready to create your AI policy?

Generate a customized AI governance policy for your business in 10 minutes.

Generate My AI Policy →