Is OpenAI HIPAA Compliant? What Healthcare Teams Need to Know
Business Associate Agreements
OpenAI supports HIPAA regulatory compliance through Business Associate Agreements (BAAs) for eligible offerings. With a signed BAA and the right technical controls, you can use OpenAI to process Protected Health Information (PHI) in compliance with HIPAA. Without a BAA, you must not send PHI to OpenAI services.
What a BAA with OpenAI covers
- API usage on Zero Data Retention (ZDR)–eligible endpoints only; endpoints that persist application state are out of scope.
- ChatGPT for Healthcare (and certain sales-managed enterprise offerings) can include a BAA; ChatGPT Business is not eligible.
- The BAA governs OpenAI’s role as a business associate and the data processing safeguards it provides; you remain responsible for end-to-end HIPAA compliance.
How to request a BAA
- Contact OpenAI (e.g., via your account team or the designated BAA intake) with your use case, data flows, and security posture.
- Be prepared to detail which endpoints you will use, whether any third-party tools are involved, and how you’ll enforce PHI security controls.
- Approvals are case‑by‑case; healthcare teams whose workflows are confined to ZDR‑eligible endpoints and appropriate safeguards are typically approved.
Zero Data Retention API Endpoints
Zero Data Retention excludes customer content from abuse-monitoring logs and enforces behavior like treating the store parameter as false, helping ensure PHI is not retained by the platform. ZDR is enabled at the organization or project level after OpenAI approval and applies only to supported capabilities.
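As a minimal sketch of the "store as false" behavior, the request body below explicitly disables server-side storage. The field names mirror OpenAI's public Chat Completions API, but treat the exact payload shape as an assumption to verify against current OpenAI documentation:

```python
# Sketch: build a Chat Completions request body that explicitly disables
# response storage. Field names mirror OpenAI's public API; verify against
# current documentation before relying on them.
import json

def build_chat_request(messages, model="gpt-4o"):
    """Return a request body with server-side storage explicitly disabled."""
    return {
        "model": model,
        "messages": messages,
        "store": False,  # never persist prompts/responses server-side
    }

body = build_chat_request([{"role": "user", "content": "Summarize this note."}])
payload = json.dumps(body)  # what would be POSTed to /v1/chat/completions
```

Even with ZDR enforcing this behavior platform-side, setting it explicitly in your client keeps the intent auditable in your own codebase.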
ZDR‑eligible endpoints (examples)
- /v1/chat/completions and /v1/responses (with limitations noted below)
- /v1/embeddings and /v1/moderations
- Audio endpoints: /v1/audio/transcriptions, /v1/audio/translations, /v1/audio/speech
- Image generation on supported models (for example, current GPT‑image models), with noted constraints
- /v1/realtime and legacy /v1/completions
Endpoints not eligible for ZDR (avoid for PHI)
- Assistants and conversation stores: /v1/assistants, /v1/threads, /v1/conversations, and related “items” or “vector_stores” objects
- File/object stores and long-lived resources: /v1/files, /v1/batches, and /v1/videos
Important limitations and caveats
- Background mode and extended prompt caching can store application state and are not compatible with strict ZDR.
- Some features of ZDR‑eligible endpoints may temporarily store state (for example, certain audio output flows); design accordingly.
- Web Search may be ZDR‑eligible but is not HIPAA‑eligible and is not covered by a BAA; do not process PHI with it.
- Images and files may be scanned for safety; in rare cases flagged content can be retained for manual review.
Operational tips
- Explicitly set and enforce store=false where applicable; restrict projects to ZDR and allow only approved endpoints.
- Create an allowlist of models and endpoints for clinical AI solutions; block all non‑approved features at the gateway.
- Document your retention stance and ensure downstream logs, traces, and caches never include PHI.
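An endpoint allowlist can be enforced with a simple path check at the gateway before any request leaves your network. A minimal sketch, using endpoint paths from the lists above (the gateway function itself is hypothetical, not part of any OpenAI SDK):

```python
# Sketch of a gateway-side allowlist: only ZDR-eligible endpoints may carry PHI.
# Endpoint paths are taken from the ZDR-eligible list above; confirm eligibility
# against your BAA and current OpenAI documentation.
ZDR_ALLOWLIST = {
    "/v1/chat/completions",
    "/v1/responses",
    "/v1/embeddings",
    "/v1/moderations",
    "/v1/audio/transcriptions",
    "/v1/audio/translations",
    "/v1/audio/speech",
}

def is_phi_request_allowed(path: str) -> bool:
    """Permit a PHI-bearing request only if its endpoint is on the allowlist."""
    return path in ZDR_ALLOWLIST

# /v1/files persists application state and must be blocked for PHI.
assert is_phi_request_allowed("/v1/chat/completions")
assert not is_phi_request_allowed("/v1/files")
```

Deny-by-default (an allowlist rather than a blocklist) means newly released endpoints stay blocked for PHI until someone deliberately reviews and approves them.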
ChatGPT for Healthcare
ChatGPT for Healthcare is an enterprise offering designed for regulated use, combining clinical AI solutions with PHI security controls. Content shared in this workspace is not used to train foundation models, and you can request a BAA to support HIPAA‑compliant use.
Ready to simplify HIPAA compliance?
Join thousands of organizations that trust Accountable to manage their compliance needs.
Enterprise security and governance
- Role‑based access control (RBAC), SAML SSO, SCIM provisioning, and detailed audit logs
- Data residency options and customer‑managed encryption keys for stronger data processing safeguards
- Configurable data retention controls and no training on your data
Using ChatGPT for Healthcare responsibly
- Scope the tool to approved workflows (e.g., summarization, drafting, clinical search); clinicians retain final decision‑making.
- Avoid unapproved third‑party connectors for PHI; ensure any integrations are vetted and contractually covered.
- Disable or restrict non‑HIPAA‑eligible features (e.g., web search) in clinical workspaces.
Configuring OpenAI Services for Compliance
API configuration checklist
- Request approval for Zero Data Retention and set the project’s retention mode accordingly.
- Use only ZDR‑eligible endpoints; block assistants, threads, files, and videos for PHI.
- Disable background mode and extended prompt caching for PHI workloads.
- Enforce network egress controls; never transmit PHI to unapproved tools or connectors.
- Encrypt PHI in transit and at rest; prefer customer‑managed keys where supported.
- Centralize audit logging; avoid storing prompts/responses with PHI in developer logs.
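To keep prompts and responses with PHI out of developer logs, one approach is a logging filter that strips those fields before records reach any sink. A sketch using Python's standard logging module (the `prompt` and `response` attribute names are illustrative assumptions, not a standard):

```python
# Sketch: a logging filter that redacts prompt/response attributes before
# records reach any log sink, so developer logs never contain PHI.
# The attribute names ("prompt", "response") are illustrative assumptions.
import logging

class StripPHIFilter(logging.Filter):
    REDACTED_FIELDS = ("prompt", "response")

    def filter(self, record):
        for field in self.REDACTED_FIELDS:
            if hasattr(record, field):
                setattr(record, field, "[REDACTED]")
        return True  # keep the record, minus the sensitive fields
```

Attach the filter to every handler (or the root logger) so it applies uniformly, and pair it with a review of any tracing or APM tooling that might capture request bodies independently.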
ChatGPT configuration checklist
- Provision users via SSO/SCIM; apply least‑privilege roles to clinical and administrative groups.
- Enable audit logs; review access regularly and revoke stale accounts promptly.
- Restrict or disable features not covered by your BAA (e.g., web search, unvetted apps).
- Turn off cross‑workspace data sharing; keep PHI confined to the healthcare workspace.
Programmatic PHI security controls
- Implement input filtering and redaction to enforce the minimum necessary standard.
- Use data loss prevention (DLP), request-size limits, and structured prompts to avoid over‑sharing PHI.
- Continuously test prompts and outputs for leakage risks; add guardrails for sensitive attributes.
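Input redaction can be sketched with pattern matching over a few direct identifiers. The patterns below are illustrative only; real de-identification (for example, covering HIPAA Safe Harbor's 18 identifier categories) requires far broader coverage and validation:

```python
# Sketch: regex-based redaction of a few direct identifiers before text is
# included in a prompt. Patterns are illustrative only and NOT a complete
# de-identification solution.
import re

PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def redact(text: str) -> str:
    """Replace matched identifiers with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Run redaction server-side before the prompt is assembled, and log only the redacted form, so a single control point supports both the minimum necessary standard and your logging posture.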
Approval Process for BAAs
To obtain a BAA, contact OpenAI with your intended use cases and technical architecture. OpenAI will confirm that your workflow uses ZDR‑eligible endpoints (or an eligible enterprise workspace), assess data processing safeguards, and review third‑party exposures. You’ll receive next‑step guidance, and once executed, the BAA will govern PHI handling alongside your configuration obligations.
HIPAA Compliance Requirements
HIPAA compliance spans administrative, physical, and technical safeguards. A BAA is necessary but not sufficient; you must put controls around how PHI is collected, processed, stored, and disclosed.
Administrative safeguards
- Risk analysis and management; policies for PHI lifecycle and the minimum necessary rule
- Workforce training, sanction policies, and vendor management with clear BAAs
- Incident response and breach notification playbooks tested against realistic scenarios
Technical safeguards
- Unique user identification, RBAC, and SSO; multi‑factor authentication for privileged roles
- Encryption in transit and at rest; customer‑managed keys where available
- Comprehensive audit logs, immutable retention for forensic needs, and egress restrictions
Privacy and disclosure controls
- Apply de‑identification when possible; strip direct identifiers before prompts
- Prohibit PHI in non‑HIPAA‑eligible features and endpoints; monitor and block violations
- Regularly validate outputs for accuracy and appropriateness; clinicians make final decisions
Limitations of Standard API Endpoints
- Do not send PHI to assistants, threads, conversations, vector stores, files, batches, or videos; they persist application state and are not ZDR‑eligible.
- Background mode and extended prompt caching store intermediate data; exclude them from PHI flows.
- Web Search is not HIPAA‑eligible and not covered by a BAA; disable it for clinical work.
- Hosted code execution features are not compatible with ZDR; avoid them for PHI or use a non‑PHI project.
- Third‑party connectors and external tools fall outside OpenAI’s BAA unless separately contracted; treat them as out of scope for PHI.
Conclusion
OpenAI can be used in a HIPAA‑compliant manner when you execute a Business Associate Agreement, restrict usage to Zero Data Retention–eligible capabilities, and enforce robust PHI security controls. ChatGPT for Healthcare streamlines these needs with enterprise governance and data processing safeguards designed for regulated environments.
Design your workflows for the minimum necessary standard, disable non‑eligible features, and continuously monitor usage. With the right architecture and controls, clinical teams can safely realize the benefits of modern AI while protecting patient privacy.
FAQs
How can healthcare organizations obtain a BAA with OpenAI?
Engage your OpenAI representative (or the designated BAA intake) with your use case, architecture, and endpoint list. OpenAI reviews requests case‑by‑case, confirms ZDR‑eligible usage, and then executes a BAA that governs PHI processing. Prepare documentation on data flows, third‑party tools, and your PHI security controls to accelerate approval.
What OpenAI services are covered by HIPAA compliance?
Coverage depends on having a signed BAA and using eligible services. For the API, only Zero Data Retention–eligible endpoints are in scope. For ChatGPT, ChatGPT for Healthcare (and certain sales‑managed enterprise offerings) can include a BAA; ChatGPT Business is not eligible. Features like web search and endpoints that persist application state are not HIPAA‑eligible.
How does zero data retention protect PHI?
ZDR prevents your customer content from being stored in abuse‑monitoring logs and forces behaviors (such as treating store as false) that avoid retaining prompts and responses. This reduces the risk of inadvertent PHI storage within the service. You still must avoid modes that persist application state and ensure downstream systems never log PHI.
What configurations are necessary for HIPAA compliance?
Execute a BAA, enable ZDR at the org or project level, restrict usage to ZDR‑eligible endpoints, block non‑eligible features (e.g., web search, hosted code execution), and enforce RBAC, SSO/SCIM, audit logs, encryption, and egress controls. Apply the minimum necessary standard, redact identifiers where possible, and continuously test workflows to verify PHI security controls.