OpenClaw Data Privacy: What Data Goes Where
Complete guide to OpenClaw data privacy in 2026. What data leaves your instance, what stays local, LLM provider data flows, and GDPR considerations.
Understanding OpenClaw's data flows is essential before you give your AI agent access to your email, calendar, and other sensitive services. This guide maps exactly what data goes where.
The Data Flow Architecture
OpenClaw is a local orchestration layer. Data flows:
[Your Data Sources] → [OpenClaw Instance] → [LLM API] → [Response] → [OpenClaw] → [You]
Your data sources (email, calendar, files) are accessed by OpenClaw running on your infrastructure. That data is then sent to your configured LLM provider's API for processing.
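The flow above can be sketched in Python. Every function name here (`fetch_local_data`, `call_llm`, `run_task`) is a hypothetical illustration of the architecture, not OpenClaw's actual internals:

```python
# Illustrative sketch of the orchestration loop. These names are
# invented for this example and are NOT OpenClaw's real API.

def fetch_local_data(source: str) -> str:
    # Runs on YOUR infrastructure: reads email/calendar/files locally.
    return f"contents of {source}"

def call_llm(prompt: str) -> str:
    # The only step where data leaves your infrastructure:
    # the constructed prompt is sent to your configured LLM API.
    return f"LLM response to {len(prompt)} chars of prompt"

def run_task(source: str, instructions: str) -> str:
    local_data = fetch_local_data(source)       # stays local until...
    prompt = f"{instructions}\n\n{local_data}"  # ...it is embedded in the prompt
    return call_llm(prompt)                     # ...and transmitted to the provider

print(run_task("inbox", "Summarize unread email"))
```

The point of the sketch: local data access and prompt construction happen on your machine; only the final, assembled prompt crosses the network.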
What Stays Local (Never Leaves Your Infrastructure)
- Your `openclaw.json` configuration file
- API keys (stored locally or in nacre.sh's encrypted vault)
- Conversation history (stored in your local OpenClaw database)
- Downloaded files and attachments
- Scheduled task configurations
- Skill settings and customizations
What Goes to Your LLM Provider
When OpenClaw processes a task, it constructs a prompt containing:
- Your agent's system instructions
- Relevant conversation context
- The specific content being processed (email text, calendar data, etc.)
- Tool call results
This goes to whichever LLM API you've configured (Anthropic, OpenAI, etc.). This is the same data exposure as using those services directly — but the key difference is you control what gets sent and when.
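As a concrete illustration, here is roughly what a request body to Anthropic's Messages API contains. This is a sketch only: the instructions and email text are invented placeholders, the model id is just an example, and the prompt OpenClaw actually builds will differ:

```python
import json

# Sketch of an Anthropic Messages API request body.
# Placeholder content; not what OpenClaw literally sends.
payload = {
    "model": "claude-sonnet-4-20250514",  # example model id
    "max_tokens": 1024,
    # Your agent's system instructions:
    "system": "You are my email assistant. Summarize; never send mail.",
    "messages": [
        # Conversation context plus the content being processed:
        {"role": "user", "content": "Summarize this email:\n<email body goes here>"}
    ],
}

# Everything inside this JSON body is transmitted to the provider.
print(json.dumps(payload, indent=2))
```

Reading the payload makes the exposure concrete: whatever you place in `system` and `messages` leaves your infrastructure; anything you withhold from the prompt never does.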
LLM Provider Data Policies
- Anthropic (Claude): Doesn't use API data to train models by default. Data retained for safety monitoring.
- OpenAI (GPT-4o): API data is not used for training by default. Data is retained per OpenAI's API data-usage policy.
- Local models (Ollama): No data leaves your network. Complete privacy.
nacre.sh Data Privacy
nacre.sh acts as the hosting layer for your OpenClaw instance. nacre.sh sees:
- The traffic necessary to run your container
- Encrypted configuration files (keys are not accessible to nacre.sh)
- Usage metrics (for billing and capacity planning)
nacre.sh does NOT see the content of your agent's conversations or the data it processes.
GDPR Considerations
If you process EU personal data:
- Self-hosted: You are the data controller; document your lawful basis
- nacre.sh: nacre.sh provides a Data Processing Agreement (DPA) upon request
- LLM providers: Anthropic and OpenAI offer DPAs; execute one before sending EU personal data through their APIs
Frequently Asked Questions
Does OpenAI/Anthropic read my emails when OpenClaw processes them?
The email content is included in the API prompt. Anthropic and OpenAI's API policies state they don't use API data for model training by default, but the data is transmitted to and temporarily processed by their infrastructure.
Can I use OpenClaw without sending any data to external APIs?
Yes — configure Ollama with a local model (like Hermes or Llama 3). Your data never leaves your network.
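A minimal configuration sketch for this setup, assuming an OpenAI-compatible provider block in `openclaw.json` (the field names below are hypothetical; check the OpenClaw docs for the real schema). Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default:

```json
{
  "provider": {
    "type": "openai-compatible",
    "baseUrl": "http://localhost:11434/v1",
    "model": "llama3",
    "apiKey": "ollama"
  }
}
```

With a localhost `baseUrl`, prompts and responses never cross your network boundary; the `apiKey` value is a placeholder, since Ollama does not require one.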
Is nacre.sh GDPR compliant?
nacre.sh is GDPR compliant and can provide a DPA for business customers. EU-region hosting is available on request.
nacre.sh
Run OpenClaw without the server headaches
Dedicated instance, automatic TLS, nightly backups, and 290+ LLM integrations. Live in under 90 seconds from $12/month.
Deploy your agent →