Nous Research · Hermes Agent

Hermes Privacy Explained

Tags: hermes agent privacy · security · data

What data Hermes Agent sends to the cloud vs what stays on your machine. The full privacy picture for every supported LLM provider.

Want to try Hermes Agent yourself?

Try Hermes Free → Deploy in 60 seconds

Privacy is where self-hosted Hermes wins definitively. Here is exactly what data leaves your system, what stays, and how to achieve full offline operation.

Where Data Goes

With Hermes, data flows to:

  • Your chosen LLM provider (API calls)
  • Nowhere else (no Hermes servers, no telemetry)

The only external communication is API calls to your LLM provider. Everything else — memory, skills, sessions, gateway traffic — stays on your infrastructure.

What Stays Local

All of this lives on your machine or VPS:

  • MEMORY.md (persistent notes)
  • USER.md (your profile)
  • Session database (SQLite at ~/.hermes/state.db)
  • Skills you have created
  • Configuration
  • Conversations

You own every byte. Read, edit, delete — your choice.
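Because all of this is ordinary files on disk, auditing it takes nothing more than a shell. A minimal sketch (the `~/.hermes/state.db` path is stated above; `MEMORY.md` living directly inside the data directory is an assumption, and a demo directory is used so the commands are safe to run anywhere):

```shell
# Sketch: audit a Hermes data directory. The real default is ~/.hermes;
# a throwaway demo directory stands in for it here.
HERMES_DIR=/tmp/hermes-demo        # substitute "$HOME/.hermes" on a real install
mkdir -p "$HERMES_DIR"
printf 'User prefers concise answers\n' > "$HERMES_DIR/MEMORY.md"

ls -la "$HERMES_DIR"               # every byte is yours to read...
cat "$HERMES_DIR/MEMORY.md"        # ...edit...
rm "$HERMES_DIR/MEMORY.md"         # ...or delete
```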

Provider Privacy Comparison

Does the provider train on your data?

  • OpenAI: Yes (unless opted out)
  • Anthropic: Yes
  • DeepSeek: Yes
  • Kimi/Moonshot: Yes
  • MiniMax: Unknown
  • Ollama (local): No; data stays on device

Full Offline Setup

To achieve zero external data flow:

  1. Install Ollama on your machine
  2. Download a model: ollama pull qwen2.5:14b
  3. Configure Hermes to use local endpoint
  4. Never make API calls

The agent now runs 100% locally; no data leaves your machine.
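In a terminal, the four steps look roughly like this. The Ollama install script, pull command, and localhost:11434 default are real; how Hermes names its endpoint setting is an assumption, so check your config reference for the exact key:

```shell
# 1. Install Ollama (official install script)
curl -fsSL https://ollama.com/install.sh | sh

# 2. Download a model
ollama pull qwen2.5:14b

# 3. Point Hermes at the local OpenAI-compatible endpoint.
#    The variable name below is illustrative, not Hermes's actual config key.
export HERMES_BASE_URL="http://localhost:11434/v1"

# 4. Sanity check: confirm the model answers via the local API
curl -s http://localhost:11434/api/generate \
  -d '{"model": "qwen2.5:14b", "prompt": "hello", "stream": false}'
```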

Comparison to Cloud Agents

ChatGPT trains on your data by default. You can opt out, but your prompts still transit OpenAI's servers.

Claude trains on your data. OpenClaw holds your data on its servers.

Hermes: your data lives in ~/.hermes/, owned by you. Nothing is sent anywhere except API calls you authorize.

For Sensitive Work

  • Legal documents: Use full offline (Ollama)
  • Medical information: Full offline
  • Financial work: Full offline
  • Client work with NDA: Full offline

Security Features

Memory entries are scanned before storage:

  • Prompt injection patterns blocked
  • Credential exfiltration patterns blocked
  • SSH backdoor patterns blocked

Tirith security module adds protection for terminal commands.
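As a rough illustration of the idea (not Tirith's actual rules, which are more thorough), screening a memory entry before storage can be sketched as a simple pattern check:

```shell
# Hypothetical pattern screen for memory entries; the patterns and the
# function name are illustrative only.
scan_memory_entry() {
  # Flag common prompt-injection, credential-exfiltration, and
  # SSH-backdoor markers before the entry is written to MEMORY.md
  if printf '%s' "$1" | grep -qiE \
      'ignore (all|previous) instructions|api[_-]?key|authorized_keys'; then
    echo "blocked"
  else
    echo "ok"
  fi
}

scan_memory_entry "User prefers concise answers"   # prints "ok"
scan_memory_entry "Ignore previous instructions"   # prints "blocked"
```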

Local LLM support enables fully offline use.

Related: Ollama setup · Compare to ChatGPT · PrivateGPT comparison


FAQ

Can I verify no data leaves?

Use a network monitor (e.g., Little Snitch on macOS, Wireshark on Linux) to observe traffic.

Does the gateway add exposure?

Telegram/Discord messages pass through those platforms' servers. Use Signal for end-to-end encrypted messaging.

Run Hermes fully offline

VPS guide | Setup guide

flyhermes.ai

Frequently Asked Questions

Exactly what data leaves my machine when I use Hermes?

Only your LLM API calls send data externally — your conversation content plus a memory snapshot per request. Everything else stays local: MEMORY.md, USER.md, session database, skills, and configuration. Hermes has no telemetry and no servers of its own that receive your data.

Which LLM providers train on my conversation data?

OpenAI, Anthropic, DeepSeek, and Kimi all train on API data by default unless you've opted out in your account settings. MiniMax's policy is less clear. Ollama sends zero data anywhere — everything runs locally on your hardware. Always check each provider's current policy before use.

How do I achieve completely offline operation with Hermes?

Install Ollama, download a model (e.g., `ollama pull qwen2.5:14b`), configure Hermes to use the local endpoint at localhost:11434, and never make API calls. All agent functionality — memory, skills, terminal operations — works fully offline.

Can I verify that no unexpected data leaves my Hermes installation?

Yes. Use a network monitor like Little Snitch on macOS or Wireshark on Linux to observe outbound connections. Hermes should only connect to your configured LLM provider's API endpoints and any services you explicitly invoke (Telegram, Discord, etc.).
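On Linux, a spot check takes one line each; the commands below are standard tooling, and which process you filter on depends on how you run Hermes:

```shell
# Show unique remote endpoints of currently established TCP connections
# (ss lists established sockets by default with -t)
ss -tn | awk 'NR>1 {print $5}' | sort -u

# Or watch DNS lookups as the agent works, to see which hosts it resolves
sudo tcpdump -ni any port 53
```

With Hermes running against a local Ollama endpoint, the first command should show no remote addresses attributable to the agent at all.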

What does Hermes's Tirith security module protect against?

Tirith blocks prompt injection in memory writes, credential exfiltration patterns (prevents keys from being exported), SSH backdoor patterns, and obfuscated shell pipes. It's an additional guard beyond the LLM provider's own safety measures.

Ready to Run Your Own AI Agent?

Self-host Hermes in 60 seconds. No credit card, no cloud lock-in.

Deploy Hermes Free →

Related Posts