
Setup Guide

Get Lodis running in your AI tools in under a minute.

1. MCP Client Setup

Add the Lodis MCP server to your client’s config file. Same JSON for every client — just change the file path.

~/.claude.json

{
  "mcpServers": {
    "lodis": {
      "command": "npx",
      "args": ["-y", "lodis"]
    }
  }
}

Or add to your project's .mcp.json for per-project config.
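If you already have other MCP servers configured, merge the Lodis entry into the existing file rather than overwriting it. A minimal sketch using only the Python standard library (the config path is whatever your client uses, e.g. `~/.claude.json`):

```python
import json
from pathlib import Path

def add_lodis(config_path: str) -> None:
    """Merge the Lodis entry into an MCP config file,
    preserving any servers already listed."""
    path = Path(config_path).expanduser()
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["lodis"] = {
        "command": "npx",
        "args": ["-y", "lodis"],
    }
    path.write_text(json.dumps(config, indent=2))

# add_lodis("~/.claude.json")
```

The `setdefault` call is what keeps existing entries intact: it only creates the `mcpServers` object if it is missing.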

2. System Prompt Configuration

Tell your AI to use Lodis by default. Add this snippet to your system prompt or instructions file:

Use Lodis MCP tools for all persistent memory. At the start of
conversations, call memory_search with relevant terms to retrieve
context. When the user states a preference, corrects an assumption,
shares personal context, or provides information useful across future
conversations, save it with memory_write.

Where to put it

Claude Code: ~/.claude/CLAUDE.md

Add to your global ~/.claude/CLAUDE.md (or a project-level CLAUDE.md). Claude Code has a built-in file-based auto-memory system that competes with Lodis, so it needs more than the basic snippet: the override below explicitly disables it.

Claude Code requires a stronger override

## Memory — CRITICAL OVERRIDE

**DO NOT use the built-in file-based auto-memory system.** Never create,
read, or write to `MEMORY.md` or any files in the
`~/.claude/projects/.../memory/` directory. That system is fully
replaced by Lodis.

**USE Lodis MCP tools for ALL persistent memory.** Lodis is the
single source of truth.

### At conversation start
- Call `memory_search` with terms relevant to the user's request to
  retrieve prior context

### When to write
- `memory_write` — user states a preference, corrects an assumption,
  shares personal context, or says anything useful for future conversations
- `memory_confirm` — user validates a recalled fact
- `memory_correct` — user corrects a stored memory
- `memory_flag_mistake` — a memory turns out to be wrong

### When to search
- `memory_search` — before answering questions where prior context
  would help
- `memory_context` — token-budget-aware retrieval for building responses
- `memory_briefing` — entity summaries (people, projects, etc.)

### Rules
- Never duplicate memories to both Lodis and the built-in file system
- Treat Lodis memories as the persistent record — they survive across
  all MCP-connected tools (Claude Code, Cursor, Windsurf, Claude Desktop)
- When the user says "remember this," save immediately via `memory_write`

Claude Desktop: System prompt in Settings

Open Settings → General → System Prompt. Paste the snippet above. Claude Desktop will include it in every conversation.

Cursor: .cursorrules or Rules settings

Add to your project's .cursorrules file, or go to Settings → Rules → User Rules to set it globally.

Windsurf: System prompt in Settings

Open Settings → AI → System Prompt. Paste the snippet above.

3. First Run — Seeding Your Memory

Once Lodis is connected, tell your AI assistant:

“Help me set up Lodis”

Your assistant will call memory_onboard and:

  1. Scan your connected tools (calendar, email, GitHub) to extract people, projects, and context
  2. Interview you with targeted questions based on what it found
  3. Seed 30-50 memories with entity types and connections

Importing existing memories

If you already have memories in other tools, your AI can import them:

Claude Code auto-memory: "Import my Claude memories into Lodis"
ChatGPT memory export: "Import this ChatGPT memory export into Lodis"
Cursor rules: "Import my .cursorrules as Lodis preferences"
Git config: "Import my gitconfig into Lodis"

Review your memories at localhost:3838. Confirm what’s right, correct what’s wrong.

4. Cloud Setup

Cloud mode syncs your memories across devices via Turso. All data encrypted at rest with AES-256-GCM.

  1. Sign up

     Create an account at app.lodis.ai

  2. Create an API token

     Go to Settings → API Tokens → Generate Token. Copy it — you’ll need it for MCP client config.

  3. Configure your MCP client

     Use the HTTP transport with your API token:

    {
      "mcpServers": {
        "lodis": {
          "type": "streamable-http",
          "url": "https://app.lodis.ai/api/mcp",
          "headers": {
            "Authorization": "Bearer YOUR_API_TOKEN"
          }
        }
      }
    }

Claude.ai — OAuth (zero config)

Claude.ai users can skip API tokens entirely. Connect via OAuth 2.1: go to your Claude.ai settings, add Lodis as an MCP server, and authorize through app.lodis.ai. No config files needed.

Already running locally? Use memory_migrate to move your existing memories to cloud.

5. Local HTTP Mode

Want to connect remote clients to a self-hosted Lodis instance? Run the HTTP server with Bearer token authentication:

Start the HTTP server

lodis --serve

Starts an HTTP MCP server on port 3939 with Bearer token authentication.

  1. Create an API token

     Open the dashboard at localhost:3838 → Settings → API Tokens → Generate Token.

  2. Configure your remote client
    {
      "mcpServers": {
        "lodis": {
          "type": "streamable-http",
          "url": "http://localhost:3939/mcp",
          "headers": {
            "Authorization": "Bearer YOUR_API_TOKEN"
          }
        }
      }
    }

This gives you the same remote access as cloud mode, but running entirely on your own machine.
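As a quick sanity check that the endpoint and token are wired up, note that a streamable-HTTP MCP client opens every session with a JSON-RPC `initialize` call. A sketch of that first request using only the Python standard library (the URL, token, and client name are placeholders for your own values; the payload shape follows the MCP handshake):

```python
import json
import urllib.request

def initialize_request(url: str, token: str) -> urllib.request.Request:
    """Build the JSON-RPC initialize request a streamable-HTTP
    MCP client sends when it first connects."""
    body = json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "probe", "version": "0.1.0"},
        },
    }).encode()
    return urllib.request.Request(url, data=body, headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    })

req = initialize_request("http://localhost:3939/mcp", "YOUR_API_TOKEN")
# With `lodis --serve` running, urllib.request.urlopen(req) performs
# the handshake; a 401 response means the Bearer token doesn't match.
```

The same request shape works against the cloud endpoint (`https://app.lodis.ai/api/mcp`) with your cloud API token.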