Documentation · Self-host guide

From git clone
to first session.

Everything you need to run your own tutor on a small VPS, plug it into your favorite AI assistant, and start learning, all in roughly ten minutes.

v0.3 alpha · MIT licensed · Go 1.25 · SQLite, no CGO · github.com/ArnaudGuiovanna/tutor-mcp
Get going

Quick start

The fastest path: clone, build, run, connect. You'll have your tutor reachable from your AI assistant in under ten minutes.

i.
Clone & build
One go build, single binary, no CGO.
ii.
Set the secret
A 32+ char JWT_SECRET; nothing else is mandatory.
iii.
Expose HTTPS
Caddy, Nginx, or Tailscale Funnel; you pick.
iv.
Plug your AI
Add the URL as a custom MCP connector.
Just want to test locally? Skip step iii and use Claude Code with http://localhost:3000/mcp. No domain, no TLS, no public IP needed.
Requirements

What you need

Tutor MCP is intentionally lean. A 2 GB VPS handles a small classroom; your laptop handles personal use.

Runtime
Go 1.25 + SQLite (pure Go)
OS
Linux (macOS & Windows for dev)
Memory
512 MB min / 2 GB recommended
Disk
2 GB SSD, room for SQLite + backups
Public endpoint
HTTPS required for web AI clients
Capacity
~200 learners single tenant, single node
Web AI assistants (Claude.ai, ChatGPT, Le Chat) require a public HTTPS URL with a valid certificate. CLI clients (Claude Code, Cline, Continue) work fine over localhost.
Run the server

Install & build

Two paths to a running binary; pick the one that fits. The pre-built binary is the fastest; building from source is for contributors or anyone who wants to pin to a specific commit.

Download a binary

Single binary, no toolchain. Linux & macOS, amd64 & arm64. The example below grabs Linux amd64.

$ curl -sSL https://github.com/ArnaudGuiovanna/tutor-mcp/releases/download/v0.3.0-alpha.1/tutor-mcp_v0.3.0-alpha.1_linux_amd64.tar.gz | tar -xz
$ ./tutor-mcp
[info] tutor-mcp listening on :3000
[info] db opened at ./data/runtime.db

For other platforms, swap linux_amd64 for linux_arm64, darwin_amd64, or darwin_arm64. Full list and SHA256SUMS on the release page — verify with sha256sum -c SHA256SUMS if you care.
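If `sha256sum -c` is new to you, here are the mechanics on a throwaway file; in practice the SHA256SUMS file comes from the GitHub release page and lists the real archives.

```shell
# Illustrative only: build and verify a checksum file locally.
cd "$(mktemp -d)"
printf 'demo payload\n' > tutor-mcp_demo.tar.gz
sha256sum tutor-mcp_demo.tar.gz > SHA256SUMS
sha256sum -c SHA256SUMS    # prints: tutor-mcp_demo.tar.gz: OK
```

A non-zero exit code (and a `FAILED` line) means the download doesn't match what was published.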

Pin the version in production. The URL above is tagged v0.3.0-alpha.1; bump it deliberately when a new release ships rather than chasing latest.

Build from source

The whole tree is one Go module; no Docker, no Node, no Python. Useful if you want to contribute or pin to a specific commit.

$ git clone https://github.com/ArnaudGuiovanna/tutor-mcp.git
$ cd tutor-mcp
$ go build -o tutor-mcp
$ ./tutor-mcp
[info] tutor-mcp listening on :3000
[info] db opened at ./data/runtime.db

Environment & config

The binary reads its configuration from environment variables. Only JWT_SECRET is mandatory.

# .env or systemd EnvironmentFile
JWT_SECRET=replace-with-32-char-random-string
DB_PATH=./data/runtime.db
BACKUP_DIR=./backups
BACKUP_RETENTION_DAYS=14
PORT=3000
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/… # optional
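JWT_SECRET should be genuinely random, not a memorable phrase. One way to generate a suitable value, assuming openssl is installed:

```shell
# 32 random bytes, base64-encoded: a 44-character string,
# comfortably over the 32-character minimum.
openssl rand -base64 32
```

Paste the output into your .env (or systemd EnvironmentFile) and never reuse it across deployments.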

Reverse proxy & HTTPS

Web AI clients require HTTPS. Caddy gives you automatic Let's Encrypt with two lines:

# /etc/caddy/Caddyfile
tutor.your-domain.com {
  reverse_proxy localhost:3000
}

Reload Caddy (caddy reload) and your tutor is live at https://tutor.your-domain.com/mcp. That's the URL you'll paste into your AI provider.
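If you'd rather use Nginx (one of the options listed above), a minimal sketch follows; it assumes certificates already provisioned (for example by certbot), and the domain and paths are placeholders to adapt:

```nginx
# /etc/nginx/sites-available/tutor-mcp (sketch, not a drop-in config)
server {
    listen 443 ssl;
    server_name tutor.your-domain.com;

    ssl_certificate     /etc/letsencrypt/live/tutor.your-domain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/tutor.your-domain.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_http_version 1.1;
        # MCP responses may stream; don't buffer or time them out early
        proxy_buffering off;
        proxy_read_timeout 1h;
    }
}
```

Unlike Caddy, Nginx won't manage certificate renewal for you; that's certbot's job.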

Run as a service

A user systemd unit keeps things running and ties cleanly into the documented backup timer.

$ systemctl --user enable --now tutor-mcp
$ systemctl --user enable --now tutor-mcp-backup.timer
$ journalctl --user -u tutor-mcp -f
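The commands above assume the units already exist. If you need to write the service file yourself, a minimal sketch (install paths are illustrative; adjust to wherever you placed the binary and .env):

```ini
# ~/.config/systemd/user/tutor-mcp.service (sketch)
[Unit]
Description=Tutor MCP server
After=network-online.target

[Service]
WorkingDirectory=%h/tutor-mcp
EnvironmentFile=%h/tutor-mcp/.env
ExecStart=%h/tutor-mcp/tutor-mcp
Restart=on-failure

[Install]
WantedBy=default.target
```

Run `loginctl enable-linger $USER` once so user units keep running after you log out.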
Connect a provider

Plug into your AI

Once your server is reachable, registering it inside your assistant takes about a minute. The wording differs between providers; the moves don't.

Claude (claude.ai), Pro / Max / Team / Enterprise

  1. Open Settings → Connectors.
  2. Click the + next to Connectors.
  3. Fill in: Name = Tutor MCP, Server URL = https://your.domain/mcp.
  4. Click Add and complete the OAuth login.
Free tier doesn't expose connectors yet; you'll need a Pro plan at minimum.
Claude Code (CLI)

Plug into Claude Code

If you're using Claude Code in your terminal, drop a .mcp.json file in your project root (or ~/.claude/mcp.json globally) and you're connected.

{
  "mcpServers": {
    "tutor-mcp": {
      "type": "http",
      "url": "http://localhost:3000/mcp"
    }
  }
}

Swap localhost:3000 for your.domain if your server isn't local.

Other clients

Local & alternative clients

The MCP protocol is open. Tutor MCP speaks plain HTTP MCP, so any client that supports custom MCP servers works, including those wired to local models.

  • Cline: VS Code extension with MCP and local-LLM support.
  • Continue: IDE assistant with MCP support.
  • OpenWebUI: self-hosted ChatGPT-style frontend.
  • Custom client: any LLM that can call MCP tools (Ollama-backed, llama.cpp, vLLM…).
The cognitive engine doesn't care which model is on the other end; it only cares that something can call its tools.
Verify

Test your setup

Once your provider is connected, send this prompt in a fresh chat. The assistant should call two MCP tools and reply with a short pedagogical brief.

Turn-key check

Drop this into your chat

« Trigger Tutor MCP. Set me up to learn Go in three weeks. Lecture-led style. »
The assistant calls get_learner_context
The assistant calls get_cockpit_state (or sets up a domain first)
You see a coaching reply that names a concept and a phase

If any step fails, jump to Troubleshooting below.

Operate

Backup & restore

Your learner model lives in a single SQLite file. Two systemd units handle online backups; an off-host copy (Tailscale rsync, S3, or scheduled SSH pull) keeps you safe from a dead disk.

$ systemctl --user enable --now tutor-mcp-backup.timer
$ systemctl --user start tutor-mcp-backup.service

To restore from a snapshot:

$ systemctl --user stop tutor-mcp
$ mv ./data/runtime.db ./data/runtime.db.broken-$(date -u +%FT%TZ)
$ rm -f ./data/runtime.db-shm ./data/runtime.db-wal
$ cp ./backups/runtime-2026-05-05T03-30-00Z.db ./data/runtime.db
$ systemctl --user start tutor-mcp
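Before starting the server again, it's worth running SQLite's built-in integrity check on the restored file, assuming the sqlite3 CLI is installed. The demo below builds a throwaway database; on your server, run the same `PRAGMA integrity_check;` against ./data/runtime.db:

```shell
# On a healthy database, PRAGMA integrity_check prints exactly "ok".
# (Throwaway demo database; substitute ./data/runtime.db in production.)
db=$(mktemp -d)/demo.db
sqlite3 "$db" "CREATE TABLE t(id INTEGER PRIMARY KEY, v TEXT); INSERT INTO t(v) VALUES ('hello');"
sqlite3 "$db" "PRAGMA integrity_check;"    # prints: ok
```

Anything other than a single `ok` means the snapshot is damaged; reach for an older backup instead.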
Test your restore quarterly. A backup you've never restored is a backup you don't have.
When things go sideways

Troubleshooting

The assistant never calls the tutor

Check the logs for missing pipeline decision entries:

$ journalctl --user -u tutor-mcp -f | grep -E "pipeline decision|interaction recorded"

If no decisions are logged, the LLM isn't calling get_next_activity; re-trigger explicitly with: « Use Tutor MCP next_activity. »

OAuth handshake fails

Make sure your domain has a valid TLS certificate. curl -I https://your.domain/mcp should return 200 with no warning. AI providers reject self-signed certs.

Repeated phase fallback (NoFringe)

An empty candidate pool means either you haven't defined a domain yet or the goal is too narrow. Run tutor.init_domain with a goal description and three to five concept names.

Need more help? Open an issue on GitHub or check the full OPERATIONS.md runbook.