NEW · Open source release v0.3

Turn any LLM into an intelligent tutor.

Self-learning is a superpower. Tutor MCP is an open-source Model Context Protocol server that brings 50 years of cognitive science into your favorite AI assistant.

Install in 30 seconds
# tutor-mcp · linux amd64 · single binary, no toolchain
$ curl -sSL https://github.com/ArnaudGuiovanna/tutor-mcp/releases/download/v0.3.0-alpha.1/tutor-mcp_v0.3.0-alpha.1_linux_amd64.tar.gz | tar -xz
$ ./tutor-mcp
Works with
Anthropic · OpenAI · Mistral · Gemini
Learning Go
Tutor MCP connected
You
Trigger Tutor MCP, I want to keep learning Go.
Assistant
Read your learner context · get_learner_context
Pulled cockpit signals · get_cockpit_state

Welcome back. You're on a 6-day streak and your autonomy is trending up, nice.

Heads up: goroutines have slipped to 41% retention after two days off. The spaced-repetition model is asking for a quick review before we open channels. I also noticed a tiny over-confidence bias, so we'll add a one-line prediction step before each answer.

Open Learner Model · Go
Phase · Instruction
Mastery · 5/12
Autonomy · 0.62 (↑ +0.08)
Calibration · +0.04 (over)
Streak · 6d
Concepts
Variables & types · solid
Slices & maps · solid
Error handling · strong
Interfaces · in progress
Goroutines · due now
Channels · locked
Context & cancellation · not started

Next up → a 5-minute goroutines refresher, then we unlock channels. Ready?

What you can do with it

Six ways people are using their tutor.

No course catalog. No textbook. Tell your AI what you want to learn; the tutor handles sequencing, pacing, and reviews, and knows when you've actually got it.

Pick up something new from scratch

« I want to learn Spanish before my trip in six weeks. » Tell your AI your goal and your deadline. The tutor maps the path, paces what comes next, and only opens new ground when the basics are solid.

Languages · coding · a new craft

Bring rusty knowledge back to life

« I studied statistics 8 years ago. » A short diagnostic exposes what's still solid, what's faded, and where the small wrong ideas hide. You revisit only what needs it, not a whole textbook.

Diagnostic · targeted recall

Prep for an exam, interview, or certification

« AWS Solutions Architect in three weeks. » Backwards-planned from your test date. Drills focus on your weak spots; reviews peak right before D-day, when memory matters most.

Bar exam · coding interview · AWS

Five focused minutes a day

« Just one thing today. » Open the chat in the morning. The tutor knows exactly which concept to revisit, delivers a single calibrated exercise, and you're done. No streaks, no homework log, no decision fatigue.

Microlearning · daily habit

Build your own bootcamp or e-learning track

« 12 weeks of data engineering, full-time, starting Monday. » Set the curriculum, the depth, and the pace. The tutor sequences modules on prerequisites, runs daily sessions, and checks mastery between phases. All the structure of a paid bootcamp or e-learning course, none of the cohort calendar.

Self-bootcamp · e-learning · self-paced

Turn any document into a course

« Turn this PDF into a six-week course. » Drop in a chapter, a paper, or lecture notes, and ask for the style: lecture-led, hands-on practice, Socratic dialogue, theory-first, or oral drill. The tutor maps the ideas and shapes a path that fits both the material and how you want to learn it.

PDF · pedagogical style on demand
Pedagogical styles you can ask for
Lecture-led · Hands-on practice · Socratic dialogue · Theory-first · Oral drill · Project-based · Case studies · Worked examples
Architecture

An LLM is brilliant. A tutor is rigorous.

Tutor MCP splits the brain in two. A deterministic cognitive engine decides what to teach next. An LLM does what it's best at: explain, reframe, encourage. Two halves, one tutor.

Deterministic engine

Tutor MCP

50 years of cognitive science compiled into five algorithms running in parallel loops. Predictable. Auditable. No hallucinations on what you've actually mastered.

BKT · FSRS · IRT · PFA · KST · Phase FSM · Motivation engine · Metacognitive loop
Generative coach

Your LLM

Trained on humanity's knowledge. Generates exercises on demand, calibrated by the engine's signals. Explains, encourages, reframes, with your full learner context in the prompt, every turn.

Content generation · Natural language · Coaching · Misconception diagnosis · Webhook nudges
BKT
Bayesian Knowledge Tracing

Tells the tutor how confident it should be that you've actually mastered a concept, not just that you happened to answer right today.
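The update itself is one application of Bayes' rule. A minimal sketch in Go; the guess, slip, and learn-rate values below are illustrative, not the engine's actual parameters:

```go
package main

import "fmt"

// bktUpdate performs one Bayesian Knowledge Tracing step.
// pKnow:   prior P(concept mastered)
// correct: was the latest answer right?
// guess:   P(correct | not mastered)
// slip:    P(incorrect | mastered)
// learn:   P(mastered after this attempt | not mastered before)
func bktUpdate(pKnow float64, correct bool, guess, slip, learn float64) float64 {
	var posterior float64
	if correct {
		posterior = pKnow * (1 - slip) / (pKnow*(1-slip) + (1-pKnow)*guess)
	} else {
		posterior = pKnow * slip / (pKnow*slip + (1-pKnow)*(1-guess))
	}
	// A learner can also pick the concept up during the attempt itself.
	return posterior + (1-posterior)*learn
}

func main() {
	p := 0.40 // cold-start prior
	for _, correct := range []bool{true, true, false} {
		p = bktUpdate(p, correct, 0.20, 0.10, 0.15)
		fmt.Printf("P(mastered) = %.3f\n", p)
	}
}
```

Because the guess probability is baked in, one lucky answer never flips a concept to "mastered"; a slip pulls the estimate right back down.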

FSRS
Free Spaced Repetition Scheduler

Decides when to bring a concept back, using stability and difficulty curves drawn from memory research.
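As a rough illustration, the FSRS v4 forgetting curve models retrievability as a function of elapsed time over memory stability; later FSRS revisions adjust the constants, and which variant the engine runs is an assumption here:

```go
package main

import "fmt"

// retrievability follows the FSRS v4 forgetting curve,
// R(t) = (1 + t/(9*S))^-1, where S is memory stability in days.
// (Newer FSRS revisions tweak the constants; this is illustrative.)
func retrievability(elapsedDays, stability float64) float64 {
	return 1 / (1 + elapsedDays/(9*stability))
}

// nextInterval solves R(t) = target for t, so the review lands exactly
// when predicted retention decays to the requested level.
func nextInterval(stability, target float64) float64 {
	return 9 * stability * (1/target - 1)
}

func main() {
	s := 4.0 // stability in days
	fmt.Printf("retention after 2 days: %.2f\n", retrievability(2, s))
	fmt.Printf("review due in %.1f days at a 90%% target\n", nextInterval(s, 0.9))
}
```

A neat property of this curve: at a 90% retention target, the scheduled interval equals the stability itself, and each successful review grows the stability for the next round.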

IRT
Item Response Theory

Calibrates each exercise to your current ability, no more "too easy" or "too hard" guesses.
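In the two-parameter logistic (2PL) form of IRT, for example, the match between learner ability and item difficulty is a logistic curve; treating 2PL as the engine's exact model family is an assumption:

```go
package main

import (
	"fmt"
	"math"
)

// irt2PL: probability that a learner of ability theta answers an item
// correctly, under the two-parameter logistic IRT model.
// a: discrimination (slope), b: difficulty (same scale as theta).
func irt2PL(theta, a, b float64) float64 {
	return 1 / (1 + math.Exp(-a*(theta-b)))
}

func main() {
	theta := 0.3 // current ability estimate
	for _, b := range []float64{-1, 0.3, 2} {
		fmt.Printf("item difficulty %+.1f -> P(correct) = %.2f\n", b, irt2PL(theta, 1.2, b))
	}
}
```

An item pitched exactly at the learner's ability lands near 50%, which is the kind of calibrated difficulty the blurb above is pointing at.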

PFA
Performance Factor Analysis

Weighs your wins and losses on each concept to predict how the next attempt is likely to go.
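PFA reduces to a logistic model over success and failure counts per concept. A sketch with illustrative weights (real PFA parameters are fitted from learner data, not hand-picked):

```go
package main

import (
	"fmt"
	"math"
)

// pfaPredict estimates P(next attempt correct) per Performance Factor
// Analysis: a logistic model over prior success and failure counts.
// beta: concept easiness; gamma/rho: per-success and per-failure weights.
func pfaPredict(successes, failures int, beta, gamma, rho float64) float64 {
	logit := beta + gamma*float64(successes) + rho*float64(failures)
	return 1 / (1 + math.Exp(-logit))
}

func main() {
	// Illustrative weights: each win helps, each loss hurts a bit less.
	fmt.Printf("3 wins, 0 losses: %.2f\n", pfaPredict(3, 0, -0.5, 0.6, -0.3))
	fmt.Printf("0 wins, 3 losses: %.2f\n", pfaPredict(0, 3, -0.5, 0.6, -0.3))
}
```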

KST
Knowledge Space Theory

Ensures concepts unlock only when their prerequisites are solid. No castle on sand.
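The gate itself is simple: a concept opens only once every prerequisite in the knowledge graph is mastered. A sketch echoing the demo's Go track; the prerequisite edge used here is an assumption for illustration:

```go
package main

import "fmt"

// unlocked reports whether a concept may be opened: all of its
// prerequisites must already be mastered. This is Knowledge Space
// Theory's "no castle on sand" gate.
func unlocked(concept string, prereqs map[string][]string, mastered map[string]bool) bool {
	for _, p := range prereqs[concept] {
		if !mastered[p] {
			return false
		}
	}
	return true
}

func main() {
	// Illustrative edge: channels depend on goroutines, as in the demo.
	prereqs := map[string][]string{"channels": {"goroutines"}}
	mastered := map[string]bool{}

	fmt.Println("channels unlocked:", unlocked("channels", prereqs, mastered))
	mastered["goroutines"] = true
	fmt.Println("channels unlocked:", unlocked("channels", prereqs, mastered))
}
```

This is why the demo shows channels as locked while goroutines sit at 41% retention: the gate stays shut until the prerequisite is solid again.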

AI-native

The engine decides what to teach. Your AI teaches it ten different ways.

Determinism alone is rigid. LLMs alone drift. Together, they're the first tutor that's both rigorous and infinitely adaptable, built on the model you already trust.

Infinite content

One concept, ten exercises, never the same. No item bank to curate, no editorial team to hire.

A hundred ways to explain

Analogy, theory, story, code, real-world example. The LLM keeps reframing until it actually clicks.

Any language, any tone

Spanish travel phrases or Latin grammar. Formal exam prep or playful drilling. The tutor matches you.

Multi-modal by default

Reads your PDF, your handwritten notes, the diagram on your whiteboard, the code in your repo.

Open source

Your tutor.
Your data. Your server.

No telemetry. No SaaS lock-in. No usage caps. Self-host on a 2 GB VPS and own every byte of your learner model.

Free forever.
Always yours.

Go 1.25, SQLite, single binary. Ten minutes to get from git clone to your first session.

MIT
Permissive license: fork, modify, ship, sell.
~200
Active learners on a single node, single tenant.
2 GB
RAM, recommended self-host footprint.
Get started

From clone to first
session in ten minutes.

Self-host on your own machine or a small VPS. Connect the AI you already use. Start learning. No accounts, no waitlist, no fees.

01

Run the server

Clone the repo, build the single Go binary, set a JWT secret. One command later, your tutor is live on a 2 GB VPS, or your laptop.

$ git clone https://github.com/ArnaudGuiovanna/tutor-mcp
$ cd tutor-mcp && go build -o tutor-mcp
$ JWT_SECRET=… ./tutor-mcp
02

Connect your AI

Add Tutor MCP as a custom MCP connector inside the assistant you already pay for. OAuth handles the handshake.

Anthropic
OpenAI
Mistral
Gemini
03

Tell it what you want to learn

Open your assistant, trigger the tutor, name your goal. From there, the cognitive engine takes over, pacing, gating, reviewing, encouraging.

Trigger Tutor MCP, I want to learn Go in three weeks.
FAQ

Questions you might
have.

Does it work with the AI I already use?
Yes. Tutor MCP doesn't ship an LLM; it plugs into yours. You connect it as a custom MCP server inside the assistant you already use (Claude, ChatGPT, Le Chat, Gemini). Your conversations stay with that provider; the tutor only stores your learner model on your server.
How is this different from Anki, Duolingo, or other "LLM tutor" apps?
Anki is a flashcard scheduler. Duolingo is an item bank with gamification. Most "LLM tutor" apps are a single prompt around a chatbot. Tutor MCP is the missing piece between them: a real Intelligent Tutoring System runtime, with five algorithms tracking mastery, retention, ability, autonomy, and calibration in real time, wrapped behind MCP so any LLM becomes the front-end.
Self-learning
is a superpower.

LLMs unlocked humanity's knowledge for everyone, but knowing isn't learning. Tutor MCP brings 50 years of intelligent tutoring research back into reach, so anyone, anywhere can build durable, deep understanding from any AI conversation. Self-hosted. Open source. Free, forever.

Built by Arnaud Guiovanna · MIT licensed · v0.3 alpha