# learning-mcp

HTTP MCP server exposing the pgvector learnings DB. Replaces direct `psql`/ssh access used by `rag-query`, `learn-seed`, `learn-inject`, `learn-context`.
- Transport: streamable HTTP (`/mcp`)
- Port: 9221
- Backend: `learnings_embeddings` on the infra postgres container
- Embeddings: LiteLLM `/v1/embeddings` (text-embedding-3-small, 1536 dims)
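Clients reach the server with ordinary MCP JSON-RPC over streamable HTTP. A minimal sketch of building a `tools/call` request (the argument field names are illustrative, not a fixed schema):

```javascript
// Build a JSON-RPC 2.0 tools/call envelope for the streamable-HTTP /mcp endpoint.
// Only the envelope shape is standard MCP; the example arguments are assumptions.
function buildToolCall(tool, args, id = 1) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// Usage sketch (assumes the server is running locally on 9221):
// const res = await fetch("http://127.0.0.1:9221/mcp", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Accept: "application/json, text/event-stream",
//   },
//   body: JSON.stringify(buildToolCall("learning_query", { query: "pgvector tuning" })),
// });

const req = buildToolCall("learning_query", { query: "pgvector tuning" });
console.log(req.method); // "tools/call"
```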
## Tools

| Tool | Replaces |
|---|---|
| `learning_query` | `rag-query` |
| `learning_search_fts` | `pg_search_fts` in `pglearn.sh` |
| `learning_inject` | `learn-inject` |
| `learning_context` | `learn-context` |
| `learning_add` | `pg_insert_learning` |
| `learning_mark_applied` | `pg_mark_applied` |
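Each tool replaces its shell helper one-for-one; for example, a `rag-query` invocation becomes a `learning_query` call. A sketch of the call arguments — only the tool name is from this repo; the argument field names, including `format: "json"` for structured results, are assumptions about the schema:

```javascript
// Illustrative learning_query arguments; field names are assumed, not documented.
const callArgs = {
  name: "learning_query",
  arguments: {
    query: "how to rebuild the pgvector index",
    format: "json", // request structured results instead of plain text
  },
};
console.log(callArgs.name); // "learning_query"
```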
## Env

```
PGHOST=postgres          # docker service name on infra
PGPORT=5432
PGUSER=agiliton
PGPASSWORD=...           # from vault ag.postgres.agiliton_password
PGDATABASE=agiliton
LITELLM_URL=https://llm.agiliton.cloud
LITELLM_API_KEY=...      # vault litellm.ai_coder_key or similar
EMBEDDING_MODEL=text-embedding-3-small
SENTRY_DSN=              # optional
```
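A sketch of how the server might load this configuration — the defaults mirror the values above, the helper name and shape are illustrative, not the actual implementation:

```javascript
// Illustrative config loader; defaults match the documented env values.
// PGPASSWORD and LITELLM_API_KEY have no safe default, so they are required.
function loadConfig(env = process.env) {
  for (const key of ["PGPASSWORD", "LITELLM_API_KEY"]) {
    if (!env[key]) throw new Error(`Missing required env var: ${key}`);
  }
  return {
    pg: {
      host: env.PGHOST || "postgres",
      port: Number(env.PGPORT || 5432),
      user: env.PGUSER || "agiliton",
      password: env.PGPASSWORD,
      database: env.PGDATABASE || "agiliton",
    },
    litellmUrl: env.LITELLM_URL || "https://llm.agiliton.cloud",
    litellmApiKey: env.LITELLM_API_KEY,
    embeddingModel: env.EMBEDDING_MODEL || "text-embedding-3-small",
    sentryDsn: env.SENTRY_DSN || null, // optional
  };
}
```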
## Local dev

```sh
npm install

LITELLM_API_KEY=$(vault get litellm.ai_coder_key) \
PGHOST=localhost PGPORT=5433 \
node src/http-server.js

curl -s http://127.0.0.1:9221/health
```
## Deploy

Built as `gitea.agiliton.internal:3000/christian/learning-mcp:latest` and attached to the infra docker-compose stack on the postgres network. Registered in `/opt/apps/litellm/config.yaml` as a streamable-HTTP MCP server (CF-3081 pattern).
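The exact registration stanza depends on the LiteLLM version; a hedged sketch of what the `config.yaml` entry might look like, assuming a recent LiteLLM proxy with MCP gateway support (the service hostname and server key are assumptions):

```yaml
# Sketch only -- verify against your LiteLLM version's MCP gateway docs.
mcp_servers:
  learning_mcp:
    url: "http://learning-mcp:9221/mcp"   # docker service name on the shared network (assumed)
    transport: "http"                      # streamable HTTP
```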