# learning-mcp

HTTP MCP server exposing the pgvector learnings DB. Replaces the direct `psql`/`ssh` access used by `rag-query`, `learn-seed`, `learn-inject`, and `learn-context`.

- Transport: streamable HTTP (`/mcp`)
- Port: `9221`
- Backend: `learnings_embeddings` on the infra postgres container
- Embeddings: LiteLLM `/v1/embeddings` (`text-embedding-3-small`, 1536 dims)

## Tools

| Tool | Replaces |
|---|---|
| `learning_query` | `rag-query` |
| `learning_search_fts` | `pg_search_fts` in `pglearn.sh` |
| `learning_inject` | `learn-inject` |
| `learning_context` | `learn-context` |
| `learning_add` | `pg_insert_learning` |
| `learning_mark_applied` | `pg_mark_applied` |

## Env

```
PGHOST=postgres        # docker service name on infra
PGPORT=5432
PGUSER=agiliton
PGPASSWORD=...         # from vault ag.postgres.agiliton_password
PGDATABASE=agiliton
LITELLM_URL=https://llm.agiliton.cloud
LITELLM_API_KEY=...    # vault litellm.ai_coder_key or similar
EMBEDDING_MODEL=text-embedding-3-small
SENTRY_DSN=            # optional
```

## Local dev

```
npm install
LITELLM_API_KEY=$(vault get litellm.ai_coder_key) \
  PGHOST=localhost PGPORT=5433 \
  node src/http-server.js

curl -s http://127.0.0.1:9221/health
```

## Deploy

Built as `gitea.agiliton.internal:3000/christian/learning-mcp:latest` and attached to the infra docker-compose stack on the postgres network. Registered in `/opt/apps/litellm/config.yaml` as a streamable-HTTP MCP server (CF-3081 pattern).
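The tools listed above are invoked over the streamable HTTP transport at `/mcp` using standard MCP JSON-RPC `tools/call` messages. A minimal sketch of building such a request body — note that the argument names (`text`, `limit`) are assumptions for illustration, not documented here; the real schema comes from a `tools/list` call against the server:

```javascript
// Sketch: build an MCP JSON-RPC 2.0 "tools/call" request body.
// The argument names below (text, limit) are hypothetical; discover
// the actual input schema via the server's "tools/list" response.
function buildToolCall(id, tool, args) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// Example: a semantic query against learning_query (hypothetical args).
const body = buildToolCall(1, "learning_query", {
  text: "pgvector index tuning",
  limit: 5,
});

console.log(JSON.stringify(body));
```

Posting this as the body of `POST /mcp` (with `Accept: application/json, text/event-stream`, per the streamable HTTP transport) exercises the tool; the `/health` endpoint shown under Local dev can confirm the server is up first.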
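As a rough sketch of what "attached to the infra docker-compose stack on the postgres network" can look like — the service name, network name, and env wiring below are assumptions to be matched against the actual infra compose file, not a copy of it:

```yaml
# Hypothetical compose fragment for the infra stack.
# Service and network names are assumptions; align them with the
# real docker-compose file before use.
services:
  learning-mcp:
    image: gitea.agiliton.internal:3000/christian/learning-mcp:latest
    ports:
      - "9221:9221"
    environment:
      PGHOST: postgres       # resolves via the shared network
      PGPORT: "5432"
    networks:
      - postgres             # assumed name of the postgres network

networks:
  postgres:
    external: true           # defined by the existing infra stack
```

Secrets (`PGPASSWORD`, `LITELLM_API_KEY`) would be injected from the vault paths listed under Env rather than committed to the compose file.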