v0.1.0 — live on npm
All the docs.
None of the bloat.
MCP server that gives AI assistants only what matters. 20x fewer tokens.
Open source · Star on GitHub
```sh
# Add to Claude Code
claude mcp add context45 -- npx context45-mcp

# Then just ask
"use context45 for streaming docs"

# Result: ~146 tokens, exactly what you need
```
Done.
Then add "use context45" to your prompts to fetch relevant docs.
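For MCP clients without a one-line installer, a config sketch assuming the standard `mcpServers` JSON shape used by Claude Desktop and similar clients (the server name and args mirror the npm install command above):

```json
{
  "mcpServers": {
    "context45": {
      "command": "npx",
      "args": ["context45-mcp"]
    }
  }
}
```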
- **20x** fewer tokens
- **300** tokens per query
- **6 KB** package size
Available libraries
| Library | Source | Sections |
|---|---|---|
| Claude API | platform.claude.com | 79 |
| OpenAI API | developers.openai.com | 67 |
How it works
1. **Curated.** Hand-compressed docs covering 95% of use cases. No tutorials, no fluff.
2. **Indexed.** Semantic search over vector embeddings finds the right chunk.
3. **Served.** Returns 2-3 chunks via MCP. Just the answer, in 100-300 tokens.
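The indexing and serving steps can be sketched as a small retrieval loop: embed the query, rank doc chunks by cosine similarity, and return the top few. This is an illustrative sketch, not the published context45 internals; the `Chunk` type, `cosine`, and `topChunks` names are assumptions.

```typescript
// Hypothetical retrieval pipeline: rank pre-embedded doc chunks
// against a query embedding and serve the top k.

type Chunk = { id: string; text: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topChunks(query: number[], index: Chunk[], k = 3): Chunk[] {
  return [...index]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```

In practice the query embedding would come from an embedding model at request time, and only the text of the top 2-3 chunks would be sent back over MCP, which is what keeps responses in the 100-300 token range.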