# sqlite-slim

SQLite MCP server optimized for AI assistants: reduce context-window token usage by 32.4% while keeping full functionality. Compatible with Claude, ChatGPT, Gemini, Cursor, and all MCP clients.


## What is sqlite-slim?

A token-optimized version of the SQLite Model Context Protocol (MCP) server.

### The Problem

MCP tool schemas consume significant context window tokens. When AI assistants like Claude or ChatGPT load MCP tools, each tool definition takes up valuable context space.

The original `mcp-server-sqlite` loads 6 tools that consume roughly 3,685 tokens, space you could otherwise use for actual conversation.
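To see where those tokens go, here is an abbreviated sketch of what a single MCP tool definition looks like when it is loaded into the model's context. The field names follow the MCP tool-listing format; the description and schema text shown here are illustrative, not the server's exact wording.

```ts
// Illustrative only: an abbreviated tool definition in the shape MCP servers
// advertise (name, description, JSON Schema input). The real read_query
// description and schema are longer, which is where the tokens go.
const readQueryTool = {
  name: "read_query",
  description: "Execute a SELECT query on the SQLite database and return the results.",
  inputSchema: {
    type: "object",
    properties: {
      query: { type: "string", description: "The SELECT SQL query to execute" },
    },
    required: ["query"],
  },
};

// Every definition like this is serialized into the model's context,
// so six of them add up quickly.
console.log(JSON.stringify(readQueryTool).length, "characters for one tool");
```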

### The Solution

sqlite-slim intelligently groups 6 tools into 4 semantic operations, reducing token usage by 32.4% — with zero functionality loss.

Your AI assistant sees fewer, smarter tools. Every original capability remains available.
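As a rough illustration of what the assistant sees, assuming the grouped `query` tool routes read and write queries through an `action` parameter (the exact action and parameter names here are assumptions, not copied from the package's schema):

```ts
// Hypothetical before/after of the same request; the grouping idea is the
// point, not the exact parameter names.

// Original mcp-server-sqlite: one dedicated tool per capability.
const originalCall = {
  name: "read_query",
  arguments: { query: "SELECT * FROM users LIMIT 10" },
};

// sqlite-slim: one grouped `query` tool, capability selected via `action`.
const slimCall = {
  name: "query",
  arguments: { action: "read", query: "SELECT * FROM users LIMIT 10" },
};
```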

## Performance

| Metric | Original | Slim | Reduction |
|--------|----------|------|-----------|
| Tools | 6 | 4 | 33% |
| Schema tokens | 265 | 211 | 20.4% |
| Claude Code (est.) | ~3,685 | ~2,491 | ~32.4% |
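If you want to sanity-check the tool count yourself, a minimal sketch using the official `@modelcontextprotocol/sdk` TypeScript client can list the tools the slim server advertises and give a rough size estimate. The chars/4 heuristic is only a crude approximation and is not the benchmark methodology behind the table above.

```ts
// Rough sanity check: connect to sqlite-slim over stdio, list its tools,
// and approximate the serialized schema size.
// Assumes @modelcontextprotocol/sdk is installed.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "sqlite-slim@latest"],
  });

  const client = new Client(
    { name: "schema-inspector", version: "0.0.1" },
    { capabilities: {} }
  );
  await client.connect(transport);

  const { tools } = await client.listTools();
  const serialized = JSON.stringify(tools);

  console.log(`tools advertised: ${tools.length}`);
  // ~4 characters per token is a crude heuristic, not a real tokenizer.
  console.log(`approx. schema tokens: ${Math.round(serialized.length / 4)}`);

  await client.close();
}

main().catch(console.error);
```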


## Quick Start

```bash
# Claude Desktop - auto-configure
npx sqlite-slim --setup claude

# Cursor - auto-configure
npx sqlite-slim --setup cursor

# Interactive mode (choose your client)
npx sqlite-slim --setup
```

Done! Restart your app to use sqlite.

## CLI Tools (already have a CLI?)

```bash
# Claude Code (creates .mcp.json in project root)
claude mcp add sqlite -s project -- npx -y sqlite-slim@latest

# Windows: use cmd /c wrapper
claude mcp add sqlite -s project -- cmd /c npx -y sqlite-slim@latest

# VS Code (Copilot, Cline, Roo Code)
code --add-mcp '{"name":"sqlite","command":"npx","args":["-y","sqlite-slim@latest"]}'
```

## Manual Setup

<details>
<summary>Click to expand manual configuration options</summary>

### Claude Desktop

Add to your `claude_desktop_config.json`:

| OS | Path |
|----|------|
| Windows | `%APPDATA%\Claude\claude_desktop_config.json` |
| macOS | `~/Library/Application Support/Claude/claude_desktop_config.json` |

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "npx",
      "args": ["-y", "sqlite-slim@latest"]
    }
  }
}
```

### Cursor

Add to `~/.cursor/mcp.json` (global) or `.cursor/mcp.json` (project):

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "npx",
      "args": ["-y", "sqlite-slim@latest"]
    }
  }
}
```

</details>

## How It Works

MCPSlim acts as a **transparent bridge** between AI models and the original MCP server:

```
Without MCPSlim

  [AI Model] ──── reads 6 tool schemas ────→ [Original MCP]
             (~3,685 tokens loaded into context)

With MCPSlim

  [AI Model] ───→ [MCPSlim Bridge] ───→ [Original MCP]
      │                  │                    │
  Sees 4 grouped   Translates to       Executes actual
  tools only       original call       tool & returns
  (~2,491 tokens)
```

### How Translation Works

1. **AI reads the slim schema** — only 4 grouped tools instead of 6
2. **AI calls a grouped tool** — e.g., `query({ action: "read", ... })`
3. **MCPSlim translates** — converts it to the original call: `read_query({ ... })`
4. **Original MCP executes** — the real server processes the request
5. **Response returned** — the result passes back unchanged

**Zero functionality loss. 32.4% token savings.**

## Available Tool Groups

| Group | Actions |
|-------|---------|
| `query` | 2 |
| `table` | 2 |

Plus **2 passthrough tools** — tools that don't group well are kept as-is with optimized descriptions.

## Compatibility

- ✅ **Full functionality** — all original `mcp-server-sqlite` features preserved
- ✅ **All AI assistants** — works with Claude, ChatGPT, Gemini, Copilot, and any MCP client
- ✅ **Drop-in replacement** — same capabilities, just use grouped action names
- ✅ **Tested** — schema compatibility verified via automated tests

## FAQ

### Does this reduce functionality?

**No.** Every original tool is accessible. Tools are grouped semantically (e.g., `read_query`, `write_query` → `query`), but all actions remain available via the `action` parameter.

### Why do AI assistants need token optimization?

AI models have limited context windows. MCP tool schemas consume tokens that could be used for conversation, code, or documents. Reducing tool schema size means more room for actual work.

### Is this officially supported?

MCPSlim is a community project. It wraps official MCP servers transparently — the original server does all the real work.

## License

MIT

---

Powered by MCPSlim — MCP Token Optimizer
Reduce AI context usage. Keep full functionality.