mksglu

context-mode

AI · MCP · LLM · Context Management · Agentic Workflow · SQLite

// summary

Context Mode is an MCP server designed to prevent context window exhaustion by offloading raw data into a sandboxed SQLite database. It tracks session events and uses BM25 search to retrieve only relevant information, ensuring the LLM maintains continuity during conversation compaction. Additionally, it encourages a code-first approach where agents write scripts to process data, significantly reducing token consumption.

// technical analysis

Context Mode is an MCP server designed to optimize LLM context windows by preventing the accumulation of raw data from tool calls. It employs a sandbox-first architecture that offloads data processing to SQLite, using FTS5 with BM25 ranking to retrieve only relevant information during conversation compaction. By shifting from treating the LLM as a data processor to treating it as a code generator, the project significantly reduces token consumption and maintains session continuity across AI development platforms.
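The offload-then-retrieve pattern described above can be sketched in a few lines of Python, assuming a SQLite build that includes FTS5 (most standard builds do). The table and column names here are illustrative, not Context Mode's actual schema:

```python
import sqlite3

# Raw tool output goes into an FTS5 virtual table instead of the
# context window; BM25-ranked retrieval pulls back only relevant rows.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE events USING fts5(kind, content)")

# Offload raw tool results instead of placing them in the prompt.
db.executemany(
    "INSERT INTO events (kind, content) VALUES (?, ?)",
    [
        ("file_edit", "refactored auth middleware in server.ts"),
        ("git", "committed fix for token refresh race condition"),
        ("task", "add retry logic to the HTTP client"),
    ],
)

# At compaction time, fetch only events matching the query, ranked by
# FTS5's built-in bm25() score (lower score means more relevant).
rows = db.execute(
    "SELECT kind, content FROM events "
    "WHERE events MATCH ? ORDER BY bm25(events) LIMIT 2",
    ("token",),
).fetchall()

for kind, content in rows:
    print(kind, content)
```

Only the matching rows ever re-enter the context window; the rest of the session history stays in the database.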

// key highlights

01
Reduces context usage by up to 98% by sandboxing raw tool data instead of dumping it directly into the context window.
02
Maintains session continuity by tracking file edits, git operations, and tasks in a SQLite database for intelligent retrieval.
03
Implements a 'Think in Code' philosophy where the LLM writes scripts to process data, replacing multiple tool calls with efficient code execution.
04
Provides a suite of utility commands like ctx-stats and ctx-insight to monitor token savings and analyze agent performance metrics.
05
Supports seamless integration across multiple platforms including Claude Code, Gemini CLI, VS Code Copilot, and Cursor via specialized hooks and routing configurations.
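The 'Think in Code' highlight (03) can be illustrated with a minimal sketch: rather than issuing one tool call per file, each of which dumps raw text into the context window, the agent emits a single script and only a small aggregate result returns. The file contents and counting logic here are hypothetical examples, not Context Mode's API:

```python
from collections import Counter

# Hypothetical raw data that would otherwise arrive via many tool calls.
files = {
    "server.ts": "import express\nconst app = express()\n",
    "auth.ts": "import jwt\nexport function verify() {}\n",
    "client.ts": "import fetch\nexport async function get() {}\n",
}

# One pass over all files inside the sandbox; only the summary leaves it.
imports = Counter()
for name, text in files.items():
    for line in text.splitlines():
        if line.startswith("import"):
            imports[line.split()[1]] += 1

summary = {"files_scanned": len(files), "imports": dict(imports)}
print(summary)
```

The LLM sees a few dozen tokens of summary instead of the full contents of every file it touched.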

// use cases

01
Context saving by offloading raw tool data to a sandboxed SQLite database
02
Session continuity through indexed event tracking and relevant retrieval
03
Think-in-code paradigm to replace multiple tool calls with efficient scripts

// getting started

To begin, install the package globally with 'npm install -g context-mode'. Then configure the MCP server and hooks for your platform by following the platform-specific instructions in the README (e.g., adding an entry to 'mcpServers' in your client's config file). Once installed, verify the setup by running 'ctx doctor' or 'ctx stats' within your agent's chat interface.
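For MCP clients that use a JSON config, an entry under 'mcpServers' typically looks like the sketch below. The "command" value is an assumption based on the global install above; check the README for your platform's exact command, arguments, and config file location:

```json
{
  "mcpServers": {
    "context-mode": {
      "command": "context-mode"
    }
  }
}
```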