Alishahryar1

free-claude-code

#AI #LLM #Claude #Proxy #Ollama #Automation
View on GitHub · 620 ★

// summary

Free Claude Code is a lightweight proxy that allows developers to use the Claude Code CLI and VSCode extension without an Anthropic API key. It routes requests to various providers including NVIDIA NIM, OpenRouter, DeepSeek, and local LLM runtimes like Ollama or LM Studio. The tool features per-model routing, request optimization, and support for advanced features like thinking tokens and structured tool parsing.

// technical analysis

Free Claude Code is a lightweight, transparent proxy designed to route Anthropic API requests from the Claude Code CLI or VSCode extension to various alternative LLM providers. By intercepting standard API calls, it enables users to bypass Anthropic API key requirements and utilize free or local models from sources like NVIDIA NIM, Ollama, and OpenRouter. The project prioritizes flexibility through per-model routing and request optimization, effectively trading off direct Anthropic integration for significant cost savings and local privacy control.
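The core of such a proxy is translating an Anthropic Messages API request into the format an alternative backend expects, using a per-model routing table. The sketch below illustrates that idea, assuming OpenAI-compatible backends; the routing table, model names, and URLs are illustrative assumptions, not the project's actual configuration.

```python
# Illustrative mapping of Claude model names to alternative backends
# (OpenRouter, a local Ollama server, etc.). These routes are examples,
# not the project's real config.
ROUTES = {
    "claude-sonnet-4": {
        "base_url": "https://openrouter.ai/api/v1",
        "model": "deepseek/deepseek-chat",
    },
    "claude-haiku-3-5": {
        "base_url": "http://localhost:11434/v1",  # local Ollama
        "model": "llama3.1",
    },
}

def translate_request(anthropic_body: dict) -> tuple[str, dict]:
    """Rewrite an Anthropic /v1/messages body into an OpenAI-style
    /chat/completions body for the backend its model is routed to."""
    route = ROUTES[anthropic_body["model"]]
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI-compatible APIs expect it as the first message.
    if "system" in anthropic_body:
        messages.append({"role": "system", "content": anthropic_body["system"]})
    messages.extend(anthropic_body["messages"])
    openai_body = {
        "model": route["model"],
        "messages": messages,
        "max_tokens": anthropic_body.get("max_tokens", 1024),
        "stream": anthropic_body.get("stream", False),
    }
    return route["base_url"] + "/chat/completions", openai_body
```

Because the translation is keyed on the incoming model name, different Claude models can be routed to different providers at the same time, which is the per-model mapping the project advertises.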

// key highlights

01
Provides a zero-cost alternative by routing requests to free tiers on NVIDIA NIM and OpenRouter or fully local models.
02
Acts as a drop-in replacement that requires no modifications to the original Claude Code CLI or VSCode extension.
03
Supports advanced request optimization by intercepting trivial API calls locally to reduce latency and save quota.
04
Includes a heuristic tool parser that automatically converts text-based model outputs into structured tool use.
05
Features a Discord and Telegram bot for remote, autonomous coding with session persistence and live progress tracking.
06
Enables per-model mapping, allowing users to route different Claude models to specific providers simultaneously.
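The heuristic tool parser in highlight 04 addresses models that cannot emit native tool calls and instead print a JSON object in their text output. A minimal sketch of that idea follows; the fenced-JSON convention and the `tool_use` dict shape are assumptions for illustration, not the project's exact parser.

```python
import json
import re

# Look for a fenced block containing a JSON object, e.g.
# ```json
# {"name": "read_file", "arguments": {"path": "main.py"}}
# ```
TOOL_CALL_RE = re.compile(r"```(?:json)?\s*(\{.*?\})\s*```", re.DOTALL)

def parse_tool_use(text: str):
    """Scan model output for a fenced JSON object with 'name' and
    'arguments' keys and convert it into a structured tool-use dict.
    Returns None when no tool call is found."""
    for match in TOOL_CALL_RE.finditer(text):
        try:
            obj = json.loads(match.group(1))
        except json.JSONDecodeError:
            continue  # not valid JSON, keep scanning
        if "name" in obj and "arguments" in obj:
            return {"type": "tool_use", "name": obj["name"], "input": obj["arguments"]}
    return None
```

A parser like this lets text-only models drive the same tool-execution path as models with native structured output.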

// use cases

01
Run the Claude Code CLI and VSCode extension for free using alternative LLM providers.
02
Route requests to local models via Ollama, LM Studio, or llama.cpp for privacy and offline usage.
03
Manage autonomous coding sessions remotely using integrated Discord or Telegram bot support.

// getting started

To begin, install the project with uv, or clone the repository, and configure your environment variables in a .env file. Start the proxy server with uvicorn, then point your Claude Code CLI or VSCode extension at the proxy's local address via the ANTHROPIC_BASE_URL environment variable. From there, use the provided claude-pick tool, or edit the configuration directly, to select your preferred LLM provider.
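The steps above can be sketched as a short command sequence. This is a hedged walkthrough, not the project's exact instructions: the uvicorn module path, port, and .env key name are illustrative assumptions, so check the repository's README for the real values.

```shell
# 1. Clone and install dependencies (uv is the suggested installer)
git clone https://github.com/Alishahryar1/free-claude-code
cd free-claude-code
uv sync

# 2. Configure a provider in .env (key name is illustrative)
cat > .env <<'EOF'
OPENROUTER_API_KEY=your-key-here
EOF

# 3. Start the proxy (the "server:app" module path is an assumption)
uvicorn server:app --port 8080

# 4. In another terminal, point Claude Code at the proxy
export ANTHROPIC_BASE_URL=http://localhost:8080
claude
```

Because only ANTHROPIC_BASE_URL changes, the CLI and VSCode extension themselves need no modification, which is the drop-in behavior described in the highlights.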