
Lynkr - Claude Code Proxy with Multi-Provider Support


Production-ready Claude Code proxy supporting 9+ LLM providers with 60-80% cost reduction through token optimization.


Overview

Lynkr is a self-hosted proxy server that unlocks Claude Code CLI, Cursor IDE, and Codex CLI by enabling:

  • πŸš€ Any LLM Provider - Databricks, AWS Bedrock (100+ models), OpenRouter (100+ models), Ollama (local), llama.cpp, Azure OpenAI, Azure Anthropic, OpenAI, LM Studio
  • πŸ’° 60-80% Cost Reduction - Built-in token optimization with smart tool selection, prompt caching, and memory deduplication
  • πŸ”’ 100% Local/Private - Run completely offline with Ollama or llama.cpp
  • 🎯 Zero Code Changes - Drop-in replacement for Anthropic's backend
  • 🏒 Enterprise-Ready - Circuit breakers, load shedding, Prometheus metrics, health checks

Perfect for:

  • Developers who want provider flexibility and cost control
  • Enterprises needing self-hosted AI with observability
  • Privacy-focused teams requiring local model execution
  • Teams seeking 60-80% cost reduction through optimization

Quick Start

Installation

Option 1: NPM Package (Recommended)

# Install globally
npm install -g lynkr

# Or run directly with npx
npx lynkr

Option 2: Git Clone

# Clone repository
git clone https://github.com/vishalveerareddy123/Lynkr.git
cd Lynkr

# Install dependencies
npm install

# Create .env from example
cp .env.example .env

# Edit .env with your provider credentials
nano .env

# Start server
npm start

Option 3: Docker

# From a cloned repository with .env configured
docker-compose up -d

Supported Providers

Lynkr supports 9+ LLM providers:

| Provider | Type | Models | Cost | Privacy |
|---|---|---|---|---|
| AWS Bedrock | Cloud | 100+ (Claude, Titan, Llama, Mistral, etc.) | $$-$$$ | Cloud |
| Databricks | Cloud | Claude Sonnet 4.5, Opus 4.5 | $$$ | Cloud |
| OpenRouter | Cloud | 100+ (GPT, Claude, Llama, Gemini, etc.) | $-$$ | Cloud |
| Ollama | Local | Unlimited (free, offline) | FREE | 🔒 100% Local |
| llama.cpp | Local | GGUF models | FREE | 🔒 100% Local |
| Azure OpenAI | Cloud | GPT-4o, GPT-5, o1, o3 | $$$ | Cloud |
| Azure Anthropic | Cloud | Claude models | $$$ | Cloud |
| OpenAI | Cloud | GPT-4o, o1, o3 | $$$ | Cloud |
| LM Studio | Local | Local models with GUI | FREE | 🔒 100% Local |

πŸ“– Full Provider Configuration Guide


Claude Code Integration

Configure Claude Code CLI to use Lynkr:

# Set Lynkr as backend
export ANTHROPIC_BASE_URL=http://localhost:8081
export ANTHROPIC_API_KEY=dummy

# Run Claude Code
claude "Your prompt here"

That's it! Claude Code now uses your configured provider.
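Under the hood, Claude Code speaks the Anthropic Messages format to whatever ANTHROPIC_BASE_URL points at, so you can confirm the proxy answers before wiring up the CLI. A minimal sketch, assuming Lynkr is running on its default port 8081; the model name is a placeholder for whatever your configured provider serves:

```shell
# Build the same Anthropic-style Messages request Claude Code would send.
BODY='{"model":"claude-3-5-sonnet","max_tokens":64,"messages":[{"role":"user","content":"ping"}]}'

# POST it to the proxy. The x-api-key value is a dummy, matching the
# ANTHROPIC_API_KEY above; the anthropic-version header follows the
# upstream API convention and may or may not be required by the proxy.
curl -s http://localhost:8081/v1/messages \
  -H "content-type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -H "x-api-key: dummy" \
  -d "$BODY" || echo "Lynkr not reachable on :8081"
```

If the proxy is up, you should get back a Messages-format response from whichever provider you configured.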

πŸ“– Detailed Claude Code Setup


Cursor Integration

Configure Cursor IDE to use Lynkr:

  1. Open Cursor Settings

    • Mac: Cmd+, | Windows/Linux: Ctrl+,
    • Navigate to: Features β†’ Models
  2. Configure OpenAI API Settings

    • API Key: sk-lynkr (any non-empty value)
    • Base URL: http://localhost:8081/v1
    • Model: claude-3.5-sonnet (or your provider's model)
  3. Test It

    • Chat: Cmd+L / Ctrl+L
    • Inline edits: Cmd+K / Ctrl+K
    • @Codebase search: Requires embeddings setup
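The same Base URL can be exercised from the command line before pointing Cursor at it. A minimal sketch, assuming Lynkr is on its default port; the model name should match whatever your provider serves:

```shell
# Build an OpenAI-style chat request like the ones Cursor will send.
REQ='{"model":"claude-3.5-sonnet","messages":[{"role":"user","content":"hello"}]}'

# POST it to Lynkr's OpenAI-compatible endpoint; any non-empty key
# works, matching the sk-lynkr placeholder from the steps above.
curl -s http://localhost:8081/v1/chat/completions \
  -H "Authorization: Bearer sk-lynkr" \
  -H "Content-Type: application/json" \
  -d "$REQ" || echo "Lynkr not reachable on :8081"
```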

Codex CLI with Lynkr

Configure Codex CLI to use Lynkr:

Option 1: Environment Variable (simplest)

export OPENAI_BASE_URL=http://localhost:8081/v1
export OPENAI_API_KEY=dummy
codex

Option 2: Config File (~/.codex/config.toml)

model_provider = "lynkr"

[model_providers.lynkr]
name = "Lynkr Proxy"
base_url = "http://localhost:8081/v1"
env_key = "OPENAI_API_KEY"

Lynkr also supports Cline, Continue.dev, and other OpenAI-compatible tools.


Documentation

Getting Started

IDE Integration

Features & Capabilities

Deployment & Operations

Support


External Resources


Key Features Highlights

  • βœ… Multi-Provider Support - 9+ providers including local (Ollama, llama.cpp) and cloud (Bedrock, Databricks, OpenRouter)
  • βœ… 60-80% Cost Reduction - Token optimization with smart tool selection, prompt caching, memory deduplication
  • βœ… 100% Local Option - Run completely offline with Ollama/llama.cpp (zero cloud dependencies)
  • βœ… OpenAI Compatible - Works with Cursor IDE, Continue.dev, and any OpenAI-compatible client
  • βœ… Embeddings Support - 4 options for @Codebase search: Ollama (local), llama.cpp (local), OpenRouter, OpenAI
  • βœ… MCP Integration - Automatic Model Context Protocol server discovery and orchestration
  • βœ… Enterprise Features - Circuit breakers, load shedding, Prometheus metrics, K8s health checks
  • βœ… Streaming Support - Real-time token streaming for all providers
  • βœ… Memory System - Titans-inspired long-term memory with surprise-based filtering
  • βœ… Tool Calling - Full tool support with server and passthrough execution modes
  • βœ… Production Ready - Battle-tested with 400+ tests, observability, and error resilience

Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Claude Code CLI β”‚  or  Cursor IDE
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚ Anthropic/OpenAI Format
         ↓
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Lynkr Proxy    β”‚
β”‚  Port: 8081     β”‚
β”‚                 β”‚
β”‚ β€’ Format Conv.  β”‚
β”‚ β€’ Token Optim.  β”‚
β”‚ β€’ Provider Routeβ”‚
β”‚ β€’ Tool Calling  β”‚
β”‚ β€’ Caching       β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β”œβ”€β”€β†’ Databricks (Claude 4.5)
         β”œβ”€β”€β†’ AWS Bedrock (100+ models)
         β”œβ”€β”€β†’ OpenRouter (100+ models)
         β”œβ”€β”€β†’ Ollama (local, free)
         β”œβ”€β”€β†’ llama.cpp (local, free)
         β”œβ”€β”€β†’ Azure OpenAI (GPT-4o, o1)
         β”œβ”€β”€β†’ OpenAI (GPT-4o, o3)
         └──→ Azure Anthropic (Claude)

πŸ“– Detailed Architecture


Quick Configuration Examples

100% Local (FREE)

export MODEL_PROVIDER=ollama
export OLLAMA_MODEL=qwen2.5-coder:latest
export OLLAMA_EMBEDDINGS_MODEL=nomic-embed-text
npm start
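The Ollama models referenced above must exist locally before Lynkr can route to them. A sketch using the standard `ollama pull` command, with the model names taken from the config above:

```shell
# Model names from the env vars above (adjust to taste).
CHAT_MODEL=qwen2.5-coder:latest
EMBED_MODEL=nomic-embed-text

# Fetch the chat and embeddings models; skips gracefully if the
# ollama CLI is not installed on this machine.
ollama pull "$CHAT_MODEL" || echo "ollama not installed"
ollama pull "$EMBED_MODEL" || echo "ollama not installed"
```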

AWS Bedrock (100+ models)

export MODEL_PROVIDER=bedrock
export AWS_BEDROCK_API_KEY=your-key
export AWS_BEDROCK_MODEL_ID=anthropic.claude-3-5-sonnet-20241022-v2:0
npm start

OpenRouter (simplest cloud)

export MODEL_PROVIDER=openrouter
export OPENROUTER_API_KEY=sk-or-v1-your-key
npm start
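Whichever provider you configure, a quick smoke test confirms the proxy is serving before you attach a client. This sketch assumes Lynkr's default port 8081 and that it exposes the standard OpenAI-compatible `/v1/models` listing route; adjust if your version differs:

```shell
# Default Lynkr address from the docs above.
LYNKR_URL="http://localhost:8081"

# List models through the OpenAI-compatible surface; any non-empty
# key works. Falls through with a note if the proxy is not running.
curl -s -H "Authorization: Bearer dummy" "$LYNKR_URL/v1/models" \
  || echo "Lynkr is not listening on $LYNKR_URL"
```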

πŸ“– More Examples


Contributing

We welcome contributions! Please see the contributing guidelines in the repository.


License

Apache 2.0 - See LICENSE file for details.


Community & Support


Made with ❀️ by developers, for developers.