A complete AI agent system that uses local models via Ollama instead of relying on cloud APIs.
- 🧠 Reasoning framework for solving complex problems step-by-step
- 🛠️ Tool integration for calculations, weather info, and more
- 💾 Memory management for conversation context and knowledge storage
- 📋 Planning and execution capabilities for breaking down complex tasks
- 📚 Knowledge grounding for retrieving relevant information
- 🏠 Runs completely locally using open-source models
1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/claude-local-agent.git
   cd claude-local-agent
   ```

2. Install dependencies:

   ```bash
   bun install
   ```

3. Install Ollama:

   - Mac: Download from [ollama.com](https://ollama.com)
   - Linux:

     ```bash
     curl -fsSL https://ollama.com/install.sh | sh
     ```

   - Windows: Download from [ollama.com](https://ollama.com)

4. Pull a model:

   ```bash
   ollama pull llama3
   ```

5. Configure environment variables in a `.env` file:

   ```env
   LOCAL_MODEL=llama3
   OLLAMA_API_URL=http://localhost:11434
   PORT=3000
   ```

Run the agent from the command line:

```bash
bun cli
```

Or start the web server:

```bash
bun dev
```

Then open your browser to http://localhost:3000.
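
To sanity-check the setup, you can query Ollama's `/api/generate` endpoint directly before involving the agent. The following is a minimal sketch, assuming the `.env` values above; the file name `check-ollama.ts` is just an example and is not part of the project:

```typescript
// check-ollama.ts (hypothetical file name); run with: bun check-ollama.ts
// Bun loads .env automatically, so these values come from the file configured above.
const baseUrl = process.env.OLLAMA_API_URL ?? "http://localhost:11434";
const model = process.env.LOCAL_MODEL ?? "llama3";

// POST /api/generate is Ollama's completion endpoint;
// stream: false returns a single JSON object instead of a stream.
const res = await fetch(`${baseUrl}/api/generate`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ model, prompt: "Reply with one word: ready", stream: false }),
});

if (!res.ok) {
  throw new Error(`Ollama returned ${res.status}: ${await res.text()}`);
}

const data = (await res.json()) as { response: string };
console.log(data.response.trim());
```

If this prints a response, the agent's connection to the local model should work as well.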
Ollama supports many open-source models. Some popular ones:
- `llama3` - Meta's Llama 3 (8B parameters)
- `llama3:70b` - Larger Llama 3 model (70B parameters)
- `mistral` - Mistral 7B
- `phi3` - Microsoft's Phi-3 model
- `gemma` - Google's Gemma model
List the models you have pulled locally:

```bash
ollama list
```

To customize a model for the agent, create a `Modelfile`:
```
FROM llama3

# Set parameters for better reasoning
PARAMETER temperature 0.2
PARAMETER top_p 0.9

# System prompt
SYSTEM """You are an AI assistant designed for accurate reasoning and problem-solving."""
```
Build the custom model:

```bash
ollama create reasoning-llama -f ./Modelfile
```

Use it by setting `LOCAL_MODEL=reasoning-llama` in `.env`.
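
As an alternative to baking `temperature` and `top_p` into a Modelfile, Ollama's generate API accepts the same settings per request through its `options` field. A brief sketch (the endpoint and `options` field are Ollama's API; the surrounding code is illustrative, not taken from this project):

```typescript
// Illustrative sketch: the same sampling settings as the Modelfile above,
// passed per request via Ollama's `options` field.
const res = await fetch(`${process.env.OLLAMA_API_URL ?? "http://localhost:11434"}/api/generate`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: process.env.LOCAL_MODEL ?? "llama3",
    prompt: "Walk through 24 * 17 step by step.",
    stream: false,
    options: { temperature: 0.2, top_p: 0.9 }, // overrides model defaults for this call only
  }),
});
const { response } = (await res.json()) as { response: string };
console.log(response);
```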
Project structure:

- `src/ai/localModel.ts` - Interface to local Ollama models
- `src/core/agent.ts` - Main agent implementation
- `src/core/reasoner.ts` - Reasoning framework
- `src/tools/toolManager.ts` - Tool integration
- `src/memory/memoryManager.ts` - Conversation and knowledge storage
- `src/knowledge/knowledgeManager.ts` - Knowledge retrieval
- `src/cli.ts` - Command-line interface
- `src/index.ts` - Web server
- `public/index.html` - Web interface
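
For a sense of how the tool layer fits together, here is a hypothetical sketch of what a tool definition might look like. The `Tool` interface and the `calculator` example are assumptions for illustration; the actual API lives in `src/tools/toolManager.ts` and may differ:

```typescript
// Hypothetical sketch: the real interface in src/tools/toolManager.ts may differ.
interface Tool {
  name: string;
  description: string; // shown to the model so it knows when to call the tool
  run(input: string): Promise<string>;
}

// A calculator tool of the kind the feature list mentions.
const calculator: Tool = {
  name: "calculator",
  description: "Evaluate a basic arithmetic expression, e.g. '2 + 3 * 4'.",
  async run(input: string): Promise<string> {
    // Reject anything that is not a simple arithmetic expression before evaluating.
    if (!/^[\d\s+\-*/().]+$/.test(input)) {
      throw new Error(`Unsupported expression: ${input}`);
    }
    return String(Function(`"use strict"; return (${input});`)());
  },
};

console.log(await calculator.run("2 + 3 * 4")); // "14"
```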