Local AI Agent with Bun

A complete AI agent system that uses local models via Ollama instead of relying on cloud APIs.

Features

  • 🧠 Reasoning framework for solving complex problems step-by-step
  • 🛠️ Tool integration for calculations, weather info, and more
  • 💾 Memory management for conversation context and knowledge storage
  • 📋 Planning and execution capabilities for breaking down complex tasks
  • 📚 Knowledge grounding for retrieving relevant information
  • 🏠 Runs completely locally using open-source models

Prerequisites

  • Bun for running JavaScript/TypeScript
  • Ollama for local model inference

Installation

  1. Clone this repository:

    git clone https://github.com/yourusername/claude-local-agent.git
    cd claude-local-agent
  2. Install dependencies:

    bun install
  3. Install Ollama:

    • Mac: Download from ollama.com
    • Linux: curl -fsSL https://ollama.com/install.sh | sh
    • Windows: Download from ollama.com
  4. Pull a model:

    ollama pull llama3
  5. Configure environment variables in a .env file:

    LOCAL_MODEL=llama3
    OLLAMA_API_URL=http://localhost:11434
    PORT=3000
    
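Bun loads .env files automatically, so these variables are available on process.env at startup. A minimal sketch of reading them with fallback defaults — the Config shape and loadConfig name are illustrative, not the repository's actual code:

```typescript
// Hypothetical config loader: the variable names match the .env above,
// but this Config type is an illustration, not the repo's actual code.
interface Config {
  model: string;
  ollamaUrl: string;
  port: number;
}

export function loadConfig(
  env: Record<string, string | undefined> = process.env
): Config {
  return {
    model: env.LOCAL_MODEL ?? "llama3",
    ollamaUrl: env.OLLAMA_API_URL ?? "http://localhost:11434",
    port: Number(env.PORT ?? 3000),
  };
}
```

Passing the environment in as a parameter keeps the loader easy to test without mutating process.env.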

Usage

CLI Interface

bun cli

Web Interface

bun dev

Then open your browser to http://localhost:3000 (or the PORT set in your .env file).
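Under the hood, both interfaces talk to Ollama's HTTP API. A sketch of a non-streaming call to the /api/generate endpoint — the function names here are illustrative (the repository's own wrapper lives in src/ai/localModel.ts):

```typescript
// Sketch of a non-streaming request to Ollama's /api/generate endpoint.
const OLLAMA_API_URL = process.env.OLLAMA_API_URL ?? "http://localhost:11434";

// Pure helper that builds the request body.
export function buildGenerateRequest(model: string, prompt: string) {
  // stream: false returns one JSON object instead of NDJSON chunks
  return { model, prompt, stream: false };
}

export async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_API_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama error: ${res.status}`);
  const data = await res.json();
  return data.response; // the model's completion text
}
```

With stream: true (Ollama's default), the endpoint instead emits one JSON chunk per token, which is what a responsive chat UI would consume.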

Available Models

Ollama supports many open-source models. Some popular ones:

  • llama3 - Meta's Llama 3 (8B parameters)
  • llama3:70b - Larger Llama 3 model (70B parameters)
  • mistral - Mistral 7B
  • phi3 - Microsoft's Phi-3 model
  • gemma - Google's Gemma model

List the models you have pulled locally:

ollama list

Creating Custom Models

Create a Modelfile:

FROM llama3

# Set parameters for better reasoning
PARAMETER temperature 0.2
PARAMETER top_p 0.9

# System prompt
SYSTEM You are an AI assistant designed for accurate reasoning and problem-solving.

Build the custom model:

ollama create reasoning-llama -f ./Modelfile

Use it by setting LOCAL_MODEL=reasoning-llama in your .env file.
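As an alternative to baking PARAMETER values into a Modelfile, Ollama's /api/generate endpoint also accepts per-request overrides in an options field. A sketch — the GenOptions type and withOptions name are illustrative:

```typescript
// Per-request parameter overrides via the "options" field of Ollama's
// /api/generate request body. GenOptions lists only the two parameters
// used in the Modelfile above; Ollama accepts more.
interface GenOptions {
  temperature?: number;
  top_p?: number;
}

export function withOptions(model: string, prompt: string, options: GenOptions) {
  return { model, prompt, stream: false, options };
}

// e.g. withOptions("llama3", "Solve step by step: ...",
//                  { temperature: 0.2, top_p: 0.9 })
```

This keeps a single base model while letting the agent tighten sampling for reasoning-heavy prompts.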

Project Structure

  • src/ai/localModel.ts - Interface to local Ollama models
  • src/core/agent.ts - Main agent implementation
  • src/core/reasoner.ts - Reasoning framework
  • src/tools/toolManager.ts - Tool integration
  • src/memory/memoryManager.ts - Conversation and knowledge storage
  • src/knowledge/knowledgeManager.ts - Knowledge retrieval
  • src/cli.ts - Command-line interface
  • src/index.ts - Web server
  • public/index.html - Web interface
