🤖 Cllama

Cllama is a Python Streamlit web UI for using the LLM models that are available locally in Ollama.

Features

  • Real-time weather information using tools.
  • Real-time weather alerts using tools (see the sketch below).
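
These features are built on Ollama's tool calling. As a rough sketch only (not the project's actual code), the snippet below shows how a weather tool could be wired up with the official ollama Python package; get_current_weather is a hypothetical stub standing in for a real weather API call:

```python
import ollama  # pip install ollama

# Hypothetical tool for illustration; the real app would query a weather API.
def get_current_weather(city: str) -> str:
    """Return the current weather for a city."""
    return f"Sunny, 21 °C in {city}"  # stubbed response

response = ollama.chat(
    model="granite3.3",  # any tool-capable model you have pulled
    messages=[{"role": "user", "content": "What's the weather in Cork?"}],
    tools=[get_current_weather],  # ollama >= 0.4 converts functions to schemas
)

# If the model decided to use the tool, run it and show the result.
for call in response.message.tool_calls or []:
    if call.function.name == "get_current_weather":
        print(get_current_weather(**call.function.arguments))
```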

Installation Guide

  1. Install Ollama locally:

    You can install it by following the official docs at https://ollama.com/download, or by running the provided install_ollama_models.sh bash script with these commands:
  • chmod +x install_ollama_models.sh
  • ./install_ollama_models.sh
  2. Download an LLM model in Ollama:

    Download any LLM model from the Ollama model library at https://ollama.com/search. For instance, to install IBM's Granite 3.3 model, use this command:
  • ollama run granite3.3

To see the LLM models already downloaded and available in your local Ollama, use this command:

  • ollama list
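
The same information is available from Python through the ollama client, which is likely how the web UI discovers local models. A minimal sketch (the response key may be "name" rather than "model" in older ollama-python releases):

```python
import ollama  # pip install ollama

# Programmatic equivalent of `ollama list` via the ollama Python client.
# Note: older ollama-python releases expose the key "name" instead of "model".
for m in ollama.list()["models"]:
    print(m["model"])
```
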
  3. Install the uv dependency manager for Python:

    Use this command to install uv with Python's pip:
  • pip install uv

Or use this command to install it with Homebrew (recommended on macOS):

  • brew install uv
  4. Install cllama dependencies using uv:

To sync all the dependencies for the cllama project, use:

  • uv sync
  5. Running cllama:

To run the cllama web UI, use this Streamlit command from the project's base directory:

  • streamlit run src/main.py
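
For orientation, a minimal Streamlit chat front end over Ollama looks roughly like the sketch below. This is an illustrative stand-in, not the project's actual src/main.py:

```python
import ollama
import streamlit as st

st.title("🤖 Cllama")

# Offer every model that the local Ollama reports as installed.
models = [m["model"] for m in ollama.list()["models"]]
model = st.selectbox("Model", models)

# Keep the conversation across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the chat history.
for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

if prompt := st.chat_input("Ask something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)
    reply = ollama.chat(model=model, messages=st.session_state.messages)
    answer = reply.message.content
    st.session_state.messages.append({"role": "assistant", "content": answer})
    st.chat_message("assistant").write(answer)
```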

Contribution Guidelines

a) Create a task in the project planning board (CPPB): https://github.com/users/joshanjohn/projects/17
b) Use the branch name the GitHub task suggests: the issue number followed by the task title, separated by hyphens (e.g. 4-task-title).
c) Submit a pull request (PR) before merging into the main branch.
d) Every PR must receive an approving review before it can be merged.

Lint

  1. In your terminal, cd into the project root
  2. Run: uv run ruff check
  3. To apply safe automatic fixes, run: uv run ruff check --fix
  4. To apply unsafe automatic fixes, run: uv run ruff check --fix --unsafe-fixes

You may need to run the formatter (uv run black .) again after applying Ruff's automatic fixes.

Format

  1. In your terminal, cd into the project root
  2. Run: uv run black .

All The Best 🙌🏻
