Convert and quantize LLM models
Updated Dec 30, 2025 · Python
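The topic covers converting models to formats such as GGUF and quantizing their weights to lower precision. As a minimal illustration of the core idea behind weight quantization (a generic sketch, not any listed tool's actual implementation; the function names are hypothetical), here is a symmetric 8-bit round trip in pure Python:

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats onto the range [-127, 127]."""
    # Scale by the largest magnitude; fall back to 1.0 for an all-zero tensor.
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 1.0, -0.98]
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)
# Quantization is lossy: restored values are close to, but not identical
# with, the originals. The worst-case error here is bounded by scale / 2.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

Real GGUF quantization schemes (e.g. the K-quant families in llama.cpp) are block-wise and considerably more elaborate, but they rest on this same scale-and-round principle.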
Modal × ComfyUI: generate AI images and videos on ComfyUI cloud GPUs using Modal's $30 of free monthly credits. No local hardware needed; setup takes about 30 seconds on H100-class GPUs.
Emotica AI is a compassionate and therapeutic virtual assistant designed to provide empathetic and supportive conversations. It integrates a local LLaMA model for text generation, a vision model for image captioning, a RAG system for information retrieval, and emotion detection to tailor its responses.
Nectar-X-Studio is a powerful, local AI-inferencing application that lets users download, create, and run agents and large language models on their own machine. With no internet connection required, Nectar ensures privacy-first, high-performance inference using cutting-edge open-source models from Hugging Face, Ollama, and beyond.
AI tool to help users research using local LLMs and automated web search.
Containerized LLM for any use case, big or small
A simple Gradio app for local translation using the GGUF versions of MADLAD-400