Add a local RAG skill using Ollama for document Q&A #237

@twishapatel12

Description

Feature Proposal

I’d like to propose adding a new local RAG (Retrieval-Augmented Generation) skill to OpenWork that uses Ollama as the LLM backend for document-based question answering.

Motivation

A common use case for OpenWork is experimenting with workflows locally and privately. Many users want to:

  • Run RAG pipelines fully locally
  • Avoid reliance on external APIs
  • Query private documents securely

Ollama makes local LLM usage simple and accessible, which feels aligned with OpenWork’s workflow- and skill-based approach.

Proposed Addition

Add a new skill folder, tentatively named:

rag-workflows/

This skill would include:

  • SKILL.md describing the capability, requirements, and usage
  • A prompt file for document Q&A
  • One or more example workflows demonstrating local document retrieval + Q&A using Ollama
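To make the structure concrete, here is a hypothetical SKILL.md skeleton. Every heading, field, and value below is illustrative and would be adjusted to match the repository's actual skill conventions:

```markdown
# RAG Workflows (Local Document Q&A)

Answers questions over local documents using retrieval-augmented
generation, with Ollama as the LLM backend. Runs fully locally;
no external API calls.

## Requirements
- Ollama installed and running locally (default: http://localhost:11434)
- At least one pulled model (e.g. `ollama pull llama3`)

## Usage
1. Place the documents to query in the workflow's input directory.
2. Run the document Q&A workflow with your question.
3. The answer is generated from retrieved document chunks only.
```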

Scope (Initial Version)

  • Local document Q&A only
  • Ollama as the LLM provider
  • Minimal, focused implementation
  • No breaking changes to existing workflows
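Scoped this way, the core loop is small enough to sketch. The following is a minimal, illustrative sketch, not the proposed implementation: retrieval here is plain bag-of-words cosine similarity using only the standard library, and generation posts to Ollama's local `/api/generate` endpoint. The model name `llama3`, the prompt wording, and the chunk size are all assumptions; a real version would likely use Ollama's embedding endpoint for retrieval instead of word overlap.

```python
# Sketch of the proposed local document Q&A flow (illustrative only).
import json
import math
import re
import urllib.request
from collections import Counter


def tokenize(text: str) -> Counter:
    """Lowercase bag-of-words counts."""
    return Counter(re.findall(r"\w+", text.lower()))


def chunk(text: str, size: int = 10) -> list[str]:
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bags of words."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
        math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(question: str, chunks: list[str], k: int = 1) -> list[str]:
    """Return the k chunks most similar to the question."""
    q = tokenize(question)
    return sorted(chunks, key=lambda c: cosine(q, tokenize(c)), reverse=True)[:k]


def ask_ollama(question: str, context: str, model: str = "llama3") -> str:
    """Send retrieved context plus the question to a locally running Ollama server."""
    payload = {
        "model": model,  # model name is an assumption; any pulled model works
        "prompt": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
        "stream": False,
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


docs = chunk(
    "OpenWork skills are folders containing a SKILL.md file. "
    "Ollama serves local models over an HTTP API on port 11434."
)
top = retrieve("What port does Ollama listen on?", docs)
# ask_ollama("What port does Ollama listen on?", top[0]) would then generate
# the answer; that step needs a running Ollama server, so it is not called here.
```

Keeping retrieval dependency-free like this means the skill's only hard requirement is a running Ollama instance for the generation step.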

The goal is to keep this small and consistent with existing skill conventions.

Implementation

If this proposal sounds reasonable, I’m happy to implement it and submit a focused PR following the current repository structure and conventions.

Before starting, I wanted to confirm:

  • Whether Ollama is an acceptable dependency for a skill
  • Whether the proposed structure/naming makes sense

Feedback welcome — happy to adjust the approach.
