
structquery


structquery is a lightweight Python package that converts unstructured text queries into highly structured, search‑optimized outputs. By leveraging the llmatch‑messages utility and a default LLM7 model, the package extracts key elements from user input (e.g., location, difficulty, length) and returns them in a format that is ready for downstream search engines or information‑retrieval pipelines.


✨ Features

  • One‑line transformation from free‑form text to a list of structured fields.
  • Built‑in LLM7 support (ChatLLM7 from langchain_llm7) with automatic API‑key handling.
  • Pluggable LLMs – you can swap in any LangChain‑compatible chat model (OpenAI, Anthropic, Google, etc.).
  • Regex‑driven output validation via llmatch to guarantee format correctness.
  • Zero‑config defaults – works out‑of‑the‑box for common use‑cases.
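The regex-driven validation mentioned above can be illustrated with plain `re`. This is a minimal sketch of the idea, not structquery's actual pattern: the `<item>` tags and the sample output string below are assumptions for illustration only.

```python
import re

# Minimal sketch of regex-driven extraction in the spirit of llmatch.
# The <item> tag format and the sample LLM output are illustrative
# assumptions; structquery's real pattern may differ.
pattern = re.compile(r"<item>(.*?)</item>", re.DOTALL)
llm_output = "<item>location: Colorado</item><item>activity: hiking</item>"
fields = pattern.findall(llm_output)
print(fields)  # ['location: Colorado', 'activity: hiking']
```

Anchoring the expected fields in a pattern like this lets the caller reject malformed model output before it reaches a downstream search pipeline.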

📦 Installation

pip install structquery

🚀 Quick Start

from structquery import structquery

# Simple call – uses the default ChatLLM7 model
response = structquery(
    user_input="best hiking trails in Colorado"
)

print(response)               # → List of extracted structured strings
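Once you have the list of structured strings, you can turn it into input for a search backend. A hypothetical post-processing step, where the list below is invented sample data rather than real model output:

```python
# Hypothetical post-processing of structquery's output into a single
# keyword string. The list below is made up for illustration; real
# output depends on the query and the model.
structured = ["location: Colorado", "activity: hiking", "difficulty: moderate"]

# Keep only the values after each "key: " prefix and join them.
keywords = " ".join(item.split(": ", 1)[1] for item in structured)
print(keywords)  # Colorado hiking moderate
```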

Parameters

  • user_input (str) – The raw user query or description you want to structure.
  • llm (Optional[BaseChatModel]) – Your own LangChain chat model. If omitted, the package creates a ChatLLM7 instance automatically.
  • api_key (Optional[str]) – LLM7 API key. If omitted, the package reads LLM7_API_KEY from the environment; if that is also unset, it falls back to the placeholder string "None", which causes the LLM7 service to reject the request.
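The key-resolution order described for api_key can be sketched as follows. This mirrors the documented fallback behaviour, not structquery's actual source code:

```python
import os

# Sketch of the api_key fallback order described above (assumption:
# this mirrors the documented behaviour, not the package internals).
def resolve_api_key(api_key=None):
    if api_key is not None:
        return api_key  # an explicit argument always wins
    # otherwise read the environment, falling back to the placeholder
    return os.environ.get("LLM7_API_KEY", "None")

print(resolve_api_key("abc"))  # abc
```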

🔧 Using a Custom LLM

You can provide any LangChain‑compatible chat model that inherits from BaseChatModel. Below are examples for the most popular providers.

OpenAI

from langchain_openai import ChatOpenAI
from structquery import structquery

llm = ChatOpenAI(model="gpt-4o-mini")
response = structquery(
    user_input="latest smartphones under $500",
    llm=llm
)

Anthropic

from langchain_anthropic import ChatAnthropic
from structquery import structquery

llm = ChatAnthropic(model_name="claude-3-haiku-20240307")
response = structquery(
    user_input="affordable vegan meal plan",
    llm=llm
)

Google Generative AI

from langchain_google_genai import ChatGoogleGenerativeAI
from structquery import structquery

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = structquery(
    user_input="top-rated bike touring routes in Europe",
    llm=llm
)

🔑 LLM7 API Key

  • The free tier of LLM7 supplies generous rate limits that satisfy most experimentation and production needs.
  • To obtain a free API key, register at https://token.llm7.io/.
  • You can provide the key in three ways:
    1. Environment variable: export LLM7_API_KEY=your_key
    2. Direct argument: structquery(user_input, api_key="your_key")
    3. Pass a pre‑configured ChatLLM7 instance via the llm argument.

🐞 Issues & Contributions

Found a bug or have a feature request? Please open an issue or pull request on the GitHub repository: https://github.com/chigwell/structquery

📝 License

Distributed under the MIT License. See the LICENSE file for details.


📧 Contact

Eugene Evstafev – hi@euegne.plus
GitHub: https://github.com/chigwell