StructuredSnip is a Python package that takes unstructured text input from users and returns structured, categorized feedback or suggestions. It uses pattern matching to ensure responses follow a consistent format, such as extracting key points, summarizing content, or organizing ideas into predefined sections.
- Pattern Matching: Ensures responses follow a consistent format.
- Customizable LLM: Uses ChatLLM7 by default but can be customized with any BaseChatModel from LangChain.
- Flexible Input: Accepts user input text and processes it into structured output.
- Error Handling: Provides detailed error messages for failed LLM calls.
Install the package:

```bash
pip install structuredsnip
```

Basic usage:

```python
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."
response = structuredsnip(user_input)
print(response)
```

You can use any LLM compatible with LangChain's BaseChatModel. Here are examples using different LLMs:
```python
from langchain_openai import ChatOpenAI
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."
llm = ChatOpenAI()
response = structuredsnip(user_input, llm=llm)
print(response)
```

```python
from langchain_anthropic import ChatAnthropic
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."
llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")  # model name is an example; any Anthropic chat model works
response = structuredsnip(user_input, llm=llm)
print(response)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # model name is an example; any Gemini chat model works
response = structuredsnip(user_input, llm=llm)
print(response)
```

You can pass an API key directly or via an environment variable.
```python
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."
api_key = "your_api_key_here"
response = structuredsnip(user_input, api_key=api_key)
print(response)
```

Or set the environment variable before running your script:

```bash
export LLM7_API_KEY="your_api_key_here"
```

```python
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."
response = structuredsnip(user_input)
print(response)
```

Parameters:

- user_input (str): The user input text to process.
- llm (Optional[BaseChatModel]): The LangChain LLM instance to use. Defaults to ChatLLM7.
- api_key (Optional[str]): The API key for LLM7. If not provided, it will use the environment variable LLM7_API_KEY.
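
Because failed LLM calls surface detailed error messages (see Error Handling above), you may want to wrap the call defensively. Below is a minimal sketch, assuming failed calls raise a standard Python exception; no specific exception class is documented, so the broad `except Exception` is an assumption used only for illustration:

```python
from structuredsnip import structuredsnip

user_input = "Your unstructured text here."

# Defensive call: catch and report a failed LLM call.
# The exact exception type is not documented, so Exception is used here as an assumption.
try:
    response = structuredsnip(user_input, api_key="your_api_key_here")
    print(response)
except Exception as exc:
    print(f"structuredsnip call failed: {exc}")
```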
The default rate limits for LLM7's free tier are sufficient for most use cases of this package. If you need higher rate limits, you can pass your own API key via the api_key parameter or the LLM7_API_KEY environment variable. You can get a free API key by registering at LLM7.
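
If you prefer not to export the variable in your shell, you can also set LLM7_API_KEY from Python before calling the package. A minimal sketch, assuming the variable is read when structuredsnip is invoked:

```python
import os

from structuredsnip import structuredsnip

# Assumption: structuredsnip reads LLM7_API_KEY at call time, so setting it earlier
# in the same process has the same effect as exporting it in the shell.
os.environ["LLM7_API_KEY"] = "your_api_key_here"

response = structuredsnip("Your unstructured text here.")
print(response)
```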
If you encounter any issues, please report them on the GitHub issues page.
- Eugene Evstafev
- Email: hi@eugene.plus
- GitHub: chigwell