Description
System Info
I discovered a critical discrepancy in the prompt rendering pipeline for Agents. The custom instructions passed via agent.description are correctly extracted in the Python backend, but they are discarded before reaching the LLM because the prompt template is missing a "System Prompt" placeholder. This makes the description parameter functionally useless for complex queries.
The system prompt (passed as the description parameter) is not included in the final prompt string when code is generated through the GeneratePythonCodeWithSQLPrompt class. The LLM receives the data schema but not the crucial guiding instructions.
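A minimal sketch of how I verify this (the toy dataframe, the instruction text, and reading the rendered prompt back through last_prompt_used are just my way of surfacing the problem, not an official debugging API; last_prompt_used is set in the method quoted below):

import pandas as pd
from pandasai import Agent

# Toy data; the schema is irrelevant to the bug. LLM configuration omitted here.
df = pd.DataFrame({"region": ["EU", "US"], "revenue": [1200.5, 980.3]})

instructions = "Always answer in French and round revenue to two decimals."
agent = Agent([df], description=instructions)
agent.chat("Which region has the highest revenue?")

# The description is accepted by the constructor but never reaches the prompt:
print(instructions in str(agent._state.last_prompt_used))  # False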
🐛 Describe the bug
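For reference, this is the method that builds the prompt (via get_chat_prompt_for_sql) and records it in last_prompt_used: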
def generate_code(self, query: Union[UserQuery, str]) -> str:
    """Generate code using the LLM."""
    self._state.memory.add(str(query), is_user=True)
    self._state.logger.log("Generating new code...")

    # The prompt is built solely from the agent state here; the rendered
    # template has no slot for the agent description, so it is dropped.
    prompt = get_chat_prompt_for_sql(self._state)
    code = self._code_generator.generate_code(prompt)
    self._state.last_prompt_used = prompt
    return code
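The calling code that constructs the agent and triggers the generation path above: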
_current_agent = Agent(list(dataframes), sandbox=sandbox)
return _current_agent.chat(query)
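Until the placeholder is restored, the only stopgap I have found is to inline the instructions into the query text itself. This is a workaround sketch, not a documented feature, and the helper name below is mine:

instructions = "Always answer in French and round revenue to two decimals."

def chat_with_instructions(agent, instructions: str, query: str):
    # Prepend the guiding instructions to the query text so they survive
    # prompt rendering despite the missing description placeholder.
    return agent.chat(f"{instructions}\n\n{query}")

response = chat_with_instructions(_current_agent, instructions, query)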