Description
Describe the bug
When building the request to the LLM, ADK/LiteLLM asks for a structured response using response_format.response_schema, while the Azure OpenAI API expects response_format.json_schema.
It results in the following error message:
Error code: 400 - {'error': {'message': "Unknown parameter: 'response_format.response_schema'.", 'type': 'invalid_request_error', 'param': 'response_format.response_schema', 'code':'unknown_parameter'}}
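For context, Azure OpenAI nests the schema under json_schema rather than response_schema. A minimal sketch of the two payload shapes, using the schema from the request log below (the "expected" shape follows the OpenAI structured-outputs format and is my assumption about what should be sent):

```python
# JSON schema of the ResponseFormat model (copied from the request log below).
schema = {
    "title": "ResponseFormat",
    "description": "A Response Format model to direct how the model should respond.",
    "type": "object",
    "properties": {
        "status": {
            "type": "string",
            "enum": ["input_required", "completed", "error"],
            "default": "input_required",
            "title": "Status",
        },
        "message": {"type": "string", "title": "Message"},
    },
    "required": ["message"],
}

# What ADK/LiteLLM currently sends -- rejected by Azure with the 400 above:
sent = {"type": "json_object", "response_schema": schema}

# What the Azure OpenAI structured-output API expects (assumed target shape):
expected = {"type": "json_schema", "json_schema": {"name": "ResponseFormat", "schema": schema}}
```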
To Reproduce
Steps to reproduce the behavior:
- Install google-adk 1.19.0 or 1.20.0
- Run '....' (a minimal agent sketch is given after this list)
- Open '....'
- Provide error or stacktrace
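Since the placeholders above are not filled in, here is a hedged sketch of the kind of setup that triggers the request. The agent name and instruction are taken from the log below, the Azure deployment name is a placeholder, and the relevant part is an LlmAgent with output_schema routed through LiteLlm to Azure OpenAI:

```python
# Minimal sketch (assumed setup): an ADK agent with an output_schema, served
# through LiteLLM against an Azure OpenAI deployment.
from typing import Literal

from pydantic import BaseModel
from google.adk.agents import LlmAgent
from google.adk.models.lite_llm import LiteLlm


class ResponseFormat(BaseModel):
    """A Response Format model to direct how the model should respond."""
    status: Literal["input_required", "completed", "error"] = "input_required"
    message: str


root_agent = LlmAgent(
    model=LiteLlm(model="azure/gpt-4o"),  # placeholder Azure deployment name
    name="Whoo_agent",
    instruction="You are an AI agent expert in providing information about "
                "the data stored behind the Whoo API.",
    output_schema=ResponseFormat,  # commenting this out avoids the 400 below
)
# Running this agent (e.g. with `adk web` or a Runner) and sending "hey"
# produces the request captured in the log below.
```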
DEBUG LiteLLM
POST Request Sent from LiteLLM:
curl -X POST \
https://cs-openai-sc-ccp-cba-01-tst-ds-1a-finbot.openai.azure.com/openai/ \
-H 'api_key: None' -H 'azure_ad_token: None' \
-d '{'model': 'gpt-4o', 'messages': [{'role': 'system', 'content': 'You are an AI agent expert in providing information about the data stored behind the Whoo API.\n\nWhen asked to run a PromQL
query, you must:\n 1. Run the query using the Whoo tool.\n 2. Share the query or any output you receive, and I can help you interpret it.\n\nYou must respond in markdown format.\n\n\nYou are an
agent. Your internal name is "Whoo_agent". The description about you is "You are an AI agent expert in providing information about the data stored behind the Whoo API.\n\nWhen asked to run a
WhooQL query, you must:\n 1. Run the query using the Whoo tool.\n 2. Share the query or any output you receive, and I can help you interpret it.\n\nYou must respond in markdown format.\n".'},
{'role': 'user', 'content': 'hey'}], 'response_format': {'type': 'json_object', 'response_schema': {'description': 'A Response Format model to direct how the model should respond.', 'properties':
{'status': {'default': 'input_required', 'enum': ['input_required', 'completed', 'error'], 'title': 'Status', 'type': 'string'}, 'message': {'title': 'Message', 'type': 'string'}}, 'required':
['message'], 'title': 'ResponseFormat', 'type': 'object'}}, 'extra_body': {}}'
INFO azure.core.pipeline.policies.http_logging_policy Request URL: 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=REDACTED&resource=REDACTED'
Request method: 'GET'
Request headers:
'User-Agent': 'azsdk-python-identity/1.23.0 Python/3.12.11 (macOS-15.7.2-arm64-arm-64bit)'
No body was attached to the request
2025-12-10T18:16:16 INFO azure.identity._credentials.chained DefaultAzureCredential acquired a token from AzureCliCredential
18:16:17 - LiteLLM:DEBUG: litellm_logging.py:1035 - RAW RESPONSE:
Error code: 400 - {'error': {'message': "Unknown parameter: 'response_format.response_schema'.", 'type': 'invalid_request_error', 'param': 'response_format.response_schema', 'code': 'unknown_parameter'}}
2025-12-10T18:16:17 DEBUG LiteLLM RAW RESPONSE:
Error code: 400 - {'error': {'message': "Unknown parameter: 'response_format.response_schema'.", 'type': 'invalid_request_error', 'param': 'response_format.response_schema', 'code':
'unknown_parameter'}}
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.
18:16:17 - LiteLLM:DEBUG: exception_mapping_utils.py:2357 - Logging Details: logger_fn - None | callable(logger_fn) - False
DEBUG LiteLLM Logging Details: logger_fn - None | callable(logger_fn) - False
18:16:17 - LiteLLM:DEBUG: litellm_logging.py:2602 - Logging Details LiteLLM-Failure Call: []
Expected behavior
When the output_schema is commented out, the LLM answers as expected, but that removes the ability to integrate it as an agent.
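For reference, a request that places the same schema under json_schema is what Azure should accept. Below is a hedged sketch of a direct LiteLLM call that should exercise that shape, assuming AZURE_API_KEY / AZURE_API_BASE / AZURE_API_VERSION are configured and that LiteLLM translates a Pydantic model into the json_schema response format; the deployment name is a placeholder:

```python
# Sketch of the expected behavior (assumed verification / workaround):
# passing a Pydantic model as response_format so it is emitted as
# response_format.json_schema, which Azure OpenAI accepts.
import litellm
from pydantic import BaseModel


class ResponseFormat(BaseModel):
    message: str


response = litellm.completion(
    model="azure/gpt-4o",  # placeholder Azure deployment
    messages=[{"role": "user", "content": "hey"}],
    response_format=ResponseFormat,  # sent as {"type": "json_schema", "json_schema": {...}}
)
print(response.choices[0].message.content)
```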
Desktop (please complete the following information):
- OS: macOS
- Python version: 3.12.11
- ADK version (pip show google-adk): 1.20.0 & 1.19.0
Model Information:
- Are you using LiteLLM: Yes
- Which model is being used: azure-openai-gpt-4o