unsupported value: 'stream' does not support true with this model - gpt-o1-mini via litellm #3455
Labels: area:configuration, kind:bug, priority:medium
Description
Hi, my company gives me access to the o1-mini model through a litellm gateway, which provides an OpenAI-compatible API.
When I try to set it up in the continue.dev config, I always get the following error:
400 litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Only the default (false) value is supported.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}} Received Model Group=o1-mini Available Model Group Fallbacks=None
I have set "stream": false in "completionOptions", as you can see in the config. It looks like this parameter is hardcoded to true in the OpenAI plugin. Could you please check? Thanks.
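For reference, this is the kind of model entry being described (a sketch in continue.dev's config.json format; the apiBase URL is a placeholder for the litellm gateway endpoint):

```json
{
  "models": [
    {
      "title": "o1-mini (litellm)",
      "provider": "openai",
      "model": "o1-mini",
      "apiBase": "https://litellm.example.com/v1",
      "completionOptions": {
        "stream": false
      }
    }
  ]
}
```

Even with "stream": false set here, the request reaching the gateway apparently still carries stream=true, which is what triggers the 400 response above.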