
LLM interface & vertexai: add response_schema support, add location parameter and fix some bugs #3268

Merged
merged 3 commits on Jan 24, 2025

Conversation

itsmvd
Collaborator

@itsmvd itsmvd commented Jan 24, 2025

Adds a response_schema parameter so that users can enforce the shape of the LLM's response (using a dataclass or JSON schema, for example).
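For illustration, a response schema can be declared as a plain JSON-schema-style dict and handed to the model's generation config. The field names below (`summary`, `severity`) are hypothetical, and the commented-out SDK calls are a sketch based on the Vertex AI Python SDK, not the exact code from this PR.

```python
# A JSON-schema-style structure describing the shape we want the LLM
# response to take (hypothetical example fields).
response_schema = {
    "type": "object",
    "properties": {
        "summary": {"type": "string"},
        "severity": {"type": "string", "enum": ["low", "medium", "high"]},
    },
    "required": ["summary", "severity"],
}

# With google-cloud-aiplatform installed and credentials configured, the
# schema would be enforced roughly like this (sketch, not run here):
#
#   from vertexai.generative_models import GenerativeModel, GenerationConfig
#
#   model = GenerativeModel("gemini-1.5-pro")
#   config = GenerationConfig(
#       response_mime_type="application/json",
#       response_schema=response_schema,
#   )
#   response = model.generate_content("Summarize this event.",
#                                     generation_config=config)
```

Constraining the output this way lets callers parse the response as structured JSON instead of scraping free-form text.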

Changes DEFAULT_TOP_K to 1 (the common range is 1-40), since values below 1 are not supported by Vertex AI and will raise errors.
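The constraint can be expressed as a small guard around the user-supplied value; `effective_top_k` is a hypothetical helper for illustration, assuming the 1-40 range mentioned above.

```python
DEFAULT_TOP_K = 1  # Vertex AI rejects top_k < 1; the common range is 1-40.

def effective_top_k(top_k=None):
    """Clamp a user-supplied top_k into the range Vertex AI accepts (sketch)."""
    if top_k is None:
        return DEFAULT_TOP_K
    # Values below 1 would make Vertex AI throw; cap at 40 as the usual upper bound.
    return max(1, min(top_k, 40))
```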

Introduces a location parameter so that users can pin requests to a specific cloud region.

Collaborator

@jkppr jkppr left a comment


lgtm

@jkppr jkppr merged commit d1690f5 into google:master Jan 24, 2025
24 checks passed