fix: Ollama tool property types must be scalar #342
Conversation
R/provider-openai.R (Outdated)
@@ -290,7 +290,8 @@ method(as_json, list(ProviderOpenAI, TypeObject)) <- function(provider, x) {
   names <- names2(x@properties)
   properties <- lapply(x@properties, function(x) {
     out <- as_json(provider, x)
-    if (!x@required) {
+    if (!x@required && provider@api_key != "ollama") {
This feels safe but a little hacky. `chat_ollama()` calls `chat_openai()` and sets `api_key = "ollama"`. I briefly looked into adding a new `ProviderOllama` that inherits from `ProviderOpenAI` until I realized I could simply check here whether the provider is Ollama. If you'd rather the `ProviderOllama` approach, I could go that way.
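For reference, the inheritance idea would look roughly like this with S7 (only the class and parent names come from this thread; the rest is an illustrative sketch, not the package's actual code):

```r
library(S7)

# Hypothetical sketch: a dedicated provider class inheriting from ProviderOpenAI,
# so as_json() methods could dispatch on Ollama without touching the OpenAI method.
ProviderOllama <- new_class("ProviderOllama", parent = ProviderOpenAI)
```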
Yeah, I'd rather make a single `method(as_json, list(ProviderOllama, TypeObject))`.

And should this warn to let the user know it isn't supported?
Just a little more context: the reason I backed out of the `ProviderOllama` approach is that it involves untangling Ollama and OpenAI all the way up to `chat_ollama()`. I'm up for it, but I wanted to call out that it's a bigger change than adding a method.

I agree, a warning would make sense, but it should probably come from the `$register_tool()` call.
I implemented `ProviderOllama` (fb07118), but while reading the Ollama docs I realized that Ollama expects the required properties in a separate array adjacent to `properties` (8ee8a59). OpenAI, on the other hand, requires all properties to appear in `required`, so we still need a separate provider for Ollama.
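To illustrate the difference (the schema contents below are made up; only the handling of optional arguments is the point):

```r
# Ollama-style tool parameters: optional args are simply left out of `required`,
# and every property type stays a scalar string.
ollama_params <- list(
  type = "object",
  properties = list(
    city  = list(type = "string"),
    units = list(type = "string")            # optional, so not listed in `required`
  ),
  required = list("city")
)

# OpenAI (strict mode) instead lists every property in `required` and marks
# optional ones by allowing "null" as an additional type on the property itself.
openai_params <- list(
  type = "object",
  properties = list(
    city  = list(type = "string"),
    units = list(type = c("string", "null"))
  ),
  required = list("city", "units"),
  additionalProperties = FALSE
)
```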
1365595 to 8ee8a59 (Compare)
Thanks, that looks exactly like what I was expecting 😄
R/provider-ollama.R (Outdated)
  properties <- lapply(x@properties, as_json, provider = provider)

  # Unlike OpenAI, Ollama uses the `required` field to list required tool args
  is_required <- vapply(x@properties, function(t) t@required, logical(1))
You might take a quick look at some of the other provider methods for `as_json(, TypeObject)`, just in case someone else uses the same pattern (which looks vaguely familiar to me).
You're right, it's basically the same as for `ProviderGroq`. I refactored to match the code style. b094638
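Putting the fragments above together, the method presumably ends up with roughly this shape (a sketch only; the exact fields in the returned list are assumed rather than copied from the commit):

```r
method(as_json, list(ProviderOllama, TypeObject)) <- function(provider, x) {
  properties <- lapply(x@properties, as_json, provider = provider)

  # Unlike OpenAI, Ollama uses the `required` field to list required tool args
  is_required <- vapply(x@properties, function(t) t@required, logical(1))

  list(
    type = "object",
    properties = properties,
    # rlang::names2() returns "" rather than NA for unnamed elements
    required = as.list(names2(x@properties)[is_required])
  )
}
```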
  if (!has_ollama()) {
    testthat::skip("ollama not found")
  }

  testthat::skip_if_not(
Would you mind adding a test, even if it's not going to run on CI?
Tests added in d3b42a1. I added one test around the shape of the tool def and one smoke test that `chat_ollama()` still works when a tool with optional arguments is registered.
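The smoke test presumably looks something like the following (the tool, model name, and exact `tool()`/`type_*()` argument conventions are assumptions for illustration, not the committed test):

```r
test_that("chat_ollama() works with a tool that has optional arguments", {
  testthat::skip_if_not(has_ollama(), "ollama not found")

  chat <- chat_ollama(model = "llama3.1")  # placeholder model
  chat$register_tool(tool(
    function(x, digits = 0) round(x, digits),
    "Round a number to a given number of decimal places.",
    x = type_number("The number to round."),
    digits = type_integer("How many decimal places to keep.", required = FALSE)
  ))

  # The key behaviour: a tool with an optional argument should no longer
  # trip Ollama's scalar-type requirement when the model calls it.
  expect_no_error(chat$chat("Round 2.337 to one decimal place using the tool."))
})
```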
Close match with `ProviderGroq`
Thanks! Just needs a news bullet and then I'll merge.

I added one already 😊 (but I may have put it in the wrong place)

FWIW our convention is to put new bullets at the top, but it doesn't really matter (and will get tweaked prior to release)
Problem

Ollama currently doesn't support arrays in the `tools.function.parameters.properties.type` field; instead it expects a scalar (ollama/ollama#5990). In ellmer, this means that using `type_*(required = FALSE)` results in the following warning:

Edit: This behavior happens in the OpenAI-compatible API; Ollama's own API supports `required` as a separate field.

After

This PR updates the `as_json` method for `ProviderOpenAI, TypeObject` to avoid adding `"null"` to the `property.type` when the provider is an Ollama model.
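As a concrete illustration of the Problem above, the user-level pattern that runs into this looks roughly like the following (model, tool, and argument names are placeholders; the `tool()`/`type_*()` usage is assumed from the package's documented style):

```r
library(ellmer)

chat <- chat_ollama(model = "llama3.1")  # any local Ollama model

# `units` is optional (required = FALSE). Before this fix, the OpenAI-compatible
# serialization expressed that as type c("string", "null"), which Ollama rejects
# because it only accepts a scalar type here.
chat$register_tool(tool(
  function(city, units = "metric") paste("(pretend weather for", city, "in", units, "units)"),
  "Get the current weather for a city.",
  city = type_string("Name of the city."),
  units = type_string("Unit system, e.g. 'metric' or 'imperial'.", required = FALSE)
))

chat$chat("What's the weather like in Oslo?")
```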