Which local LLM models can be used for kernel function calls? #9942

Closed Answered by toresenneseth
Vizeo asked this question in Q&A

Hi,

Non-expert doing exploration here. I had a similar problem and came across this overview: https://learn.microsoft.com/en-us/semantic-kernel/concepts/ai-services/chat-completion/function-calling/function-choice-behaviors?pivots=programming-language-csharp#supported-ai-connectors

Not sure if it solves your issue, though.

Also, I got function calling working with Ollama, even though the table at that link says it isn't supported yet. I'm using the NuGet package Microsoft.SemanticKernel.Connectors.Ollama 1.32.0-alpha.
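For what it's worth, here is roughly how I wired it up — a minimal sketch, not a definitive setup. It assumes a local Ollama server at the default http://localhost:11434 and a tool-capable model (I use "llama3.1" below as a placeholder; swap in whatever model you actually run). The plugin name and prompt are made up for illustration; the connector calls (`AddOllamaChatCompletion`, `OllamaPromptExecutionSettings`, `FunctionChoiceBehavior.Auto`) come from the 1.32.0-alpha prerelease package and may change in later versions:

```csharp
using System;
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Ollama;

// Hypothetical example plugin: a single function the model can choose to call.
public class TimePlugin
{
    [KernelFunction, Description("Gets the current local time.")]
    public string GetCurrentTime() => DateTime.Now.ToString("HH:mm");
}

public static class Program
{
    public static async Task Main()
    {
        var builder = Kernel.CreateBuilder();

        // Assumption: Ollama running locally on its default port,
        // serving a model that supports tool calls.
        builder.AddOllamaChatCompletion(
            modelId: "llama3.1",
            endpoint: new Uri("http://localhost:11434"));

        builder.Plugins.AddFromType<TimePlugin>();
        var kernel = builder.Build();

        // Auto() lets the model decide whether to invoke the plugin function.
        var settings = new OllamaPromptExecutionSettings
        {
            FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
        };

        var result = await kernel.InvokePromptAsync(
            "What time is it right now?",
            new KernelArguments(settings));

        Console.WriteLine(result);
    }
}
```

Whether the model actually emits a tool call depends on the model itself, not just the connector, so results vary by which local model you pick.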

Answer selected by sophialagerkranspandey