This documentation explains how to use function calling (tools) with Ollama, using models that support it such as Llama 3.1 and Mistral. It describes the format of messages, tools, and responses.
A tool is a function that the model can call to obtain information or perform actions. Here is the JSON format of a tool:
{
  "type": "function",
  "function": {
    "name": "function_name",
    "description": "Description of what the function does",
    "parameters": {
      "type": "object",
      "properties": {
        "param1": {"type": "string", "description": "Description of the parameter"},
        "param2": {"type": "number", "description": "Description of the parameter"}
      },
      "required": ["param1"]  // Required parameters
    }
  }
}
For example, here is a tool that takes no parameters:

{
  "type": "function",
  "function": {
    "name": "get_current_time",
    "description": "Get the current time",
    "parameters": {"type": "object", "properties": {}}
  }
}
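The same schema applies to tools that do take parameters. The following Python sketch defines a hypothetical `get_weather` tool with a required `city` argument (both names are placeholders chosen only to illustrate the format); the resulting list is what you later pass to `chat()` through its `tools` parameter:

# Hypothetical tool with one required parameter, for illustration only
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "Name of the city"}
                },
                "required": ["city"]
            }
        }
    }
]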
Messages are structured in JSON and must follow a specific order to handle function calls. The roles are:
- `system`: Global instructions for the model.
- `user`: User message.
- `assistant`: Model response OR a function call.
- `tool`: Tool response after execution.
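Putting these roles together, a complete exchange ends up as an ordered message list. A minimal sketch in Python (the content strings are taken from the example further down, and the `tool_calls` entries are omitted here):

messages = [
    {"role": "system", "content": "Use tools when asked for the time."},  # global instructions
    {"role": "user", "content": "What time is it?"},                      # user message
    {"role": "assistant", "content": "", "tool_calls": []},               # model requests a tool call (entries omitted)
    {"role": "tool", "name": "get_current_time", "content": "15:30:45"},  # tool result
]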
When the model decides to call a function, it returns a message with the `assistant` role and a `tool_calls` field:
{
  "role": "assistant",
  "content": "",  // Empty during a function call
  "tool_calls": [
    {
      "function": {
        "name": "function_name",
        "arguments": "{}"  // Arguments in JSON format
      }
    }
  ]
}
For example, a call to `get_current_time`:

{
  "role": "assistant",
  "content": "",
  "tool_calls": [
    {
      "function": {
        "name": "get_current_time",
        "arguments": "{}"
      }
    }
  ]
}
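From the Python client, the same information is available on the response object. A small sketch, assuming `response` was returned by `chat(..., tools=tools)` as in the full program further down:

# Inspect the tool calls the model requested (if any)
for call in response.message.tool_calls or []:
    print("Function requested:", call.function.name)
    print("Arguments:", call.function.arguments)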
After executing the function, you must add a message with the `tool` role to provide the result to the model:
{
  "role": "tool",
  "name": "function_name",
  "content": "Function result"
}
For example:

{
  "role": "tool",
  "name": "get_current_time",
  "content": "15:30:45"
}
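When several tools are available, a common pattern is to map tool names to Python functions and build the `tool` message from whichever one the model asked for. This is a sketch, assuming `response` and `messages` as above, that the arguments arrive as a dict (parse them with `json.loads` first if they arrive as a JSON string), and that each tool name has a matching Python implementation such as `get_current_time` from the program below:

available_functions = {
    "get_current_time": get_current_time,  # map each tool name to its implementation
}

for call in response.message.tool_calls or []:
    func = available_functions.get(call.function.name)
    if func is None:
        continue  # unknown tool name: skip it (or report an error back to the model)
    result = func(**(call.function.arguments or {}))
    messages.append({
        "role": "tool",
        "name": call.function.name,
        "content": str(result),
    })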
The model uses the tool result to generate a final response:
{
  "role": "assistant",
  "content": "Final response based on the tool result"
}
For example:

{
  "role": "assistant",
  "content": "It is currently 15 hours, 30 minutes, and 45 seconds."
}
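In the Python client, this final turn is simply another `chat()` call on the extended message list (a sketch reusing the `messages` list and model name from the program below):

final_response = chat(
    model="llama3.1",
    messages=messages
)
print(final_response.message.content)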
Here is an example of a complete flow for requesting the current time:
1. The user asks a question:

   {
     "role": "user",
     "content": "What time is it?"
   }

2. The model requests a tool call:

   {
     "role": "assistant",
     "content": "",
     "tool_calls": [
       {
         "function": {
           "name": "get_current_time",
           "arguments": "{}"
         }
       }
     ]
   }

3. The tool result is added to the conversation:

   {
     "role": "tool",
     "name": "get_current_time",
     "content": "15:30:45"
   }

4. The model produces the final answer:

   {
     "role": "assistant",
     "content": "It is currently 15 hours, 30 minutes, and 45 seconds."
   }
Here is an example of a complete program to handle a conversation with a function call:
from ollama import chat
import datetime

# Tool definition
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_time",
            "description": "Get the current time",
            "parameters": {"type": "object", "properties": {}}
        }
    }
]

# Python implementation of the tool
def get_current_time():
    return datetime.datetime.now().strftime("%H:%M:%S")

# Initial conversation
messages = [
    {"role": "system", "content": "Use tools when asked for the time."},
    {"role": "user", "content": "What time is it?"}
]

# Initial call
response = chat(
    model="llama3.1",  # also works with mistral
    messages=messages,
    tools=tools
)

# If a function call is detected
if hasattr(response.message, 'tool_calls') and response.message.tool_calls:
    # Add the assistant's response to the context
    messages.append({"role": "assistant", "content": "", "tool_calls": response.message.tool_calls})

    # Execute the tool and add its result to the context
    tool_response = {
        "role": "tool",
        "name": "get_current_time",
        "content": get_current_time()
    }
    messages.append(tool_response)

    # Final response
    final_response = chat(
        model="llama3.1",
        messages=messages
    )
    print("Final response:", final_response.message.content)
else:
    print("No function call detected.")
- Clear Instructions: Use a `system` message to guide the model.
- Argument Validation: Always validate arguments before executing a function (see the sketch after this list).
- Error Handling: Add checks for cases where the model does not call a function.
- Response Format: Ensure tool responses are well-structured.
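As an example of argument validation, the sketch below accepts arguments either as a dict or as a JSON string and only returns them when every required key is present. The `city` key refers to the hypothetical `get_weather` tool from the earlier sketch:

import json

def validate_arguments(raw, required=("city",)):
    """Return the arguments as a dict only if they are well-formed and complete."""
    if isinstance(raw, str):
        try:
            raw = json.loads(raw)
        except json.JSONDecodeError:
            return None
    if not isinstance(raw, dict):
        return None
    if any(key not in raw for key in required):
        return None
    return raw

print(validate_arguments('{"city": "Paris"}'))  # {'city': 'Paris'}
print(validate_arguments('{"town": "Paris"}'))  # None: required key missing
print(validate_arguments("not json"))           # None: not valid JSON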
This repository is constantly evolving. Planned additions include:
- Additional Examples: Adding practical cases with more complex tools.
- Advanced Workflows: Examples of nested workflows with multiple tool calls.
- Integrations: Examples of integration with other libraries or services (e.g., external APIs).
This repository is public documentation. You are free to use it as you see fit.