Call Functions (Tools) Using Mistral Small 3.2 24B Instruct 2506#

You can connect NIM to external tools and services using function calling (also known as tool calling). By providing a list of available functions, NIM can choose to output function arguments for the relevant function(s), which you can then execute to augment the prompt with relevant external information.

Function calling is controlled using the tool_choice and tools request parameters.

Inference Request Parameters#

To use function calling, modify the tool_choice and tools parameters.

  • tool_choice: How the model should choose tools. One of "none", "auto", or a named tool choice. Requires that tools is also set.

  • tools: The list of tool objects that define the functions the model can call. Requires that tool_choice is also set.

Note

tool_choice can only be set when tools is also set. These parameters work together to define and control the use of tools in the model’s responses. For further information on these parameters and their usage, refer to the OpenAI API documentation.

tool_choice Options#

Select from the following:

  • "none": Disables the use of tools.

  • "auto": Enables the model to decide whether to use tools and which ones to use.

  • Named tool choice: Forces the model to use a specific tool. It must be in the following format:

{
  "type": "function",
  "function": {
    "name": "name of the tool goes here"
  }
}

Note

The type field is optional and defaults to function if not specified.

Examples#

These examples showcase various ways to use function calling with NIM for VLM:

  • Basic Function Calling: Demonstrates how to use a single function with automatic tool choice.

  • Multiple Tools: Shows how to provide multiple tools, including one without parameters.

  • Named Tool Usage: Illustrates how to force the model to use a specific, named tool.

Basic Function Calling#

This example shows how to use a single function with automatic tool choice.

from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="not-used")
MODEL_NAME = "mistralai/mistral-small-3.2-24b-instruct-2506"

# Define available function
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use. Infer this from the user's location."
                }
            },
            "required": ["location", "format"]
        }
    }
}

messages = [
    {"role": "user", "content": "What is the weather in San Francisco, CA in Fahrenheit?"}
]

chat_response = client.chat.completions.create(
    model=MODEL_NAME,
    messages=messages,
    tools=[weather_tool],
    tool_choice="auto",
    stream=False
)

assistant_message = chat_response.choices[0].message
messages.append(assistant_message)

print(assistant_message)
# Example output:
# ChatCompletionMessage(content=None, refusal=None, role='assistant', annotations=None, audio=None,
# function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='eLDzCunlo',
# function=Function(arguments='{"location": "San Francisco, CA", "format": "fahrenheit"}',
# name='get_current_weather'), type='function')], reasoning_content=None)

# Simulate external function call
tool_call_result = 88
tool_call_id = assistant_message.tool_calls[0].id
tool_function_name = assistant_message.tool_calls[0].function.name
messages.append({"role": "tool", "content": str(tool_call_result), "tool_call_id": tool_call_id, "name": tool_function_name})

chat_response = client.chat.completions.create(
    model=MODEL_NAME,
    messages=messages,
    tools=[weather_tool],
    tool_choice="auto",
    stream=False
)

assistant_message = chat_response.choices[0].message
print(assistant_message)
# Example output:
# ChatCompletionMessage(content='The current weather in San Francisco, CA is 88°F.',
# refusal=None, role='assistant', annotations=None, audio=None, function_call=None,
# tool_calls=[], reasoning_content=None)
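In a real application, rather than hard-coding the result as above, you would typically parse the tool call's JSON-encoded arguments and dispatch to a local implementation. A minimal sketch of that step; the get_current_weather body here is a hypothetical stand-in for a real weather lookup:

import json

# Hypothetical local implementation standing in for a real weather service call.
def get_current_weather(location: str, format: str) -> str:
    return f"88 degrees {format} in {location}"

def execute_tool_call(tool_call, available_functions):
    """Look up the called function by name and invoke it with the model's arguments."""
    func = available_functions[tool_call.function.name]
    # The model returns arguments as a JSON string, not a dict.
    args = json.loads(tool_call.function.arguments)
    return func(**args)

The returned value can then be appended to messages as the "tool" role message shown in the example above.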

Multiple Tools#

You can also pass more than one tool in the tools list, including tools that take no parameters, such as the time_tool in the following example:

from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="not-used")
MODEL_NAME = "mistralai/mistral-small-3.2-24b-instruct-2506"

# Same function as in the previous example.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use. Infer this from the user's location."
                }
            },
            "required": ["location", "format"]
        }
    }
}

time_tool = {
    "type": "function",
    "function": {
        "name": "get_current_time_nyc",
        "description": "Get the current time in NYC.",
        "parameters": {}
    }
}

messages = [
    {"role": "user", "content": "What's the current time in New York?"}
]

chat_response = client.chat.completions.create(
    model=MODEL_NAME,
    messages=messages,
    tools=[weather_tool, time_tool],
    tool_choice="auto",
    stream=False
)

assistant_message = chat_response.choices[0].message
print(assistant_message)
# Example output:
# ChatCompletionMessage(content=None, refusal=None, role='assistant', annotations=None,
# audio=None, function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='nsObFvBWI',
# function=Function(arguments='{}', name='get_current_time_nyc'), type='function')], reasoning_content=None)

# Process tool calls and generate final response as in the previous example
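When several tools are available, a small registry keeps the dispatch step generic; tools declared with empty parameters are simply invoked with no arguments. A sketch under the assumption that the registry maps the tool names declared above to hypothetical placeholder implementations:

import json

# Hypothetical local implementations, keyed by the tool names declared above.
def get_current_weather(location, format):
    return f"88 degrees {format} in {location}"  # placeholder result

def get_current_time_nyc():
    return "14:05"  # placeholder; a real version would look up the current NYC time

TOOL_REGISTRY = {
    "get_current_weather": get_current_weather,
    "get_current_time_nyc": get_current_time_nyc,
}

def run_tool_calls(tool_calls):
    """Execute each tool call and return ready-to-append "tool" role messages."""
    results = []
    for call in tool_calls:
        func = TOOL_REGISTRY[call.function.name]
        args = json.loads(call.function.arguments or "{}")
        results.append({"role": "tool", "content": str(func(**args)),
                        "tool_call_id": call.id, "name": call.function.name})
    return results

Extending messages with the returned list and re-sending the request produces the final natural-language answer, as in the Basic Function Calling example.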

Named Tool Usage#

This example forces the model to use a specific tool.

from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="not-used")
MODEL_NAME = "mistralai/mistral-small-3.2-24b-instruct-2506"

# Same function as in the previous example.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use. Infer this from the user's location."
                }
            },
            "required": ["location", "format"]
        }
    }
}

chat_response = client.chat.completions.create(
    model=MODEL_NAME,
    messages=[{"role": "user", "content": "What's the weather in New York City like?"}],
    tools=[weather_tool],
    tool_choice={
        "type": "function",
        "function": {
            "name": "get_current_weather"
        }
    },
    stream=False
)

assistant_message = chat_response.choices[0].message
print(assistant_message)
# Example output:
# ChatCompletionMessage(content=None, refusal=None, role='assistant', annotations=None,
# audio=None, function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='rwHw1TR5e',
# function=Function(arguments='{ "location": "New York City, NY", "format": "fahrenheit" }',
# name='get_current_weather'), type='function')], reasoning_content=None)