Building agent-driven workflows with the chat API

Last updated: May 27, 2025

Use the watsonx.ai chat API with foundation models that support tool calling to build agent-driven applications.

Ways to develop

You can build agent-driven workflows by using these programming methods:

  • REST API
  • Python

Overview

Agentic applications allow a foundation model to function as an agent that controls the flow of interaction with the user. You define the parameters of the interaction, including the tools that the foundation model can use, but you allow the foundation model to decide the next best step based on the current state of the interaction. Tool calling is sometimes referred to as function calling.

Supported foundation models

When you build an agentic workflow, choose a foundation model that meets the following requirements:

  • Handles chat tasks
  • Supports tool calling
  • Can choose the next action

To programmatically get a list of foundation models that support tool calling from the chat API, specify the filters=task_function_calling parameter when you submit a List the available foundation models method request.

See the API reference.

For example:

curl -X GET \
  'https://{region}.ml.cloud.ibm.com/ml/v1/foundation_model_specs?version=2024-10-10&filters=task_function_calling'
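
If you prefer to check this from Python, you can send the same request with the requests library. The following is a minimal sketch that reuses the endpoint, version, and filter values from the curl example above; the resources field name assumes the standard list-response format, so adjust it if your response differs.

import requests

# Endpoint for your region, for example us-south.ml.cloud.ibm.com
WATSONX_URL = "https://us-south.ml.cloud.ibm.com"
BEARER_TOKEN = "eyJraWQiOi..."  # your bearer token

response = requests.get(
    f"{WATSONX_URL}/ml/v1/foundation_model_specs",
    params={"version": "2024-10-10", "filters": "task_function_calling"},
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
)
response.raise_for_status()

# Print the IDs of the foundation models that support tool calling
for spec in response.json().get("resources", []):
    print(spec.get("model_id"))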

REST API

Different foundation models handle tool calling in different ways. Review the following examples to see some of the differences:

For API method details, see the API reference documentation.

Example tool-calling request

The following example defines two tools, a function for addition and a function for multiplication. The example submits user input to the foundation model and lets the model choose which tool to use to answer the question.

In the following example, replace the {url} variable with the endpoint for your region, such as us-south.ml.cloud.ibm.com. Add your own bearer token and project ID.

curl -X POST \
  'https://{url}/ml/v1/text/chat?version=2024-10-08' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer eyJraWQiOi...' \
  --data '{
      "model_id": "mistralai/mistral-large",
      "project_id": "4947c695-a374-428c-acca-332c1a1dc9e9",
      "messages": [
        {
          "role": "user",
          "content": [
            {
              "type": "text",
              "text": "What is 2 plus 4?"
            }
          ]
        }
      ],
      "tools": [
        {
          "type": "function",
          "function": {
            "name": "add",
            "description": "Adds the values a and b to get a sum.",
            "parameters": {
              "type": "object",
              "properties": {
                "a": {
                  "description": "A number value",
                  "type": "float"
                },
                "b": {
                  "description": "A number value",
                  "type": "float"
                }
              },
              "required": [
                "a",
                "b"
              ]
            }
          }
        },
        {
          "type": "function",
          "function": {
          "name": "multiply",
            "description": "Multiplies the values a and b.",
            "parameters": {
              "type": "object",
              "properties": {
                "a": {
                  "description": "A number value",
                  "type": "float"
                },
                "b": {
                  "description": "A number value",
                  "type": "float"
                }
              },
              "required": [
                "a",
                "b"
              ]
            }
          }
        }
      ],
      "tool_choice_option": "auto",
      "max_tokens": 300,
      "time_limit": 1000
      }'

The sample output shows that the model, mistral-large in this case, chooses the correct tool for the task: the add function.

{
  "id": "chatcmpl-2f47da4026950db321698cb733b25e89",
  "model_id": "mistralai/mistral-large",
  "model": "mistralai/mistral-large",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "H6KoCbaZV",
            "type": "function",
            "function": {
              "name": "add",
              "arguments": "{\"a\": 2, \"b\": 4}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "created": 1739311926,
  "model_version": "2.0.0",
  "created_at": "2025-02-11T22:12:07.243Z",
  "usage": {
    "completion_tokens": 25,
    "prompt_tokens": 189,
    "total_tokens": 214
  },
  "system": {
    "warnings": [
      {
        "message": "This model is a Non-IBM Product governed by a third-party license that may impose use restrictions and other obligations. By using this model you agree to its terms as identified in the following URL.",
        "id": "disclaimer_warning",
        "more_info": "https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-models.html?context=wx"
      }
    ]
  }
}
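
When the response ends with "finish_reason": "tool_calls", your application is responsible for running the chosen tool and continuing the conversation. The following is a minimal sketch of that step; the add and multiply functions and the run_tool_calls helper are illustrative, not part of the API, and the dispatch logic assumes the tool_calls structure shown in the sample output above.

import json

# Local implementations of the tools that were declared in the request
def add(a, b):
    return a + b

def multiply(a, b):
    return a * b

TOOLS = {"add": add, "multiply": multiply}

def run_tool_calls(chat_response):
    """Run each tool call in the assistant message and collect the results."""
    results = []
    message = chat_response["choices"][0]["message"]
    for call in message.get("tool_calls", []):
        name = call["function"]["name"]
        arguments = json.loads(call["function"]["arguments"])
        results.append({"tool_call_id": call["id"], "result": TOOLS[name](**arguments)})
    return results

# With the sample output above, run_tool_calls returns
# [{"tool_call_id": "H6KoCbaZV", "result": 6}]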

Example of a tool-calling request with granite-3-2b-instruct

The following request defines two tools, a function for addition and a function for multiplication, and lets the granite-3-2b-instruct foundation model choose which tool to use to answer the question.

The Granite models call tools more reliably when you provide a system prompt with the request. The following system prompt is used:

You are Granite, developed by IBM. You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request.

In the following example, replace the {url} variable with the endpoint for your region, such as us-south.ml.cloud.ibm.com. Add your own bearer token and project ID.

curl -X POST \
  'https://{url}/ml/v1/text/chat?version=2024-10-08' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer eyJraWQiOi...' \
  --data '{...}'

The body of the request contains the following JSON snippet:

{
  "model_id": "ibm/granite-3-2b-instruct",
  "project_id": "4947c695-a374-428c-acca-332c1a1dc9e9",
  "messages": [
    {
      "role":"system",
      "content":[
        {
          "type":"text",
          "text":"You are Granite, developed by IBM. You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request."
      }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "What is 2 times 4?"
        }
      ]
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "add",
        "description": "Adds the values a and b to get a sum.",
        "parameters": {
          "type": "object",
          "properties": {
            "a": {
              "description": "A number value",
              "type": "float"
            },
            "b": {
              "description": "A number value",
               "type": "float"
            }
          },
          "required": [
            "a",
            "b"
          ]
        }
      }
    },
    {
      "type": "function",
      "function": {
        "name": "multiply",
        "description": "Multiplies the values a and b.",
        "parameters": {
          "type": "object",
          "properties": {
            "a": {
              "description": "A number value",
              "type": "float"
            },
            "b": {
              "description": "A number value",
               "type": "float"
            }
          },
          "required": [
            "a",
            "b"
          ]
        }
      }
    }
  ],
  "tool_choice_option": "auto",
  "max_tokens": 300,
  "time_limit": 10000
}

The granite-3-2b-instruct foundation model chooses the correct tool to answer the query.

The response looks as follows:

{
  "id": "chatcmpl-ac80d63a85209b48592435687086e1c2",
  "model_id": "ibm/granite-3-2b-instruct",
  "model": "ibm/granite-3-2b-instruct",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "chatcmpl-tool-3297286588a24459b0042943cae55629",
            "type": "function",
            "function": {
              "name": "multiply",
              "arguments": "{\"a\": 2, \"b\": 4}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "created": 1739311794,
  "model_version": "1.1.0",
  "created_at": "2025-02-11T22:09:55.048Z",
  "usage": {
    "completion_tokens": 28,
    "prompt_tokens": 348,
    "total_tokens": 376
  }
}
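
To finish the interaction, the tool result is typically sent back to the model in a follow-up chat request so that it can compose the final answer. The following Python sketch shows that round trip. It assumes the chat API accepts a tool role message with a tool_call_id field, in the same OpenAI-compatible format as the assistant tool_calls message, and it omits the system prompt and tool definitions for brevity.

import json
import requests

WATSONX_URL = "https://us-south.ml.cloud.ibm.com"  # endpoint for your region
BEARER_TOKEN = "eyJraWQiOi..."                     # your bearer token
PROJECT_ID = "4947c695-a374-428c-acca-332c1a1dc9e9"

# Conversation so far: the user question, the assistant tool call from the
# previous response, and a tool message with the locally computed result (2 * 4 = 8).
messages = [
    {"role": "user", "content": [{"type": "text", "text": "What is 2 times 4?"}]},
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "chatcmpl-tool-3297286588a24459b0042943cae55629",
                "type": "function",
                "function": {"name": "multiply", "arguments": json.dumps({"a": 2, "b": 4})},
            }
        ],
    },
    {
        "role": "tool",
        "tool_call_id": "chatcmpl-tool-3297286588a24459b0042943cae55629",
        "content": "8",
    },
]

response = requests.post(
    f"{WATSONX_URL}/ml/v1/text/chat",
    params={"version": "2024-10-08"},
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    json={
        "model_id": "ibm/granite-3-2b-instruct",
        "project_id": PROJECT_ID,
        "messages": messages,
    },
)
print(response.json()["choices"][0]["message"]["content"])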

Python

See the ModelInference class of the watsonx.ai Python library.
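
The following is a minimal sketch of the addition example with the Python library. It assumes the ibm-watsonx-ai package and the ModelInference.chat method, which takes the same messages and tools structures as the REST examples; check the Python library documentation for the exact parameters that your version supports.

from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",  # endpoint for your region
    api_key="YOUR_IBM_CLOUD_API_KEY",
)

model = ModelInference(
    model_id="mistralai/mistral-large",
    credentials=credentials,
    project_id="4947c695-a374-428c-acca-332c1a1dc9e9",
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "add",
            "description": "Adds the values a and b to get a sum.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"description": "A number value", "type": "float"},
                    "b": {"description": "A number value", "type": "float"},
                },
                "required": ["a", "b"],
            },
        },
    }
]

messages = [{"role": "user", "content": [{"type": "text", "text": "What is 2 plus 4?"}]}]

response = model.chat(messages=messages, tools=tools)
print(response["choices"][0]["message"]["tool_calls"])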

Sample Python notebooks

The following sample notebooks are available:

Learn more

Parent topic: Automating tasks with AI agents