Configuring Function Calling in OpenAI

OpenAI’s function calling lets your AI assistant interact with external tools, APIs, or data sources to perform specific actions. For entrepreneurs, this means you can create assistants that book appointments, fetch real-time data, or even process orders. In this guide, we’ll walk through how to configure function calling using OpenAI’s dashboard.

Adding a Function to Your OpenAI Assistant

OpenAI’s dashboard provides a user-friendly interface to configure functions. Here’s how to do it:

1. Create or Select an Assistant

Navigate to the OpenAI Platform and click Create. Give it a name (e.g., “Sales Bot”) and instructions like “Help customers check product availability and place orders.” Choose a model like GPT-4, which supports function calling.
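If you prefer to script this step rather than click through the dashboard, roughly the same thing can be done with the OpenAI Python SDK. The sketch below reuses the example name and instructions from this guide; the model string is only illustrative, and any model that supports function calling will do.

```python
# Sketch: create the same assistant via the OpenAI Python SDK instead of the dashboard.
# Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="Sales Bot",
    instructions="Help customers check product availability and place orders.",
    model="gpt-4o",  # illustrative; pick any model that supports function calling
)

print(assistant.id)  # save this ID; later steps attach the function tool to it
```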

2. Enable the “Functions” Tool

In the Tools section, click **+ Functions**. This opens a modal where you enter the function schema.

3. Use the AI Button to Generate a Function

Instead of manually coding function details, click the Generate button. Describe what you want the function to do in plain language.

Example:
Type: “Check if a product is in stock by searching our inventory database.”

The AI will generate:

  • Function Name: check_product_stock
  • Description: “Check if a product is in stock by searching our inventory database.”
  • Parameters: product_id (string), location (string).
```json
{
  "name": "check_product_stock",
  "description": "Check if a product is in stock by searching our inventory database.",
  "strict": true,
  "parameters": {
    "type": "object",
    "required": [
      "product_id",
      "location"
    ],
    "properties": {
      "product_id": {
        "type": "string",
        "description": "Unique identifier for the product"
      },
      "location": {
        "type": "string",
        "description": "Location of the inventory, if applicable"
      }
    },
    "additionalProperties": false
  }
}
```

You can refine these suggestions. For instance, add “Color” as another parameter if your product has multiple colors.
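The dashboard saves this schema for you, but if you manage the assistant through the API instead, the same JSON is passed as a function tool when creating or updating the assistant. A minimal Python sketch, where the assistant ID is a placeholder:

```python
from openai import OpenAI

client = OpenAI()

# The schema generated above, expressed as a Python dict.
check_product_stock = {
    "name": "check_product_stock",
    "description": "Check if a product is in stock by searching our inventory database.",
    "strict": True,
    "parameters": {
        "type": "object",
        "required": ["product_id", "location"],
        "properties": {
            "product_id": {"type": "string", "description": "Unique identifier for the product"},
            "location": {"type": "string", "description": "Location of the inventory, if applicable"},
        },
        "additionalProperties": False,
    },
}

# Attach it to an existing assistant ("asst_XXXXXXXX" is a placeholder).
client.beta.assistants.update(
    "asst_XXXXXXXX",
    tools=[{"type": "function", "function": check_product_stock}],
)
```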

Testing and Refining Your Function

After setting up, use the Playground to simulate a conversation:

User: “Is the blue widget in stock?”
Assistant: (Collects the needed parameters, calls check_product_stock with product_id and location as arguments, receives “Yes,” and replies accordingly.)
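The Playground handles this request/response loop for you. If you are curious what it looks like in code, here is a hedged sketch using the Assistants API from the Python SDK: the run pauses with a requires_action status when the model decides to call check_product_stock, your code supplies the result, and the model then answers the user. The look_up_stock helper and the assistant ID are placeholders.

```python
import json

from openai import OpenAI

client = OpenAI()
ASSISTANT_ID = "asst_XXXXXXXX"  # placeholder for your assistant's ID


def look_up_stock(product_id: str, location: str) -> str:
    """Hypothetical stand-in for your real inventory lookup."""
    return "Yes"


# Start a conversation and ask the same question as in the Playground example.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="Is the blue widget in stock?"
)

run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id, assistant_id=ASSISTANT_ID
)

# If the model decided to call the function, execute it and return the output.
# (It may instead ask a follow-up question to collect product_id and location.)
if run.status == "requires_action":
    outputs = []
    for call in run.required_action.submit_tool_outputs.tool_calls:
        if call.function.name == "check_product_stock":
            args = json.loads(call.function.arguments)
            result = look_up_stock(args["product_id"], args.get("location", ""))
            outputs.append({"tool_call_id": call.id, "output": result})
    run = client.beta.threads.runs.submit_tool_outputs_and_poll(
        thread_id=thread.id, run_id=run.id, tool_outputs=outputs
    )

# Print the assistant's latest reply.
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)
```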

4. Implement Your API Endpoint

Next, implement the endpoint that does the actual work. When the assistant calls the function, this endpoint is triggered, and its response is handed back to the LLM, which uses it to reply in the chat. What your endpoint returns therefore shapes the quality of the assistant’s answer. You can read this post to understand this in more depth: What Your Endpoint Returns Matters
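What this endpoint looks like depends on your stack, and the exact payload it receives depends on how the connection is configured, but here is a minimal Flask sketch assuming the function arguments (product_id, location) arrive as a JSON body. The route name and the in-memory inventory are placeholders.

```python
# Minimal sketch of an inventory-check endpoint (Flask).
# Assumption: the request body is JSON carrying the arguments from the
# check_product_stock function, i.e. product_id and location.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory inventory; replace with a real database lookup.
INVENTORY = {("WIDGET-BLUE", "berlin"): 12}


@app.post("/check_product_stock")
def check_product_stock():
    data = request.get_json(force=True) or {}
    product_id = data.get("product_id", "")
    location = data.get("location", "").lower()
    quantity = INVENTORY.get((product_id, location), 0)
    # Keep the response short and descriptive; the LLM turns it into the chat reply.
    return jsonify(
        {
            "product_id": product_id,
            "location": location,
            "in_stock": quantity > 0,
            "quantity": quantity,
        }
    )


if __name__ == "__main__":
    app.run(port=8000)
```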

5. Connect the Endpoint to Your Function

Once the API endpoint is implemented, use the Predictable Dialogs app to connect it to the OpenAI function you defined above. To do that, you will need the endpoint URL (and, optionally, a bearer token for authentication).

Best Practices

  1. Start Small: Begin with one function (e.g., fetching weather data) before tackling complex workflows.
  2. Leverage the Generate Button: The AI Generate button reduces guesswork; use it to draft function schemas, then tweak the details.