OpenAI Assistant Function Calling
Estimated Time: 4 minutes
This guide covers the specific steps for setting up function calling with OpenAI Assistant resources. For an overview of function calling concepts and use cases, see our Function Calling Guide →.
Function calling with OpenAI Assistants involves:
- Configuring functions on the OpenAI platform
- Importing those functions into Predictable Dialogs
- Setting up endpoints and authentication
Configuring Function Calling
Prerequisites
- OpenAI Assistant with functions configured - Configuring Function Calling in OpenAI
- API endpoints implemented, with an HTTP method (GET or POST) and a URL
- Authentication (optional) - a bearer token for secure endpoints
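For reference, functions on the OpenAI platform are defined with a JSON schema describing their parameters. Below is a minimal sketch of such a definition, written as a TypeScript object; the get_current_temperature name and its latitude/longitude parameters are hypothetical examples, not something Predictable Dialogs requires.

```typescript
// Minimal sketch of a function definition as configured on an OpenAI Assistant.
// The function name and parameters (get_current_temperature, latitude,
// longitude) are illustrative placeholders.
const getCurrentTemperature = {
  name: "get_current_temperature",
  description: "Fetch the current temperature for a location.",
  parameters: {
    type: "object",
    properties: {
      latitude: { type: "number", description: "Latitude of the location" },
      longitude: { type: "number", description: "Longitude of the location" },
    },
    required: ["latitude", "longitude"],
  },
};
```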
For general endpoint design and authentication concepts, see our Function Calling Guide →.
Sign In
- Sign in to Predictable Dialogs.
Navigate to AI Resources
- Hover over your profile picture and select AI Resources from the dropdown.
Select AI Resource
- Click on the AI Resource you want to configure.
- Click the "Functions" button to import functions from your OpenAI Assistant into Predictable Dialogs.
- Click the gear/settings icon next to the desired function to open its configuration:
Choose the HTTP Method
- GET: Sends function arguments as query parameters in the URL.
  - Example: https://api.open-meteo.com/v1/forecast?current=temperature
  - Predictable Dialogs automatically appends additional parameters provided by the LLM during chat interactions (see the sketch after these steps).
- POST: Sends function arguments in the request body (typically JSON format).
- Enter your endpoint URL. For GET requests, include static query parameters directly in the URL.
- Check Add Authentication and enter the bearer token if your endpoint requires authentication.
- Click Save to finalize.
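To make the two methods concrete, here is a minimal sketch of an endpoint that could back a function like the weather example above. It uses Express in TypeScript; the /forecast route and the latitude/longitude parameters are placeholder assumptions, not part of Predictable Dialogs.

```typescript
import express from "express";

const app = express();
app.use(express.json()); // parse JSON bodies for POST requests

// GET: static query parameters stay in the configured URL, and the
// LLM-provided arguments are appended as additional query parameters,
// e.g. /forecast?current=temperature&latitude=52.52&longitude=13.41
app.get("/forecast", (req, res) => {
  const { latitude, longitude } = req.query; // query values arrive as strings
  res.json({ latitude, longitude, temperature: 21.3 }); // placeholder response
});

// POST: the same arguments arrive as a JSON request body instead.
app.post("/forecast", (req, res) => {
  const { latitude, longitude } = req.body;
  res.json({ latitude, longitude, temperature: 21.3 }); // placeholder response
});

app.listen(3000);
```

Whatever the endpoint returns is passed back to the assistant, which uses it to compose its reply; the response is also recorded in the Sessions tab described below.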
Testing Function Calls
- Start a chatbot session and input a prompt that triggers the configured function. The chatbot assistant will respond based on the API call.
Reviewing Function Call Details
- Go to the "Sessions" tab of your agent to view chat interactions.
- Select the relevant conversation to review details such as:
- Triggered function name and arguments
- Endpoint method and URL
- Endpoint response
- Request duration in milliseconds
Endpoint Authentication
Secure your endpoints with Bearer Token Authentication when necessary.
What is Bearer Token Authentication?
Bearer Token Authentication secures API calls with a secret key:
- Your API provider gives you an API Key.
- Predictable Dialogs uses this key as a Bearer Token to authenticate API calls.
Setup
- Obtain API Key from your developer or API provider.
- In Predictable Dialogs, select Add Authentication and paste the API Key.
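Bearer tokens are conventionally sent in the Authorization header as `Authorization: Bearer <key>`. Below is a minimal sketch of how an endpoint might verify that header before handling a request; the Express middleware and the API_KEY environment variable are illustrative assumptions, not part of Predictable Dialogs.

```typescript
import express from "express";

const app = express();

// The expected key; loading it from an environment variable is an assumption
// for this sketch - store it however your stack normally handles secrets.
const EXPECTED_KEY = process.env.API_KEY ?? "";

// Reject any request whose Authorization header does not carry the expected
// bearer token.
app.use((req, res, next) => {
  const header = req.headers.authorization ?? "";
  const token = header.startsWith("Bearer ") ? header.slice("Bearer ".length) : "";
  if (!EXPECTED_KEY || token !== EXPECTED_KEY) {
    res.status(401).json({ error: "Unauthorized" });
    return;
  }
  next();
});
```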
Security & Privacy
- Predictable Dialogs securely encrypts and stores your API keys.
When Authentication is Needed
- Required for secured endpoints.
- If authentication isn't needed, leave Add Authentication unchecked.
Contact your developer or API provider if uncertain about authentication requirements.
Related Resources
- Function Calling Overview → - Learn about function calling concepts, use cases, and best practices
- Tools Overview → - Explore all available AI tool capabilities
- OpenAI Functions → - Function calling for OpenAI (coming soon)