Server-side tools (OpenAI Chat Completions API)
This guide explains how to call tools on the server side with the OpenAI Chat Completions API, using as an example a tool that returns the weather in a given location.
First, define the tool in the OpenAI Chat Completions API tool format:
```ts
const weatherTool = {
  type: 'function',
  function: {
    name: 'get_weather',
    description: 'Returns the weather in a given location',
    parameters: {
      type: 'object',
      properties: {
        location: {
          type: 'string',
          description: 'The location to get the weather for',
        },
      },
      required: ['location'],
    },
  },
}
```
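The tool also needs an implementation on the server. The `getWeather` function used later in this guide is not part of any library; a minimal, hypothetical sketch might look like this (a real implementation would call an external weather service):

```typescript
// Hypothetical implementation of the weather tool.
// A real version would call an external weather API.
function getWeather(args: { location: string }): string {
  // Static data stands in for a real weather lookup
  const conditions: Record<string, string> = {
    Berlin: 'sunny',
    London: 'rainy',
  }
  const condition = conditions[args.location] ?? 'unknown'
  return `The weather in ${args.location} is ${condition}.`
}
```

Returning a plain string lets the result be passed directly as the `content` of the `tool` message below.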
Then, when you call the OpenAI API, include the tool in the `tools` array:
```ts
import { AiAgentToolkit } from '@tiptap-pro/extension-ai-agent-server'
import { openaiChatCompletionsAdapter } from '@tiptap-pro/extension-ai-agent'
import OpenAI from 'openai'

const toolkit = new AiAgentToolkit()

// Initialize the OpenAI client
const openai = new OpenAI()

// Call the OpenAI Chat Completions API
const response = await openai.chat.completions.create({
  model: 'gpt-4.1',
  messages: [
    {
      role: 'system',
      content: `
<Your system prompt>
${toolkit.getSystemPrompt()}
`,
    },
    // llmMessages: the conversation history received from the client
    ...llmMessages,
  ],
  // Include the weather tool in the tools array, alongside
  // the other tools provided by AiAgentToolkit
  tools: [...toolkit.getTools(openaiChatCompletionsAdapter), weatherTool],
})
```
Then, check whether the response contains a call to the weather tool. If it does, run the tool and add the result to the `llmMessages` array:
```ts
for (const choice of response.choices) {
  const toolCalls = choice.message.tool_calls ?? []
  // Only handle the weather tool here; the other tools are
  // handled by the AI Agent extension
  const weatherCalls = toolCalls.filter(
    (toolCall) => toolCall.function.name === 'get_weather',
  )
  if (weatherCalls.length === 0) {
    continue
  }
  // The assistant message that requested the tool calls must
  // precede the tool results in the conversation
  llmMessages.push(choice.message)
  for (const toolCall of weatherCalls) {
    const args = JSON.parse(toolCall.function.arguments)
    const result = getWeather(args)
    llmMessages.push({
      role: 'tool',
      content: result.toString(),
      tool_call_id: toolCall.id,
    })
  }
}
```
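The loop above handles a single response. In practice, after pushing the tool results you call the Chat Completions API again with the updated `llmMessages`, and repeat until the model stops requesting tool calls. A small helper for that stopping condition might look like this (the type and function names here are hypothetical, not part of any library):

```typescript
// Minimal shape of the parts of the response we inspect.
// The real OpenAI response type contains many more fields.
interface ChatResponse {
  choices: Array<{
    message: { tool_calls?: Array<{ id: string; function: { name: string } }> }
  }>
}

// Returns true while the model is still requesting tool calls,
// i.e. while the request/run-tools cycle should continue
function hasToolCalls(response: ChatResponse): boolean {
  return response.choices.some(
    (choice) => (choice.message.tool_calls?.length ?? 0) > 0,
  )
}
```

With a helper like this, the overall flow becomes a loop: while `hasToolCalls(response)` is true, run the tools, append the results to `llmMessages`, and call the API again.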
Finally, when there are no more server-side tool calls, use the `openaiChatCompletionsAdapter` to convert the response to the format expected by the AI Agent extension.
```ts
const result = openaiChatCompletionsAdapter.parseResponse(response)
```
The `result` should be the response of the API endpoint and the return value of the resolver function.
You can also add chat messages to the `result` that describe the tool that was called. These messages display the result of the tool call in the chat conversation.
```ts
// Add the tool call message to the beginning of the list of new chat messages
result.chatMessages.unshift({
  type: 'ai',
  text: 'The weather in Berlin is sunny.',
  metadata: {
    // Include this metadata to mark the message as a server-side tool call
    // and display it differently in the UI
    isServerSideToolCall: true,
  },
})
```