Get started with the Vercel AI SDK
The Vercel AI SDK lets you build an AI Agent that works with multiple AI model providers, so you can switch between providers and see which one works best for your use case.
Client-side setup
First, install the AI Agent extension.
npm install @tiptap-pro/extension-ai-agent
Then, import the extension and configure it with the AiAgentProvider class.
import { Editor } from '@tiptap/core'
import StarterKit from '@tiptap/starter-kit'
import AiAgent, { AiAgentProvider } from '@tiptap-pro/extension-ai-agent'

const provider = new AiAgentProvider()

const editor = new Editor({
  extensions: [
    StarterKit,
    AiAgent.configure({
      provider,
    }),
  ],
})
Inside the AI Agent provider, define a resolver function that calls your backend. Also define an adapter function that converts the chat messages to the format expected by the Vercel AI SDK.
import AiAgent, { AiAgentProvider, vercelAiSdkAdapter } from '@tiptap-pro/extension-ai-agent'

const provider = new AiAgentProvider({
  adapter: vercelAiSdkAdapter,
  // The llmMessages property contains the chat messages in the
  // format expected by the Vercel AI SDK
  resolver: async ({ llmMessages }) => {
    // Call the API endpoint of your backend
    const response = await fetch('/api-endpoint', {
      method: 'POST',
      body: JSON.stringify({ llmMessages }),
    })
    return await response.json()
  },
})
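If your backend can fail or return non-JSON errors, it helps to surface that in the resolver. Below is a hedged variation of the resolver above: same shape and same placeholder '/api-endpoint' URL, with basic status-code handling added (the error message wording is an assumption, not part of the Tiptap API).

```typescript
import AiAgent, { AiAgentProvider, vercelAiSdkAdapter } from '@tiptap-pro/extension-ai-agent'

const provider = new AiAgentProvider({
  adapter: vercelAiSdkAdapter,
  resolver: async ({ llmMessages }) => {
    const response = await fetch('/api-endpoint', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ llmMessages }),
    })
    // Fail loudly instead of parsing an error page as JSON
    if (!response.ok) {
      throw new Error(`AI Agent request failed with status ${response.status}`)
    }
    return await response.json()
  },
})
```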
In the next section, we'll see how to implement the API endpoint that returns the response in the correct format.
Server-side setup
First, install the AI Agent packages, the Vercel AI SDK, and your preferred AI provider library (in this example, we use OpenAI).
npm install @tiptap-pro/extension-ai-agent @tiptap-pro/extension-ai-agent-server ai @ai-sdk/openai
Then, inside your API endpoint, create an AiAgentToolkit instance. It lets you configure the tools that will be available to the AI model.
import { AiAgentToolkit } from '@tiptap-pro/extension-ai-agent-server'
const toolkit = new AiAgentToolkit()
After creating the toolkit, send the request to your AI provider using the Vercel AI SDK.
import { AiAgentToolkit } from '@tiptap-pro/extension-ai-agent-server'
import { vercelAiSdkAdapter } from '@tiptap-pro/extension-ai-agent'
import { generateText } from 'ai'
import { openai } from '@ai-sdk/openai'

const toolkit = new AiAgentToolkit()

// Call the Vercel AI SDK. llmMessages is the array that the
// resolver function sends in the request body.
const response = await generateText({
  model: openai('gpt-4.1'),
  messages: [
    {
      role: 'system',
      content: `
<Your system prompt>

${toolkit.getSystemPrompt()}
`,
    },
    ...llmMessages,
  ],
  // Provide the tools that the AI model can call
  tools: toolkit.getTools(vercelAiSdkAdapter),
})
At the end of the system prompt, include the system prompt generated by the AiAgentToolkit instance by calling toolkit.getSystemPrompt(). It contains instructions on how to use the tools.
To write the system prompt, see the system prompt guide. It includes an example system prompt that you can use as a starting point.
Finally, use vercelAiSdkAdapter to convert the response to the format expected by the AI Agent extension.
const result = vercelAiSdkAdapter.parseResponse(response)
This result is what your API endpoint should return: it becomes the return value of the resolver function defined on the client.
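Putting the server-side steps together, here is a minimal sketch of the endpoint. It assumes a Next.js App Router route handler; the file path, the request shape, and the use of Response.json are assumptions for illustration, not part of the Tiptap API.

```typescript
// app/api-endpoint/route.ts — hypothetical path matching the
// '/api-endpoint' URL used by the client-side resolver
import { AiAgentToolkit } from '@tiptap-pro/extension-ai-agent-server'
import { vercelAiSdkAdapter } from '@tiptap-pro/extension-ai-agent'
import { generateText } from 'ai'
import { openai } from '@ai-sdk/openai'

export async function POST(request: Request) {
  // llmMessages is the array the client-side resolver sends in the body
  const { llmMessages } = await request.json()

  // Configure the tools available to the AI model
  const toolkit = new AiAgentToolkit()

  const response = await generateText({
    model: openai('gpt-4.1'),
    messages: [
      {
        role: 'system',
        // Your own instructions, followed by the toolkit's tool instructions
        content: toolkit.getSystemPrompt(),
      },
      ...llmMessages,
    ],
    tools: toolkit.getTools(vercelAiSdkAdapter),
  })

  // Convert the response to the format the AI Agent extension expects,
  // and return it so the resolver can pass it back to the editor
  const result = vercelAiSdkAdapter.parseResponse(response)
  return Response.json(result)
}
```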