Provide Context to the LLM
When you generate suggestions with Tiptap Content AI cloud, the following information is sent to the LLM:
- The editor's content.
- The extension's rules.
- The context option (if provided).
The context option is an optional string property that provides extra information to the LLM. This information is added to the prompt. It is global, so it is used with all rules.
Configuring the Context's Initial Value
You can provide an initial value for the context in the extension's configuration.
AiSuggestion.configure({
  context: 'The tone should be formal and professional.',
})
Updating the Context
You can update the value of the context at any time with the setAiSuggestionContext command. This is useful if you want to change the context based on user input or other application-specific logic.
const newContext = 'The tone should be informal and friendly.'
editor.commands.setAiSuggestionContext(newContext)
This command does not automatically reload the suggestions. Call the loadAiSuggestions command to update the suggestions based on the new context. A common pattern is to chain the two commands together.
editor.chain().setAiSuggestionContext(newContext).loadAiSuggestions().run()
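For example, a tone selector in your UI can drive the context. The sketch below is a hypothetical pattern, not part of the extension's API: the contextForTone helper and the editor instance are assumptions.

```typescript
// Hypothetical helper: map a UI tone choice to a context string.
function contextForTone(tone: 'formal' | 'informal'): string {
  return tone === 'formal'
    ? 'The tone should be formal and professional.'
    : 'The tone should be informal and friendly.'
}

// In a change handler, update the context and reload suggestions
// (assumes an existing Tiptap `editor` instance):
// editor.chain().setAiSuggestionContext(contextForTone(tone)).loadAiSuggestions().run()
```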
To clear the context, set it to null.
editor.commands.setAiSuggestionContext(null)
Reading the Current Context
You can access the current value of the context with the context storage property.
const currentContext = editor.extensionStorage.aiSuggestion.context
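Reading the stored value is useful, for example, to toggle the context on and off from a UI control. The toggledContext helper below is a hypothetical sketch and not part of the extension:

```typescript
// Hypothetical helper: switch between a fallback context and no context.
function toggledContext(current: string | null, fallback: string): string | null {
  return current === null ? fallback : null
}

// Usage (assumes an existing Tiptap `editor` instance):
// const current = editor.extensionStorage.aiSuggestion.context
// editor.chain()
//   .setAiSuggestionContext(toggledContext(current, 'The tone should be formal.'))
//   .loadAiSuggestions()
//   .run()
```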