Reading the Document
The AI Agent reads the document in chunks to handle documents of any size efficiently. This chunking mechanism is essential because:
- Large documents may exceed the context window of the LLM
- Editing smaller chunks is more efficient than processing the whole document at once
- It allows the AI Agent to focus on specific parts of the document
Chunking mechanism
By default, the document is split into chunks of approximately 1000 characters each, while preserving HTML structure. The AI Agent maintains a pointer to the current chunk and can navigate through the document using specific tools.
// The default chunk size is 1000 characters
// You can customize it when creating the provider
const provider = new AiAgentProvider({
  // ...other options
  chunkSize: 2000, // Larger chunks (2000 characters)
})
Custom chunking
You can customize how the document is chunked by providing a custom chunkHtml function:
const provider = new AiAgentProvider({
  // ...other options
  chunkSize: 1000,
  chunkHtml: ({ html, chunkSize }) => {
    // Custom logic to split HTML into chunks
    // Must return an array of HTML strings
    return customSplitFunction(html, chunkSize)
  },
})
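Here, customSplitFunction is a placeholder for your own splitting logic. As a rough sketch (not part of the API), a splitter that breaks the document at top-level element boundaries, so that tags are never cut in half, might look like this:

// Hypothetical helper, shown only as an illustration of a chunking strategy.
// It walks the document's top-level elements and starts a new chunk whenever
// adding the next element would push the current chunk past chunkSize.
function customSplitFunction(html: string, chunkSize: number): string[] {
  // Parse the HTML so we can iterate over its top-level elements
  const container = document.createElement('div')
  container.innerHTML = html

  const chunks: string[] = []
  let current = ''

  for (const child of Array.from(container.children)) {
    const piece = child.outerHTML
    // Close the current chunk if appending this element would exceed chunkSize
    if (current && current.length + piece.length > chunkSize) {
      chunks.push(current)
      current = ''
    }
    current += piece
  }

  if (current) {
    chunks.push(current)
  }

  // Always return at least one chunk
  return chunks.length > 0 ? chunks : [html]
}

This sketch ignores top-level text nodes and never splits inside a single element, so one very long paragraph can still produce an oversized chunk; a production splitter would need to handle those cases.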