The RAG voice ingests your docs (paths, URLs, GitHub blobs), chunks and embeds them, then exposes search_knowledge for retrieval. The system prompt requires citations and a polite refusal when nothing relevant is indexed; it never fabricates.
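The ingest → chunk → embed → retrieve flow can be sketched generically. Everything below is illustrative, not @tuttiai's actual internals: the real RagVoice calls the configured embeddings provider, while this toy uses bag-of-words vectors, and `searchKnowledge` here is only a stand-in for the `search_knowledge` tool.

```typescript
// Split a document into fixed-size, overlapping character chunks.
function chunk(text: string, size = 40, overlap = 10): string[] {
  const chunks: string[] = []
  for (let i = 0; i < text.length; i += size - overlap) {
    chunks.push(text.slice(i, i + size))
    if (i + size >= text.length) break
  }
  return chunks
}

// Toy "embedding": token-count vector. A real setup calls the configured
// embeddings provider (e.g. OpenAI) and gets back dense float vectors.
function embed(text: string): Map<string, number> {
  const v = new Map<string, number>()
  for (const tok of text.toLowerCase().match(/\w+/g) ?? []) {
    v.set(tok, (v.get(tok) ?? 0) + 1)
  }
  return v
}

// Cosine similarity between two sparse vectors.
function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0
  for (const [k, x] of a) { dot += x * (b.get(k) ?? 0); na += x * x }
  for (const [, y] of b) nb += y * y
  return na && nb ? dot / Math.sqrt(na * nb) : 0
}

// Conceptual search_knowledge: rank indexed chunks against the query,
// keeping source_id attached so the agent can cite it.
function searchKnowledge(
  query: string,
  index: { source_id: string; text: string }[],
) {
  return index
    .map(doc => ({ ...doc, score: cosine(embed(query), embed(doc.text)) }))
    .sort((a, b) => b.score - a.score)
}
```

The key design point carried through to the real system: every chunk keeps its `source_id`, which is what makes the "cite every fact" rule in the system prompt enforceable.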
```ts
import { defineScore, AnthropicProvider } from '@tuttiai/core'
import { RagVoice } from '@tuttiai/rag'

export default defineScore({
  provider: new AnthropicProvider(),
  agents: {
    docs: {
      name: 'docs',
      model: 'claude-haiku-4-5-20251001',
      system_prompt:
        'Answer using search_knowledge. Cite the source_id of every fact. If nothing relevant is indexed, say so.',
      voices: [
        RagVoice({
          collection: 'product-docs',
          embeddings: { provider: 'openai', api_key: process.env.OPENAI_API_KEY! },
          storage: { provider: 'memory' },
        }),
      ],
      permissions: ['network'],
    },
  },
})
```

1. Scaffold a project: `npx tutti-ai init my-docs-bot`
2. Install voices: `npm i @tuttiai/rag`
3. Drop the score above into `tutti.score.ts`, set the env vars in `.env`, and run `tutti-ai run`.
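For step 3's `.env`, the score above reads `OPENAI_API_KEY` directly; `ANTHROPIC_API_KEY` is an assumption here, as the variable Anthropic SDKs conventionally read and which `AnthropicProvider` presumably picks up:

```dotenv
# Assumed variable names; only OPENAI_API_KEY is referenced in the score itself.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```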
- cite the `source_id` of every fact
- refuse when retrieval comes back empty