
Answer questions from your docs, cite the source.

Ingest your docs into a vector store and answer with citations. The agent says "I don't know" when retrieval comes back empty.

Chihab · Beginner · 7 min read

The RAG voice ingests your docs (paths, URLs, GitHub blobs), chunks and embeds them, then exposes search_knowledge for retrieval. The system prompt requires citations and a polite refusal when nothing relevant is indexed — no fabrication.
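The chunking step above can be pictured with a minimal sketch. This is a generic fixed-size splitter with overlap, an assumption about how ingestion might divide documents — the actual `RagVoice` chunking strategy is not shown in this recipe:

```typescript
// Illustrative chunker (not the RagVoice internals): split text into
// fixed-size windows that overlap so facts spanning a boundary are
// still retrievable from at least one chunk.
function chunkText(text: string, size = 512, overlap = 64): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break;
    // advance by size minus overlap so consecutive chunks share context
    start += size - overlap;
  }
  return chunks;
}
```

Each chunk is then embedded and indexed under the collection's name, so `search_knowledge` can return the closest chunks with their `source_id`.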

Score file
import { defineScore, AnthropicProvider } from '@tuttiai/core'
import { RagVoice } from '@tuttiai/rag'

export default defineScore({
  provider: new AnthropicProvider(),
  agents: {
    docs: {
      name: 'docs',
      model: 'claude-haiku-4-5-20251001',
      system_prompt:
        'Answer using search_knowledge. Cite the source_id of every fact. If nothing relevant is indexed, say so.',
      voices: [
        RagVoice({
          collection: 'product-docs',
          // OpenAI embeddings for indexing and query-time retrieval
          embeddings: { provider: 'openai', api_key: process.env.OPENAI_API_KEY! },
          // in-process store: the index lives in memory and is rebuilt each run
          storage: { provider: 'memory' },
        }),
      ],
      // network access lets the voice fetch URLs and call the embeddings API
      permissions: ['network'],
    },
  },
})
How to run it
  1. Scaffold a project: npx tutti-ai init my-docs-bot
  2. Install voices: npm i @tuttiai/rag
  3. Drop the score above into tutti.score.ts, set the env vars in .env, and run tutti-ai run.
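A minimal .env for this score might look like the following. OPENAI_API_KEY is read directly by the score above; ANTHROPIC_API_KEY is an assumption about what AnthropicProvider expects — check your provider's docs for the exact variable name:

```shell
# Read by the score via process.env.OPENAI_API_KEY
OPENAI_API_KEY=sk-...
# Assumed key for AnthropicProvider (verify the expected variable name)
ANTHROPIC_API_KEY=sk-ant-...
```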
Why this is safe
  • cite source_id of every fact
  • refuse when retrieval is empty
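The refusal rule can be sketched as a guard in front of answer generation. The `Hit` shape and score threshold here are illustrative assumptions, not the framework's actual types:

```typescript
// Hypothetical retrieval-hit shape for illustration.
type Hit = { source_id: string; text: string; score: number };

// Answer only from hits above a relevance threshold; otherwise refuse
// explicitly instead of letting the model fabricate an answer.
function groundedAnswer(hits: Hit[], minScore = 0.25): string {
  const relevant = hits.filter((h) => h.score >= minScore);
  if (relevant.length === 0) {
    return "I don't have anything relevant indexed for that question.";
  }
  // Surface every source_id so each fact can be traced back to a document.
  const cites = relevant.map((h) => `[${h.source_id}]`).join(' ');
  return `Answering from ${relevant.length} source(s): ${cites}`;
}
```

In the recipe itself this behavior comes from the system prompt rather than application code, but the same threshold-and-refuse pattern applies if you post-process tool results.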
