AI-Assisted Development

AI coding assistants work best when they have the right context. Rasa publishes machine-readable documentation files that you can feed to your AI assistant so it understands Rasa concepts, syntax, and best practices.


llms.txt

The Rasa docs publish an llms.txt file following the llms.txt convention. This is a machine-readable index of every documentation page, organized by section.

| File | Description |
| --- | --- |
| `llms.txt` | Hierarchical index of all docs with links and descriptions |
| `llms-full.txt` | Full content of every page in a single file |

Use llms.txt when your tool needs to discover available pages. Use llms-full.txt when you want to give an AI assistant access to the entire documentation at once.
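As a sketch of the discovery use case, the snippet below parses entries in the list format the llms.txt convention uses (`- [Title](url): description`). The sample text and URLs are illustrative, not the actual contents of Rasa's llms.txt:

```python
import re

def parse_llms_txt(text):
    """Extract (title, url, description) tuples from an llms.txt-style index.

    Entries follow the convention's list format:
    - [Title](url): optional description
    """
    pattern = re.compile(
        r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$"
    )
    entries = []
    for line in text.splitlines():
        m = pattern.match(line.strip())
        if m:
            entries.append((m.group("title"), m.group("url"), m.group("desc") or ""))
    return entries

# Hypothetical sample in the llms.txt list format
sample = """\
# Rasa Docs

## Concepts

- [Flows](https://example.com/docs/flows): Writing and configuring flows
- [Slots](https://example.com/docs/slots)
"""

for title, url, desc in parse_llms_txt(sample):
    print(title, url, desc)
```

A tool could feed the extracted URLs to an AI assistant so it fetches only the pages relevant to the current task.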


Documentation Modules

For more focused context, Rasa provides topic-specific documentation modules. These are self-contained markdown files covering a single area, built from the same source as the docs you're reading now. They're useful when you want to give your AI assistant deep knowledge of a specific topic without loading the entire documentation.

Core

These cover the fundamentals that most Rasa developers need:

| Module | Description |
| --- | --- |
| `flows.md` | Writing & configuring flows -- the core CALM building block |
| `slots-and-memory.md` | Slot types, mappings, and assistant memory |
| `responses.md` | Response syntax, variations, and the contextual rephraser |
| `actions.md` | Custom actions, MCP, A2A, action server SDK |
| `configuration.md` | Domain file, LLM config, and environment variables |
| `rasa-pro-overview.md` | Rasa Pro intro + tutorial |

Specialized

Add these based on what you're building:

| Module | Description |
| --- | --- |
| `patterns.md` | Conversation repair and deviation handling |
| `enterprise-search.md` | RAG pipelines and enterprise search |
| `testing.md` | E2E tests, assertions, coverage, and the Inspector |
| `voice.md` | Voice assistants with AudioCodes, Twilio, Jambonz, Genesys |
| `deployment.md` | Docker, Kubernetes, CI/CD, and load testing |
| `channels.md` | Slack, Messenger, Telegram, Twilio, and custom connectors |
| `rasa-studio-overview.md` | Rasa Studio intro + tutorial |

How to Use

Download a .md file and add it to your AI assistant's context. For example, in Cursor or Claude Code, you can reference it directly as a file in your project, or paste its contents into a conversation.
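If you work with several modules at once, one approach (our suggestion, not an official Rasa tool) is to concatenate the downloaded files into a single context file with a marker before each module, so the assistant can tell where one topic ends and the next begins. The file names below come from the module tables; the bundling helper itself is hypothetical:

```python
import tempfile
from pathlib import Path

def bundle_modules(module_paths, out_path):
    """Concatenate downloaded documentation modules into one context file,
    prefixing each module's content with a comment naming its source file."""
    parts = []
    for path in module_paths:
        path = Path(path)
        parts.append(f"<!-- module: {path.name} -->\n{path.read_text()}")
    Path(out_path).write_text("\n\n".join(parts))

# Demo with placeholder files standing in for downloaded modules
tmp = Path(tempfile.mkdtemp())
(tmp / "flows.md").write_text("# Flows\n...")
(tmp / "testing.md").write_text("# Testing\n...")
bundle_modules([tmp / "flows.md", tmp / "testing.md"], tmp / "context.md")
```

The resulting `context.md` can then be referenced as a single file in Cursor or Claude Code instead of attaching each module separately.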

A full index of all available modules is at llms/index.md.