Notice: This is the documentation for Rasa Open Source 2.x, which is no longer actively maintained. For up-to-date documentation, see the latest version (3.x).
Deploying a Rasa Open Source Assistant in Docker Compose
If you would like to deploy your assistant without Rasa X, you can do so by deploying it in Docker Compose. To deploy Rasa X and your assistant together, see the Recommended Deployment Methods.
Installing Docker
If you're not sure if you have Docker installed, you can check by running:
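The command itself is missing from this copy of the page; a typical way to check both tools at once is:

```shell
docker -v && docker-compose -v
```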
If Docker is installed on your machine, the output should show you your installed versions of Docker and Docker Compose. If the command doesn't work, you'll have to install Docker. See Docker Installation for details.
Configuring Channels
To run your AI assistant in production, don't forget to configure your required Messaging and Voice Channels in credentials.yml. For example, to add a REST channel, uncomment this section in the credentials.yml:
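The referenced section is omitted in this copy. In a default Rasa 2.x credentials.yml, the REST channel needs no credentials, so enabling it amounts to uncommenting its key:

```yaml
rest:
# the REST channel does not require any credentials;
# the bare key is enough to enable it
```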
The REST channel will open your bot up to incoming requests at the /webhooks/rest/webhook endpoint.
Using Docker Compose to Run Multiple Services
Docker Compose provides an easy way to run multiple containers together without having to run multiple commands or configure networks. This is essential when you want to deploy an assistant that also has an action server.
Start by creating a file called docker-compose.yml:
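The command is missing from this copy; on Linux or macOS you can create the empty file with:

```shell
touch docker-compose.yml
```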
Add the following content to the file:
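The file content itself is missing from this copy. A minimal sketch in the spirit of the Rasa 2.x docs follows; the Rasa image tag and the action server image name are placeholders you should replace with your own:

```yaml
version: "3.0"
services:
  rasa:
    image: rasa/rasa:2.8.1-full   # use the Rasa version you trained with
    ports:
      - 5005:5005                 # expose the Rasa server to the host
    volumes:
      - ./:/app                   # mount your project (models/, endpoints.yml, ...)
    command:
      - run
  app:
    image: <your-action-server-image>   # see Building an Action Server Image
    expose:
      - 5055                      # reachable by other services, not by the host
```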
The file starts with the version of the Docker Compose specification that you want to use. Each container is declared as a service within the docker-compose.yml. The first service is the rasa service, which runs your Rasa server.
To add the action server, add the image of your action server code. To learn how to deploy an action server image, see Building an Action Server Image.
The expose: 5055 is what allows the rasa service to reach the app service on that port.
To instruct the rasa service to send its action requests to that endpoint, add it to your endpoints.yml:
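The snippet is omitted in this copy. Assuming the action server service is named app as above and listens on the Rasa SDK default port 5055, the entry would look like:

```yaml
action_endpoint:
  url: "http://app:5055/webhook"   # "app" is the Docker Compose service name
```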
To run the services configured in your docker-compose.yml, execute:
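The referenced command is missing in this copy; the standard Docker Compose invocation is:

```shell
docker-compose up
```

Add -d to run the services in the background.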
You should then be able to interact with your bot via requests to port 5005, on the webhook endpoint that corresponds to a configured channel:
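As an example, assuming the REST channel configured earlier, a test message could be sent with curl (the sender id and message values are arbitrary):

```shell
curl -X POST http://localhost:5005/webhooks/rest/webhook \
  -H "Content-Type: application/json" \
  -d '{"sender": "test_user", "message": "Hi"}'
```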
Configuring a Tracker Store
By default, all conversations are saved in memory. This means that all conversations are lost as soon as you restart the Rasa server. If you want to persist your conversations, you can use a different Tracker Store.
To add a tracker store to a Docker Compose deployment, you need to add a new service to your docker-compose.yml and modify the endpoints.yml to add the new tracker store, pointing to your new service. More information about how to do so can be found in the Tracker Stores documentation.
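As a rough sketch (not the official snippet), a PostgreSQL-backed SQL tracker store could be wired up like this; the service name, image tag, and credentials below are illustrative assumptions:

```yaml
# docker-compose.yml: add a database service next to rasa and app
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_USER: rasa
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: rasa

# endpoints.yml: point the tracker store at that service
tracker_store:
  type: sql
  dialect: "postgresql"
  url: db            # the Docker Compose service name doubles as the hostname
  port: 5432
  username: rasa
  password: changeme
  db: rasa
```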