Version: 3.x

Deploying a Rasa Assistant in Docker Compose

Installing Docker

If you're not sure if you have Docker installed, you can check by running:

docker -v && docker-compose -v
# Docker version 18.09.2, build 6247962
# docker-compose version 1.23.2, build 1110ad01

If Docker is installed on your machine, the output should show you your installed versions of Docker and Docker Compose. If the command doesn't work, you'll have to install Docker. See Docker Installation for details.

Configuring Channels

To run your AI assistant in production, don't forget to configure your required Messaging and Voice Channels in credentials.yml. For example, to add a REST channel, uncomment this section in the credentials.yml:

rest:
# you don't need to provide anything here - this channel doesn't
# require any credentials

The REST channel will open your bot up to incoming requests at the /webhooks/rest/webhook endpoint.

Using Docker Compose to Run Multiple Services

Docker Compose provides an easy way to run multiple containers together without having to run multiple commands or configure networks. This is essential when you want to deploy an assistant that also has an action server.

Start by creating a file called docker-compose.yml:

touch docker-compose.yml

Add the following content to the file:

version: '3.0'
services:
  rasa:
    image: rasa/rasa:3.6.20-full
    ports:
      - 5005:5005
    volumes:
      - ./:/app
    command:
      - run

The file starts with the version of the Docker Compose specification that you want to use. Each container is declared as a service within the docker-compose.yml. The first service is the rasa service, which runs your Rasa server.
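The rasa service mounts your project directory into the container and expects a trained model in the ./models folder when it starts with the run command. If you don't have a trained model yet, one way to produce it is to run a one-off training container. This is a quick sketch, assuming your config, domain, and training data live in the current directory:

docker run -v $(pwd):/app rasa/rasa:3.6.20-full train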

To add the action server, add a second service that runs your action server image. To learn how to deploy an action server image, see Building an Action Server Image.

version: '3.0'
services:
  rasa:
    image: rasa/rasa:3.6.20-full
    ports:
      - 5005:5005
    volumes:
      - ./:/app
    command:
      - run
  app:
    image: <image:tag>
    expose:
      - "5055"

The expose: 5055 entry is what allows the rasa service to reach the app service on that port. To instruct the rasa service to send its action requests to that endpoint, add it to your endpoints.yml:

action_endpoint:
  url: http://app:5055/webhook

To run the services configured in your docker-compose.yml, execute:

docker-compose up
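This starts the containers in the foreground and streams their logs to your terminal. To run them in the background instead, add the -d flag:

docker-compose up -d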

You should then be able to interact with your bot via requests to port 5005, on the webhook endpoint that corresponds to a configured channel:

curl -XPOST http://localhost:5005/webhooks/rest/webhook \
-H "Content-type: application/json" \
-d '{"sender": "test", "message": "hello"}'

Configuring a Tracker Store

By default, all conversations are saved in memory. This means that all conversations are lost as soon as you restart the Rasa server. If you want to persist your conversations, you can use a different Tracker Store.

To add a tracker store to a Docker Compose deployment, you need to add a new service to your docker-compose.yml and modify the endpoints.yml to add the new tracker store, pointing to your new service. More information about how to do so can be found in the tracker store documentation.
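As an illustration only, here is a minimal sketch of a SQL tracker store backed by PostgreSQL; the image tag, database name, and credentials below are placeholder assumptions. First, add a postgres service to your docker-compose.yml:

  postgres:
    image: postgres:14
    environment:
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: password
      POSTGRES_DB: rasa
    expose:
      - "5432"

Then point the tracker store in your endpoints.yml at that service:

tracker_store:
  type: SQL
  dialect: "postgresql"
  url: postgres
  port: 5432
  db: rasa
  username: admin
  password: password

As with the action server, the rasa service reaches the postgres service by its Compose service name.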