
September 4th, 2018

Migrate Your Existing Google DialogFlow Assistant to Rasa

Justina Petraitytė

A number of Rasa users have migrated from hosted chatbot development platforms like LUIS and Wit.ai, but the platform we see the highest percentage of users migrating from is DialogFlow (formerly Api.ai). The trend is clear: more and more developers start building their assistants on hosted platforms like DialogFlow, but eventually they run into development limitations and decide to switch to an open source solution - Rasa.

Here are the main reasons why developers choose Rasa over the hosted platforms:

  • It's open source.

    • You can customize everything. You can tune the models, integrate new components or even customize the entire framework to best suit your needs.
    • You own your data. You can choose where you want to host your applications and you have a full ownership over who can access the data of your application.
  • Machine learning-powered dialogue management.

    • The assistant uses machine learning to learn the patterns from real conversations. That means no predefined rules, and no state machines.
    • You don't need massive amounts of training data to get started. You can build a scalable assistant bootstrapping from a very small sample of training data.

Migration from DialogFlow to Rasa is one of the most common requests from the Rasa community. This post is going to cover a step-by-step process of migrating an existing DialogFlow assistant to Rasa.

You can find the code of the example assistant used in this tutorial here.

Outline

  1. The DialogFlow assistant
  2. Step 1: Exporting the DialogFlow assistant
  3. Step 2: Training the Rasa NLU model using exported data
  4. Step 3: Migrating contexts and training Rasa Core model
  5. Step 4: Your DialogFlow assistant now runs on Rasa. What's next?
  6. Useful resources

The DialogFlow Assistant

As we said, the goal of this post is to show a step-by-step process for migrating an existing DialogFlow assistant to Rasa.

In order to illustrate the process, we are going to use an example - a custom-built search assistant called 'place_finder', capable of looking for places like restaurants, shops, and banks within a user-specified radius, and providing details about the returned place's address, rating, and opening hours. The assistant gets all these details by using a webhook to make API calls to the Google Places API.

Below is an example conversation with the place_finder assistant:

(Screenshot: an example conversation with the place_finder assistant)

If you want to replicate this tutorial, you can find the data files and the webhook code of the 'place_finder' assistant inside the dialogflow-assistant directory of this tutorial's repository. Alternatively, you can follow this guide using your own custom assistant and adjust the steps below to suit your case.

Whichever option you choose, at this point you should have your DialogFlow assistant - now it's time to migrate it to Rasa!

Tip: If you want to play around with the place_finder assistant, you should get a Google Places API key and place it inside the ga_credentials.yml file of this repository.
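If you are curious how such a credentials file could be read in code, below is a minimal sketch in Python. The field name google_api_key is a placeholder assumption - check the actual ga_credentials.yml in the repository for the exact key name.

Python
# A minimal sketch of reading the Google Places API key from ga_credentials.yml.
# The field name "google_api_key" is an assumption - check the file in the repo.
import yaml  # requires PyYAML

with open("ga_credentials.yml") as credentials_file:
    credentials = yaml.safe_load(credentials_file)

google_api_key = credentials["google_api_key"]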

Step 1: Exporting the DialogFlow Assistant

We recommend starting by migrating the NLU part of the DialogFlow assistant.

To migrate it to Rasa, all you have to do is export the project files and use them to train the Rasa NLU model - no manual changes to the data or its format are needed. It's designed to be as easy as possible. Here's how.

  1. You can export the project data by navigating to the settings tab of your agent and choosing the 'Export and Import' tab.
  2. On the 'Export and Import' tab, choose the option 'Export as ZIP'. This will export the project data as a zip file and download it to your local machine.
  3. Once downloaded, you can unzip the package and inspect the files inside. The DialogFlow project directory consists of the following files and directories:
  • entities - a directory containing JSON files of the entities your DialogFlow assistant was trained to extract
  • intents - a directory containing JSON files of the intents your DialogFlow assistant was trained to understand
  • agent.json - a file containing the configuration of the DialogFlow assistant
  • package.json - a file containing information about the software used to build the assistant

If you follow this tutorial step-by-step, you will find the exported DialogFlow data files inside the rasa-assistant/data/place_finder directory of the repo.

You can use these files to train the Rasa NLU model. You only need to run a conversion script to convert the exported data into a Rasa-compatible format, define an NLU pipeline configuration file, and pass both to the Rasa NLU train script. Step 2 below covers this in detail.

Step 2: Training the Rasa NLU model using exported data

Rasa NLU allows full customization of the models. Before you train the NLU model, you have to define a processing pipeline configuration. The pipeline defines how training examples are parsed and how features are extracted. The configuration has to be saved as a .yml file inside your project directory. Below is an example pipeline configuration which you can use:

File: rasa-assistant/config.yml

YAML
language: "en"
pipeline:
- name: "SpacyNLP" # loads the spacy language model
- name: "SpacyTokenizer" # splits the sentence into tokens
- name: "CRFEntityExtractor" # uses the pretrained spacy NER model
- name: "SpacyFeaturizer" # creates sentence vector representations
- name: "SklearnIntentClassifier" # defines the classifier
- name: DucklingHTTPExtractor # uses duckling to parse the numbers
url: http://localhost:8000
dimensions:
- number

Once you have defined the pipeline, it's time to train the model:

  1. First, convert the exported data into a Rasa-compatible format by running the command below. This command will take the exported data files, convert them to the Rasa markdown (md) format, and write the result to the data/nlu.md file:
Shell
rasa data convert nlu --data data/place_finder --out data/nlu.md --format md
  2. To train the model, use the following command, which calls the Rasa NLU train function, loads the pipeline configuration and training data files, and saves the trained model inside the models directory:
Shell
rasa train nlu
  3. You can test the trained model by loading it in your command line and trying it out on various inputs. To do that, run:
Shell
rasa shell nlu

The response of the Rasa NLU model includes the results of intent classification and entity extraction. For example, the message 'Hello' was classified as the intent 'Default Welcome Intent' with a confidence of 0.83. Here is the full response returned by the NLU model:

json
{
  "intent": {
    "name": "Default Welcome Intent",
    "confidence": 0.8342492802420313
  },
  "entities": [],
  "intent_ranking": [
    {
      "name": "Default Welcome Intent",
      "confidence": 0.8342492802420313
    },
    {
      "name": "thanks",
      "confidence": 0.09805256693160122
    },
    {
      "name": "goodbye",
      "confidence": 0.05392708191432759
    },
    {
      "name": "address",
      "confidence": 0.003986386948676723
    },
    {
      "name": "place_search",
      "confidence": 0.0037102872949153686
    },
    {
      "name": "rating",
      "confidence": 0.003059348479049656
    },
    {
      "name": "opening_hours",
      "confidence": 0.0030150481893980153
    }
  ],
  "text": "Hello"
}
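If you prefer to test the model programmatically rather than through the shell, you can also serve it over HTTP with rasa run --enable-api and query the parse endpoint. Below is a minimal sketch; port 5005 is the Rasa server default, and the message text is just an example.

Python
# A minimal sketch: parse a message with the Rasa HTTP API.
# Assumes the model is served with `rasa run --enable-api` on localhost:5005.
import requests

response = requests.post(
    "http://localhost:5005/model/parse",
    json={"text": "Hello"},
)

result = response.json()
print(result["intent"]["name"], result["intent"]["confidence"])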

And that's it! You have just migrated the NLU part of the DialogFlow assistant to Rasa!

Step 3: Migrating contexts and training the Rasa Core model

Note: If your custom DialogFlow assistant doesn't use any contexts or webhooks, you can skip this part and go straight to Step 4 of this tutorial.

If it does, then follow the steps below.

  1. To migrate the remaining part of the assistant - context handling and custom actions - you need some training data.

    DialogFlow performs dialogue management through the concept of 'contexts', while Rasa uses machine learning to predict what action the assistant should take at a specific state of the conversation, based on previous actions and extracted details.

    This means that in order to train the Rasa dialogue model, you need some training data in the form of stories. Since DialogFlow's dialogue management is rule-based, you cannot export any training data that could be used directly to train the Rasa dialogue model.

    The good news is that you have access to the conversation history on DialogFlow, and you can use it as a basis for generating training data for the Rasa Core model.

    Here is an example conversation on DialogFlow:

    (Screenshot: an example conversation in the DialogFlow console)

    In order to convert this conversation into a Rasa Core training story, you have to convert the user inputs into the corresponding intents and entities, while the responses of the agent are expressed as actions. Here is how the above conversation would look as a Rasa story:

    File: rasa-assistant/data/stories.md

    markdown
    ## story_01
    * Default Welcome Intent
      - utter_greet
    * place_search{"query":"restaurant", "radius":"600"}
      - action_place_search
      - slot{"place_match":"one"}
      - slot{"address":"Ars Vini in Sredzkistrasse 27, Hagenauer Strasse 1, Berlin"}
      - slot{"rating":"4.4"}
      - slot{"opening_hours":"true"}
    * opening_hours
      - utter_opening_hours
    * rating
      - utter_rating

    To train the model you will need some additional stories representing different conversation turns.

  2. Create the domain of your assistant which defines all intents, entities, slots, templates, and actions the assistant should be familiar with. For example, the templates of the place_finder assistant look like this:

    File: domain.yml

    YAML
    templates:
      utter_greet:
      - text: "Hello! How can I help?"
      utter_goodbye:
      - text: "Talk to you later!"
      utter_thanks:
      - text: "You are very welcome."
      utter_what_radius:
      - text: "Within what radius?"
      utter_rating:
      - text: "The rating is {rating}."
      utter_address:
      - text: "The address is {address}."
      utter_opening_hours:
      - text: "The place is currently {opening_hours}."
      utter_no_results:
      - text: "Sorry, I couldn't find anything."

    Tip: All intents and entities defined in the domain file must match the names defined in training examples.

  3. Define custom actions. While we can write simple text responses inside the domain file (just like in the domain snippet above), more complicated actions, like making an API call or connecting to a database to retrieve some data, should be wrapped in a custom action class. In DialogFlow, place_finder used a webhook to retrieve data from the Google Places API, so to migrate it to Rasa, you should turn that webhook into a custom action class like the following:

    File: rasa-assistant/actions.py

    This class assigns a name to the custom action, makes the API call, retrieves the requested data, generates a response to send back to the user, and sets the details which should be kept throughout the conversation as slots.
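    The snippet below is only a sketch of what such a class could look like, not the repository's exact implementation: the Google Places request parameters, the fixed location, and the slot handling are assumptions based on the story and domain snippets above, so check actions.py in the repository for the real code.

    Python
    # A minimal sketch of a custom action that calls the Google Places API.
    # The request parameters, fixed location, and slot handling are assumptions.
    from typing import Any, Dict, List, Text

    import requests
    from rasa_sdk import Action, Tracker
    from rasa_sdk.events import SlotSet
    from rasa_sdk.executor import CollectingDispatcher

    PLACES_URL = "https://maps.googleapis.com/maps/api/place/nearbysearch/json"

    class ActionPlaceSearch(Action):
        def name(self) -> Text:
            # has to match the action name used in the stories and the domain
            return "action_place_search"

        def run(
            self,
            dispatcher: CollectingDispatcher,
            tracker: Tracker,
            domain: Dict[Text, Any],
        ) -> List[Dict[Text, Any]]:
            # read the details extracted by the NLU model (assumes matching slots exist)
            query = tracker.get_slot("query")
            radius = tracker.get_slot("radius")

            # call the Google Places API (hypothetical parameters)
            results = requests.get(
                PLACES_URL,
                params={
                    "keyword": query,
                    "radius": radius,
                    "location": "52.5304,13.4139",  # assumed fixed location
                    "key": "YOUR_GOOGLE_PLACES_API_KEY",
                },
            ).json().get("results", [])

            if not results:
                dispatcher.utter_message(text="Sorry, I couldn't find anything.")
                return [SlotSet("place_match", "none")]

            place = results[0]
            dispatcher.utter_message(text="How about {}?".format(place["name"]))

            # keep the details around as slots for follow-up questions
            return [
                SlotSet("place_match", "one"),
                SlotSet("address", place.get("vicinity")),
                SlotSet("rating", str(place.get("rating"))),
                SlotSet("opening_hours", str(place.get("opening_hours", {}).get("open_now"))),
            ]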

  4. That's all you need to train the Rasa Core model which will predict how the assistant should respond to user inputs. Now it's time to train it!

    You can train it using the following command, which will train both the NLU and dialogue models and store them as one compressed file inside the models directory:

    Shell
    rasa train
  5. And this is it: you have successfully migrated an assistant from DialogFlow to Rasa! You can now test it locally. Running the command below starts the custom action server:

    Shell
    rasa run actions

    Now you can load the agent using the command below, which will load both the Rasa NLU and Rasa Core models and launch the assistant in the console for you to chat with:

    Shell
    rasa shell

    Note: Since we are using DucklingHTTPExtractor as one of the pipeline components, make sure to start a Duckling server for it by running:

    Shell
    docker run -p 8000:8000 rasa/duckling

Step 4: Your DialogFlow assistant is now running on Rasa. What's next?

The sky's the limit for what you can do with Rasa-powered assistants. You can customize the models, integrate additional components, or connect your assistant to the outside world using connectors for the most popular messaging platforms. You can even hook it up to other cool frameworks and tools to make it even more fun!
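For example, one of the simplest ways to talk to a running assistant from another application is Rasa's REST channel: list rest: in your credentials.yml, start the server with rasa run, and send messages over HTTP. Below is a minimal sketch; the sender id and message text are arbitrary examples.

Python
# A minimal sketch: send a message to the assistant through Rasa's REST channel.
# Assumes `rest:` is listed in credentials.yml and the server runs on localhost:5005.
import requests

response = requests.post(
    "http://localhost:5005/webhooks/rest/webhook",
    json={"sender": "test_user", "message": "Hello"},
)

# the response is a list of bot messages addressed to this sender
for bot_message in response.json():
    print(bot_message.get("text"))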

If you have migrated your assistant to Rasa, we would love to learn about your experience! Join the Rasa Community Forum and let us know!

Useful resources: