Installation Prerequisites for Assertions
Learn about the installation prerequisites for using assertions in your end-to-end tests.
Installation and Configuration Prerequisites
To use assertions in your end-to-end tests, please install the rasa-pro package and use a valid Rasa Pro license key.
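A minimal sketch, assuming you install from PyPI and that the license key is supplied through the RASA_PRO_LICENSE environment variable:

```bash
# Install the Rasa Pro package
pip install rasa-pro

# Make the license key available to Rasa Pro (assumed environment variable name)
export RASA_PRO_LICENSE=<your-license-key>
```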
Optional Dependency
To evaluate generative assistant responses for relevance and factual accuracy in your end-to-end tests, please install the optional dependency mlflow.
This dependency uses LLM (Large Language Model) evaluation to assess the relevance and factual accuracy of the Rasa Pro assistant's generative responses. This LLM is also referred to as an "LLM-as-Judge" model because it assesses another model's output. In Rasa Pro's use case, the LLM-as-Judge model evaluates whether the generative response is relevant to the provided input, or whether it is factually accurate with respect to the provided or extracted ground truth text.
You can install the dependency using the following command:
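A minimal example, assuming the dependency is exposed as a package extra named mlflow:

```bash
# Install Rasa Pro together with the optional mlflow extra used for LLM-based evaluation
pip install "rasa-pro[mlflow]"
```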
Generative Response LLM Judge Configuration
Info: Rasa Pro 3.10 supports only OpenAI models for the LLM Judge model.
By default, the LLM Judge model is configured to use the OpenAI gpt-4o-mini model to benefit from its long context window. If you want to use a different model, you can configure the LLM Judge model in the conftest.yml file, a new testing configuration file added in Rasa Pro 3.10. Rasa Pro discovers this file automatically as long as it is placed in the root directory of your assistant project.
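A hedged sketch of such an override; the llm_as_judge key and its fields are assumptions rather than the documented schema, so consult the Rasa Pro reference for the exact keys:

```yaml
# conftest.yml (in the assistant project root) -- hypothetical LLM Judge override
llm_as_judge:
  api_type: openai   # Rasa Pro 3.10 supports only OpenAI models for the judge
  model: gpt-4o      # replaces the default gpt-4o-mini
```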
Environment Variables
To enable the feature, please set the environment variable RASA_PRO_BETA_E2E_ASSERTIONS to true in your testing environment.
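For example, in a shell-based testing environment:

```bash
# Opt in to the beta end-to-end assertions feature
export RASA_PRO_BETA_E2E_ASSERTIONS=true
```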