The Rasa Blog
Product Updates
Check out our latest product updates, including new features, changes to our product strategy, and deep dives into existing product capabilities.
July 22nd, 2020
GPT-3: Careful First Impressions
Vincent Warmerdam
If you've been following NLP Twitter recently, you've probably noticed that people have been talking about this new tool called GPT-3 from OpenAI. It's a big model with 175 billion parameters, and it's considered a milestone due to the quality of the text it can generate.
July 9th, 2020
Introducing Rasa NLU Examples
Vincent Warmerdam
Conversational AI is an experimental field. This is partly because the field is relatively new, but also because no two chatbots are alike.
May 11th, 2020
New: Improvements to the NLU Inbox
Rasa
With Rasa X 0.27.0 and 0.28.0, we’ve released improvements to the NLU Inbox that make it easier to review and annotate incoming messages.
May 7th, 2020
Conversation-Driven Development
Alan Nichol
Building a common vocabulary for the process of listening to your users and using those insights to improve your AI assistant.
March 12th, 2020
Reviewing Conversations in Rasa X 0.26
Stacee Ballback
New features in Rasa X 0.26, including conversation tagging and additional filters, make it easier than ever to review conversations and improve your AI assistant with Rasa X.
March 11th, 2020
How to Migrate Your Assistant to Rasa X (the Easy Way)
Karen White
Learn how to get started using Rasa X when you already have an assistant running in production, using the rasa export CLI command released with Rasa Open Source 1.8.0.
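For readers who want to try it before reading the full post, the command takes roughly this shape (flags shown here are illustrative; check `rasa export --help` in your installed version):

```shell
# Stream tracker store events to an event broker (Rasa Open Source 1.8.0+).
# endpoints.yml tells the command where to read conversations from
# and where to publish them.
rasa export --endpoints endpoints.yml
```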
March 9th, 2020
Introducing DIET: state-of-the-art architecture that outperforms fine-tuning BERT and is 6X faster to train
Mady Mantha
With Rasa 1.8, our research team is releasing a new state-of-the-art lightweight, multitask transformer architecture for NLU: Dual Intent and Entity Transformer (DIET).
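As a rough sketch of how DIET slots into an NLU pipeline (component names follow the Rasa 1.8 docs; the epoch count is a placeholder you should tune):

```yaml
# config.yml (illustrative): DIET consumes features from upstream featurizers
pipeline:
  - name: WhitespaceTokenizer
  - name: CountVectorsFeaturizer
  - name: DIETClassifier   # predicts intents and entities jointly
    epochs: 100            # placeholder value; tune for your dataset
```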
March 6th, 2020
Connect in Fewer Steps: What’s New with Integrated Version Control
Rasa
With Rasa X 0.26.0, we’ve added new features to Integrated Version Control that simplify the setup process and give teams greater visibility into changes.
February 28th, 2020
Using Conversation Tags to Measure Carbon bot’s Success Rate
Alan Nichol
Learn how we're using conversation tags to track Carbon bot's success rate. We'll also preview what’s next for conversation tags in the upcoming 0.26.0 release of Rasa X.
February 21st, 2020
Unpacking the TED Policy in Rasa Open Source
Karen White
We’ll explore dialogue management by taking a close look at one of the machine learning policies used in Rasa Open Source: The Transformer Embedding Dialogue Policy, or TED.
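For context, TED is enabled like any other dialogue policy in the assistant's config file (a minimal sketch; option values here are illustrative, not recommendations):

```yaml
# config.yml (illustrative): using TED for dialogue management
policies:
  - name: TEDPolicy
    max_history: 5   # how many past dialogue turns the transformer considers
    epochs: 100      # placeholder value; tune for your dataset
```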
January 28th, 2020
Train on Larger Datasets Using Less Memory with Sparse Features
Tanja Bunk
With Rasa 1.6.0, we released sparse features for the NLU pipeline. In this short blog post, we explain what sparse features are and show that we can now train on larger datasets using less memory.
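No pipeline changes are needed to benefit: featurizers such as the count-vectors featurizer emit sparse features in a setup along these lines (a minimal sketch, assuming a standard supervised-embeddings-style pipeline):

```yaml
# config.yml (illustrative): count vectors are stored sparsely as of Rasa 1.6
pipeline:
  - name: WhitespaceTokenizer
  - name: CountVectorsFeaturizer     # emits sparse count vectors
  - name: EmbeddingIntentClassifier
```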
January 16th, 2020
Integrated Version Control: Linking Rasa X with Git-based Development Workflows
Rasa
As of Rasa X version 0.23.0, we’ve added a new feature that allows developers to version training data by connecting Rasa X with a Git repository on a remote server.
December 17th, 2019
Rasa Open Source + Rasa X: Better Together
Rasa
We recently launched Rasa X, a free toolset that helps you quickly iterate on and improve the quality of your contextual assistant built using Rasa Open Source.
December 2nd, 2019
NLP vs. NLU: What's the Difference and Why Does it Matter?
Rasa
The terms NLP and NLU are often used interchangeably, but they have slightly different meanings. Learn the difference between natural language processing and natural language understanding and why they're important for successful conversational applications.
November 26th, 2019
You May Not Need to Fine-tune: ConveRT Featurizer Makes Sentence Representations More Efficient
Daksh Varshneya
Introducing a new featurizer based on a recently proposed sentence encoding model, ConveRT. We explain how you can use it in Rasa to get very strong performance with a model that trains in minutes on a CPU.
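In pipeline terms, the new featurizer sits between tokenization and intent classification; a minimal sketch (component names per the Rasa docs of that era; verify against your version):

```yaml
# config.yml (illustrative): pretrained ConveRT sentence encodings, no fine-tuning
pipeline:
  - name: ConveRTTokenizer    # tokenizer required by the ConveRT featurizer
  - name: ConveRTFeaturizer   # dense sentence representations from ConveRT
  - name: EmbeddingIntentClassifier
```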