Building a bot using Rasa NLU & Core

Akhil C K
6 min read · May 11, 2021

How about building your own assistant that replies to the conversations it has been trained on? Interesting?

Okay, let’s learn how to build one.

In this blog post, we’ll build a conversational bot on the Rasa stack that interacts with you and can tell a random joke when asked.

Requirements

Rasa Stack: A set of open source machine learning tools for developers to create contextual AI assistants and chatbots. It is the leading open source toolkit for letting developers expand bots beyond answering simple questions, with minimal training data. It has two frameworks: NLU and Core.

Fig 1. rasa-stack

Rasa NLU: A framework for natural language understanding that performs intent classification and entity extraction. It helps the chatbot understand what the user is saying by analysing the intent.

Rasa Core: A chatbot framework with machine learning-based dialogue management that predicts the next action based on the input from NLU, the conversation context, and the training data (conversational stories).

Language model: Used to analyse incoming text messages and extract the necessary information. We will use the spaCy language model.

Installations

We need to install Rasa NLU, Rasa Core and a spaCy language model, using pip:

#NLU
python -m pip install rasa_nlu[spacy]
#Core
python -m pip install -U rasa_core==0.9.6
#Language Model
python -m spacy download en_core_web_md
python -m spacy link en_core_web_md en --force

Preparing Training Data

Training data consists of a list of text messages that one expects to receive from users. This data is labelled with the intent and entities that Rasa NLU should learn to extract. Let us look into the concepts of intent and entities with an example.

  • Intent: The intent describes the class/category of a message. For example, in our bot, the sentence “Tell me a joke” has the joke intent.
  • Entity: Pieces of information that help the chatbot understand what specifically a user is asking about, by recognising structured data in the message. For example, to act on “Send invitation to abc@xyz.com”, the bot should extract the email entity (abc@xyz.com).
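To illustrate the idea outside of Rasa: a hand-written entity extractor for emails could be a simple regular expression. Rasa NLU learns such extractions from labelled examples instead of hard-coded rules, so the snippet below is only a toy illustration of what “extracting an entity” means:

```python
import re

def extract_email(message):
    """Toy entity extractor: find an email address in a message.

    Rasa NLU learns entity extraction from labelled training data;
    this hard-coded regex only illustrates the concept.
    """
    match = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", message)
    return {"entity": "email", "value": match.group(0)} if match else None

print(extract_email("Send invitation to abc@xyz.com"))
# {'entity': 'email', 'value': 'abc@xyz.com'}
```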

Here is a portion of the training data. We can also add some spelling mistakes and slang, since that makes the chatbot sound more human. We will save this under data/nlu_data.md:

## intent:greet
- Hi
- Hey
- Hi bot
- Hey bot
- Hello
## intent:thanks
- Thanks
- Thank you
- Thank you so much
- Thanks bot
- Thanks for that
- cheers
- cheers bro
## intent:name
- My name is [Juste](name)
- I am [Josh](name)
- I'm [Lucy](name)
- People call me [Greg](name)
- It's [David](name)
- Usually people call me [Amy](name)
- My name is [John](name)
- You can call me [Sam](name)
- Please call me [Linda](name)
## intent:joke
- Can you tell me a joke
- I would like to hear a joke
- Tell me a joke
- A joke please
- Tell me a joke please
- I would like to hear a joke
- I would loke to hear a joke, please
- Can you tell jokes?
- Please tell me a joke
- I need to hear a joke
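The `## intent:<name>` headers and `- example` lines are all the structure Rasa NLU needs. As a quick sanity check, a few lines of Python can list the intents declared in such a file (shown here on an inline snippet; `data/nlu_data.md` would be the real input):

```python
training_md = """\
## intent:greet
- Hi
- Hello
## intent:joke
- Tell me a joke
- A joke please
"""

def list_intents(md_text):
    # Intent blocks start with a '## intent:<name>' header line.
    return [line.split("intent:", 1)[1].strip()
            for line in md_text.splitlines()
            if line.startswith("## intent:")]

print(list_intents(training_md))  # ['greet', 'joke']
```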

Furthermore, we need a configuration file, nlu_config.yml, for the NLU model:

language: "en"
pipeline: spacy_sklearn

We also need a Makefile. It contains a set of directives used by the make build automation tool to generate a target. Add the content below to the Makefile (note that recipe lines must be indented with a tab):

.PHONY: clean train-nlu train-core cmdline server
TEST_PATH=./
help:
	@echo "    clean"
	@echo "        Remove python artifacts and build artifacts."
	@echo "    train-nlu"
	@echo "        Trains a new NLU model using the project's Rasa NLU config"
	@echo "    train-core"
	@echo "        Trains a new dialogue model using the story training data"
	@echo "    action-server"
	@echo "        Starts the server for custom actions."
	@echo "    cmdline"
	@echo "        Loads the assistant in your terminal for you to chat."
clean:
	find . -name '*.pyc' -exec rm -f {} +
	find . -name '*.pyo' -exec rm -f {} +
	find . -name '*~' -exec rm -f {} +
	rm -rf build/
	rm -rf dist/
	rm -rf *.egg-info
	rm -rf docs/_build
train-nlu:
	python -m rasa_nlu.train -c nlu_config.yml --data data/nlu_data.md -o models --fixed_model_name nlu --project current --verbose
train-core:
	python -m rasa_core.train -d domain.yml -s data/stories.md -o models/current/dialogue -c policies.yml
cmdline:
	python -m rasa_core.run -d models/current/dialogue -u models/current/nlu --endpoints endpoints.yml
action-server:
	python -m rasa_core_sdk.endpoint --actions actions

We can now train the NLU model using our training data (make sure Rasa NLU and spaCy are installed). To start the training, run:

make train-nlu
Fig 2. training-nlu

The trained model files will be stored at ./models/current/nlu.
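Once trained, the NLU model turns raw text into structured output. The exact fields vary by pipeline, but for the spacy_sklearn setup the parse result is shaped roughly like the dictionary below (an illustrative sketch with made-up confidence values, not captured output):

```python
# Approximate shape of a Rasa NLU parse result for "Tell me a joke".
# Confidence values are invented for illustration.
parse_result = {
    "text": "Tell me a joke",
    "intent": {"name": "joke", "confidence": 0.93},
    "entities": [],
    "intent_ranking": [
        {"name": "joke", "confidence": 0.93},
        {"name": "greet", "confidence": 0.03},
    ],
}

print(parse_result["intent"]["name"])  # joke
```

Rasa Core consumes this structure (the predicted intent plus any entities) to decide what the bot should do next.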

Rasa Core

After the NLU training, the bot is capable of understanding what the user is saying. The next step is to make the chatbot respond to messages; in our case, that means fetching a random joke through an API and giving it to the user. We will teach the bot to respond by training a dialogue model using Rasa Core.

Writing Stories

The training data for dialogue models is called stories. A story is an actual piece of conversation that takes place between a user and the bot. The user’s inputs are expressed as intents (with their corresponding entities), and the bot’s responses are expressed as actions.

Here is a portion of the training data that I have prepared; we will store this under data/stories.md:

## story_goodbye
* goodbye
- utter_goodbye
## story_thanks
* thanks
- utter_thanks
## story_name
* name{"name":"Sam"}
- utter_greet
## story_joke_01
* joke
- action_joke
## story_joke_02
* greet
- utter_name
* name{"name":"Lucy"}
- utter_greet
* joke
- action_joke
* thanks
- utter_thanks
* goodbye
- utter_goodbye
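Conceptually, the dialogue model learns a mapping from conversation state to the bot’s next action. Stripped of the machine learning and context handling, the happy path of story_joke_02 behaves like the lookup below (a toy illustration of the idea, not Rasa’s actual policy):

```python
# Toy "policy": map the latest user intent to the bot's next action,
# mirroring story_joke_02 above. Rasa Core learns this mapping
# (including conversation context) from the stories instead of
# hard-coding it.
next_action = {
    "greet": "utter_name",
    "name": "utter_greet",
    "joke": "action_joke",
    "thanks": "utter_thanks",
    "goodbye": "utter_goodbye",
}

conversation = ["greet", "name", "joke", "thanks", "goodbye"]
actions = [next_action[intent] for intent in conversation]
print(actions)
# ['utter_name', 'utter_greet', 'action_joke', 'utter_thanks', 'utter_goodbye']
```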

Defining the Domain

The domain defines what user inputs the bot should expect, what actions it should predict, how to respond, and what to store. It consists of intents, entities, slots, actions and templates. We have already discussed intents and entities; let’s understand the others.

  • slots: placeholders for values that enable the chatbot to keep track of the conversation.
  • actions: the things the chatbot can say or do.
  • templates: template texts for the bot’s responses.

Next, we’ll define the domain in domain.yml. Here is an example domain for our bot:

intents:
- greet
- goodbye
- thanks
- deny
- joke
- name

entities:
- name

slots:
  name:
    type: text

actions:
- utter_name
- utter_thanks
- utter_greet
- utter_goodbye
- action_joke

templates:
  utter_name:
  - text: "Hey there! Tell me your name."
  utter_greet:
  - text: "Nice to meet you {name}. How can I help?"
  utter_goodbye:
  - text: "Talk to you later!"
  utter_thanks:
  - text: "My pleasure."
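Rasa fills the `{name}` placeholder in utter_greet from the slot of the same name, which is in turn set from the extracted name entity. The substitution itself works like plain Python string formatting, roughly:

```python
# Response templates, mirroring the domain above.
templates = {
    "utter_greet": "Nice to meet you {name}. How can I help?",
    "utter_goodbye": "Talk to you later!",
}

# Slots tracked during the conversation; Rasa Core fills them from
# extracted entities (here, the 'name' entity sets the 'name' slot).
slots = {"name": "Sam"}

print(templates["utter_greet"].format(**slots))
# Nice to meet you Sam. How can I help?
```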

Actions

Since we want our bot to make an API call to retrieve random jokes, we need to create a custom action for this purpose. Add the code below to actions.py:

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import json
import logging

import requests
from rasa_core_sdk import Action

logger = logging.getLogger(__name__)


class ActionJoke(Action):
    def name(self):
        return "action_joke"

    def run(self, dispatcher, tracker, domain):
        # Fetch a random joke and send its text back to the user.
        request = json.loads(requests.get('https://api.chucknorris.io/jokes/random').text)
        joke = request['value']
        dispatcher.utter_message(joke)
        return []
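The api.chucknorris.io endpoint returns a JSON object whose `value` field holds the joke text, which is all the action needs. With a canned payload (network call omitted), the extraction step looks like this; the `id` value below is made up for illustration:

```python
import json

# Canned response in the shape api.chucknorris.io returns; the real
# payload also carries fields such as 'url' and 'icon_url'.
raw = '{"id": "abc123", "value": "A sample joke."}'

payload = json.loads(raw)
joke = payload["value"]
print(joke)  # A sample joke.
```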

After preparing the training data, it is time to train Rasa Core. To start the training, run:

make train-core
Fig 3. training-core

The trained model will be saved at models/current/dialogue.

Now it is time to chat with our bot. I will check one simple conversational story that the bot has been trained to respond to. To start the conversation on the command line, run:

make cmdline
Fig 4. cmdline

Conclusion

In this blog post we created a bot that is capable of listening to the user’s input and responding contextually. We used Rasa NLU and Rasa Core to build it with minimal training data. Now it is your turn! Start creating a bot for your own use case with the Rasa stack.
