This is an example chatbot demonstrating how to build AI assistants for financial services and banking. This starter pack can be used as a base for your own development or as a reference guide for implementing common banking-industry features with Rasa. It includes pre-built intents, actions, and stories for handling conversation flows like checking spending history and transferring money to another account.
Run:
pip install -r requirements.txt
To install development dependencies:
pip install -r requirements-dev.txt
pre-commit install
With pre-commit installed, the `black` and `doctoc` hooks will run on every `git commit`. If the hooks make any changes, you will need to re-add the changed files and re-commit your changes.
Use `rasa train` to train a model.
Then, to run, first set up your action server in one terminal window:
rasa run actions
In another window, run the duckling server (for entity extraction):
docker run -p 8000:8000 rasa/duckling
Then to talk to the bot, run:
rasa shell --debug
Note that `--debug` mode will produce a lot of output meant to help you understand how the bot is working under the hood. To simply talk to the bot, you can remove this flag.
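Once duckling is up, you can optionally confirm it is reachable and extracting entities before talking to the bot. This is just a quick sanity check, not part of the repo; the example text and locale are arbitrary.

```python
# Optional sanity check: ask the duckling server to parse a sample sentence.
# Each parsed entity in the response includes its dimension (e.g.
# amount-of-money, time) and the matched text.
import requests

response = requests.post(
    "http://localhost:8000/parse",
    data={"locale": "en_US", "text": "transfer $100 to Katy tomorrow"},
)
response.raise_for_status()
for entity in response.json():
    print(entity["dim"], "->", entity["body"])
```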
- `data/core.md` - contains stories
- `data/nlu.md` - contains NLU training data
- `actions.py` - contains custom action/API code (a sketch of such an action follows this list)
- `domain.yml` - the domain file, including bot response templates
- `config.yml` - training configurations for the NLU pipeline and policy ensemble
- `tests/e2e.md` - end-to-end test stories
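To give a sense of what lives in `actions.py`, here is a minimal sketch of a Rasa custom action; the action name, behaviour, and response text are illustrative and do not mirror the repo's exact code.

```python
# A minimal, illustrative custom action in the style used in actions.py.
from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class ActionCheckBalance(Action):
    """Hypothetical action that reports a (mock) account balance."""

    def name(self) -> Text:
        return "action_check_balance"  # hypothetical action name

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        # A real action would look the balance up from a database or API;
        # here we simply echo a fixed value.
        dispatcher.utter_message(text="Your current balance is $1,000.")
        return []
```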
The bot currently has five skills. You can ask it to:
- Transfer money to another person
- Check your earning or spending history (with a specific vendor or overall)
- Answer a question about transfer charges
- Pay a credit card bill
- Tell you your account balance
It also has a limited ability to switch skills mid-transaction and then return to the transaction at hand.
For the purposes of illustration, the bot recognises the following fictional credit card accounts:
- emblem
- justice bank
- credit all
- iron bank
It recognises the following payment amounts (besides actual currency amounts):
- minimum balance
- current balance
It recognises the following vendors (for spending history):
- Starbucks
- Amazon
- Target
You can change any of these by modifying `actions.py` and the corresponding NLU data.
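As a rough illustration of what such a change involves, the recognised accounts, payment amounts, and vendors could be kept as simple lookups like the sketch below; the exact structure in `actions.py` may differ.

```python
# Illustrative lookups for the values the bot recognises; edit these (and the
# matching NLU examples) to add or rename accounts, amounts, and vendors.
ACCOUNTS = ["emblem", "justice bank", "credit all", "iron bank"]
PAYMENT_AMOUNTS = ["minimum balance", "current balance"]
VENDORS = ["starbucks", "amazon", "target"]


def is_known_vendor(name: str) -> bool:
    """Check a user-supplied vendor name against the known list."""
    return name.strip().lower() in VENDORS
```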
If configured, the bot can also hand off to another bot in response to the user asking for handoff. More details on handoff below.
This bot includes a simple skill for handing off the conversation to another bot or a human. This demo relies on this fork of chatroom to work; however, you could implement similar behaviour in another channel and use that instead. See the chatroom README for more details on channel-side configuration.
Using the default setup, the handoff skill enables this kind of conversation with two bots:
The simplest way to use the handoff feature is to do the following:
- Clone chatroom and Helpdesk-Assistant alongside this repo
- In the chatroom repo, install the dependencies:
yarn install
- In the chatroom repo, build and serve chatroom:
yarn build
yarn serve
- In the Helpdesk-Assistant repo, install the dependencies and train a model (see the Helpdesk-Assistant README)
- In the Helpdesk-Assistant repo, run the rasa server and action server at the default ports (shown here for clarity)
In one terminal window:
In another terminal window:
rasa run --enable-api --cors "*" --port 5005 --debug
rasa run actions --port 5055 --debug
- In the Financial-Demo repo (i.e. this repo), run the rasa server and action server at the non-default ports shown below
In one terminal window:
In another terminal window:
rasa run --enable-api --cors "*" --port 5006 --debug
rasa run actions --port 5056 --debug
- Open `chatroom_handoff.html` in a browser to see handoff in action.
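Before opening the page, you can optionally confirm that both Rasa servers are reachable on the ports used above. This check is not part of the setup; it simply queries each server's `/version` endpoint.

```python
# Optional: confirm the Helpdesk-Assistant (5005) and Financial-Demo (5006)
# Rasa servers are up by querying their /version endpoints.
import requests

for port in (5005, 5006):
    info = requests.get(f"http://localhost:{port}/version").json()
    print(f"Rasa server on port {port}: version {info.get('version')}")
```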
Using chatroom, the general approach is as follows:
- User asks original bot for a handoff.
- The original bot handles the request and eventually sends a message with the following custom json payload (a sketch of an action that sends such a payload follows this list). This message is not displayed in the Chatroom window:
`{ "handoff_host": "<url of handoff host endpoint>", "title": "<title for bot/channel handed off to>" }`
- Chatroom switches the host to the specified `handoff_host`.
- The original bot no longer receives any messages.
- The handoff host receives the message `/handoff{"from_host":"<original bot url>"}`.
- The handoff host should be configured to respond to this message with something like, "Hi, I'm <name of the handoff host>, how can I help you?"
- The handoff host can send a message in the same format as specified above to hand back to the original bot. In this case the same pattern repeats, but with the roles reversed. It could also hand off to yet another bot/human.
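As a rough sketch of how the original bot might send that payload, a custom action can attach it as a custom json message; the action name and handoff URL below are assumptions for illustration, not the repo's exact code.

```python
# Illustrative custom action that emits the handoff payload Chatroom expects.
from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class ActionHandoff(Action):
    def name(self) -> Text:
        return "action_handoff_example"  # hypothetical action name

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        # Chatroom watches for this custom payload and switches hosts.
        dispatcher.utter_message(
            json_message={
                "handoff_host": "http://localhost:5005",  # assumed handoff host URL
                "title": "Helpdesk Assistant",
            }
        )
        return []
```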
The "try it out" section doesn't require any further configuration; this section is for those who want to change or further understand the set up.
For this demo, the user can ask for a human, but they'll be offered a bot (or bots) instead, so that the conversation looks like this:
For handoff to work, you need at least one "handoff_host". You can specify any number of handoff hosts in the file `actions/handoff_config.yml`.
```yaml
handoff_hosts:
  helpdesk_assistant:
    title: "Helpdesk Assistant"
    url: "http://localhost:5005"
  ## you can add more handoff hosts to this list e.g.
  # moodbot:
  #   title: "MoodBot"
  #   url: "http://localhost:5007"
```
Handoff hosts can be other locally running rasa bots, or anything that serves responses in the format that chatroom accepts. If a handoff host is not a rasa bot, you will of course want to update the response text to tell the user who/what they are being handed off to.
The Helpdesk-Assistant bot has been set up to handle handoff in exactly the same way as Financial-Demo, so the simplest way to see handoff in action is to clone Helpdesk-Assistant alongside this repo.
If you list other locally running bots as handoff hosts, make sure the ports on which the various rasa servers & action servers are running do not conflict with each other.
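If you want to see how the configured hosts could be consumed at runtime, the sketch below simply loads `actions/handoff_config.yml` and lists each host; the key names follow the example config above, and the loading code in the repo may differ.

```python
# Illustrative loader for actions/handoff_config.yml.
from pathlib import Path

import yaml

config = yaml.safe_load(Path("actions/handoff_config.yml").read_text())
for name, host in config.get("handoff_hosts", {}).items():
    print(f"{name}: {host['title']} at {host['url']}")
```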
You can test the bot on the test conversations by running `rasa test`. This will run end-to-end testing on the conversations in `tests/conversation_tests.md`.
Note that if duckling is running when you do this, you'll probably see some "failures" because of entities; that's OK! Since duckling entity extraction is not influenced by NLU training data, and since the values of `time` entities depend on when the tests are being run, these entities have been left unannotated in the conversation tests.
To deploy financial-demo, it is highly recommended to make use of the one line deploy script for Rasa X. As part of the deployment, you'll need to set up git integration to pull in your data and configurations, and build or pull an action server image.
You will need to have docker installed in order to build the action server image. If you haven't made any changes to the action code, you can also use the public image on Dockerhub instead of building it yourself.
See the Dockerfile for what is included in the action server image.
To build the image:
docker build . -t <name of your custom image>:<tag of your custom image>
To test the container locally, you can then run the action server container with:
docker run -p 5055:5055 <name of your custom image>:<tag of your custom image>
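One quick way to confirm the container is up is to hit the action server's `/health` endpoint (exposed by the Rasa SDK action server); this check is optional and not part of the repo.

```python
# Optional: check that the action server container responds on port 5055.
import requests

resp = requests.get("http://localhost:5055/health")
print(resp.status_code, resp.json())  # expect 200 and {"status": "ok"}
```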
Once you have confirmed that the container works as it should, you can push the container image to a registry with `docker push`.
It is recommended to use an automated CI/CD process to keep your action server up to date in a production environment.