Bot Framework v4 NLP with Orchestrator (PREVIEW) bot sample
This bot has been created using Bot Framework. It shows how to create a bot that relies on multiple LUIS.ai and QnAMaker.ai models for natural language processing (NLP).
Use the Orchestrator dispatch model when:
- Your bot consists of multiple language modules (LUIS + QnA) and you need help routing user utterances to those modules in order to integrate them into your bot.
- You want to create a text classification model from text files.
This bot uses Orchestrator to route user utterances to multiple LUIS models and QnA Maker services, supporting multiple conversational scenarios.
Orchestrator is supported on the following platforms:

| OS | Version | Architectures |
| --- | --- | --- |
| Windows | 10 (1607+) | ia32 (x86), x64 |
| MacOS | 10.14+ | x64 |
| Linux | Ubuntu 18.04, 20.04 | x64 |
This sample requires the following prerequisites in order to run:
- Install the latest supported version of Visual C++ Redistributable
- Install the latest Bot Framework Emulator
- Node.js version 10.14 or higher

  > node --version
- Install BF CLI with the Orchestrator plugin

  > npm i -g @microsoft/botframework-cli

  Make sure the `bf orchestrator` command works and shows all available Orchestrator commands:

  > bf orchestrator
- Clone the repository

  > git clone https://github.com/microsoft/botbuilder-samples.git
- In a terminal, navigate to `samples\javascript_nodejs\14.nlp-with-orchestrator`

  > cd samples\javascript_nodejs\14.nlp-with-orchestrator
- Configure the LUIS applications (HomeAutomation and Weather) required for this sample.
  - Get your LUIS authoring key

    > bf luis:build --in CognitiveModels --authoringKey <YOUR-KEY> --botName <YOUR-BOT-NAME>

  - Update the application settings in `./.env`
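  For reference, the LUIS-related entries in `./.env` might look something like the sketch below. The key names are illustrative only (they are not taken from this sample); use the application IDs and endpoint values produced by `bf luis:build` and the setting names your bot code actually reads.

  ```text
  # Illustrative .env entries (hypothetical key names)
  HomeAutomationLuisAppId=<LUIS-APP-ID-FOR-HOMEAUTOMATION>
  WeatherLuisAppId=<LUIS-APP-ID-FOR-WEATHER>
  LuisAPIKey=<YOUR-LUIS-ENDPOINT-KEY>
  LuisAPIHostName=<YOUR-LUIS-HOSTNAME, e.g. westus.api.cognitive.microsoft.com>
  ```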
- Configure the QnA Maker knowledge base (KB) required for this sample.
  - Get your QnA Maker subscription key

    > bf qnamaker:build --in CognitiveModels --subscriptionKey <YOUR-KEY> --botName <YOUR-BOT-NAME>

  - Update the KB information in `./.env`
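  For reference, the QnA Maker entries in `./.env` might look something like the sketch below. Again, the key names are illustrative; use the knowledge base ID, endpoint key and hostname for the KB created by `bf qnamaker:build` and the setting names your bot code actually reads.

  ```text
  # Illustrative .env entries (hypothetical key names)
  QnAKnowledgebaseId=<YOUR-KNOWLEDGE-BASE-ID>
  QnAEndpointKey=<YOUR-QNA-ENDPOINT-KEY>
  QnAEndpointHostName=<YOUR-QNA-HOSTNAME, e.g. https://<resource-name>.azurewebsites.net/qnamaker>
  ```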
- Configure Orchestrator to route utterances to the LUIS/QnA language services set up above.
  - Download the Orchestrator base model

    > mkdir model
    > bf orchestrator:basemodel:get --out ./model

  - Create the Orchestrator snapshot

    > mkdir generated
    > bf orchestrator:create --hierarchical --in ./CognitiveModels --out ./generated --model ./model

    The `--hierarchical` flag creates top-level intents in the snapshot file derived from the .lu/.qna file names in the input folder. As a result, the example utterances are mapped to the HomeAutomation, QnAMaker and Weather intents/labels. (A sketch of how a bot can dispatch on these labels appears after the steps below.)
- Verify `.env` has the following:

  ModelFolder=./model
  SnapshotFile=./generated/orchestrator.blu
- Install modules

  > npm install
- Start the bot

  > npm start
- Launch Bot Framework Emulator
  - File -> Open Bot
  - Enter a Bot URL of `http://localhost:3978/api/messages`
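To see how the pieces fit together, the following sketch shows one way a bot's message handler could dispatch on the Orchestrator intent labels (HomeAutomation, Weather, QnAMaker) produced by the snapshot. It is a minimal illustration under stated assumptions, not this sample's actual code: the `dispatchRecognizer` wiring and the environment variable names are assumptions, so see the sample's bot source for the real Orchestrator recognizer usage.

```javascript
// Minimal dispatch sketch (not this sample's actual implementation).
// Assumes `dispatchRecognizer` exposes recognize(context) and returns a
// RecognizerResult that works with LuisRecognizer.topIntent().
const { ActivityHandler } = require('botbuilder');
const { LuisRecognizer, QnAMaker } = require('botbuilder-ai');

class DispatchSketchBot extends ActivityHandler {
    constructor(dispatchRecognizer) {
        super();
        this.dispatchRecognizer = dispatchRecognizer;

        // Hypothetical environment variable names; use whatever ./.env
        // entries your bot code actually reads.
        this.homeAutomationRecognizer = new LuisRecognizer({
            applicationId: process.env.HomeAutomationLuisAppId,
            endpointKey: process.env.LuisAPIKey,
            endpoint: `https://${ process.env.LuisAPIHostName }`
        });

        this.qnaMaker = new QnAMaker({
            knowledgeBaseId: process.env.QnAKnowledgebaseId,
            endpointKey: process.env.QnAEndpointKey,
            host: process.env.QnAEndpointHostName
        });

        this.onMessage(async (context, next) => {
            // Orchestrator picks a top-level label (derived from the
            // .lu/.qna file names) for the incoming utterance.
            const dispatchResult = await this.dispatchRecognizer.recognize(context);
            const topIntent = LuisRecognizer.topIntent(dispatchResult);

            switch (topIntent) {
            case 'HomeAutomation': {
                // Hand the utterance to the HomeAutomation LUIS app.
                const luisResult = await this.homeAutomationRecognizer.recognize(context);
                await context.sendActivity(`HomeAutomation intent: ${ LuisRecognizer.topIntent(luisResult) }`);
                break;
            }
            case 'QnAMaker': {
                // Hand the utterance to the QnA Maker knowledge base.
                const answers = await this.qnaMaker.getAnswers(context);
                await context.sendActivity(answers.length > 0 ? answers[0].answer : 'No QnA Maker answer found.');
                break;
            }
            // A 'Weather' case would follow the same pattern as HomeAutomation.
            default:
                await context.sendActivity(`Unrecognized intent: ${ topIntent }.`);
            }
            await next();
        });
    }
}

module.exports.DispatchSketchBot = DispatchSketchBot;
```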