Improving the Bot Framework's IBM Watson NLU Integration through...
A little while ago, the Bot Builder Community launched a middleware component for integrating IBM Watson's natural language understanding (NLU) with your chatbots. It was meant as a first version of the Watson integration, leaving room for improvement and evolution.
In September, I gave a luncheon talk about the Bot Framework to the Shenandoah Valley Technology Council, and I discussed some of the components being developed by the Bot Builder Community. I offhandedly joked about the emotion detection in the Watson NLU integration, and how you could probably create a psychotherapy bot in just a few lines of code.
After the talk, I started to work on my slide deck for the ValleyTechCon. That talk was going to be an extended version of my luncheon talk, and I really started to think about the therapy bot. I decided to take a weekend and see how far I could get building out a minimal Rogerian chatbot using just the Bot Framework and the Bot Builder Community's Watson NLU middleware--nothing else.
The result was a novelty Carl Rogers chatbot, but in building that proof of concept I had to create a number of helper methods to better parse and work with the results from Watson. After my ValleyTechCon talk, I set about folding all of these helper methods (and a few others) into the Bot Builder Community package.
Let's take a look at the emotion helpers. You can find more documentation in the README for the package in the Bot Builder Community repository.
Setting Configuration Values
Watson's NLU allows you to set specific target keywords for emotional analysis. You can do this by setting the targets configuration value to a string array, as in the code below:
const emotionDetection = new EmotionDetection(WATSON_API_KEY, WATSON_ENDPOINT, WATSON_OPTIONS);
emotionDetection.set('targets', ['mercury', 'venus', 'mars']);
Once set, you can access the emotion objects for each of those targets off of the array that gets set in the turn state:
const targets = context.turnState.get('emotionTargets');
This will return an array of objects, each with a text property and an emotion object showing a score for each of the five emotions that IBM's Watson tracks (anger, disgust, fear, joy, and sadness).
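To make the shape of that result concrete, here is a small illustrative sketch. The interfaces and the score values are invented for the example; only the text/emotion structure and the five emotion names come from the description above.

```typescript
// Hypothetical shape of the array returned from the turn state --
// the target names and scores here are made up for illustration.
interface EmotionScores {
  anger: number;
  disgust: number;
  fear: number;
  joy: number;
  sadness: number;
}

interface TargetEmotion {
  text: string;           // the target keyword
  emotion: EmotionScores; // one score per emotion
}

const targets: TargetEmotion[] = [
  { text: 'mercury', emotion: { anger: 0.1, disgust: 0.05, fear: 0.2, joy: 0.6, sadness: 0.05 } },
  { text: 'venus',   emotion: { anger: 0.3, disgust: 0.1,  fear: 0.1, joy: 0.4, sadness: 0.1  } },
];

// Look up the joy score for a given target keyword.
function joyFor(keyword: string): number | undefined {
  return targets.find(t => t.text === keyword)?.emotion.joy;
}

console.log(joyFor('mercury')); // 0.6
```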
By default, even if targets are set, Watson's NLU will still return emotion detection for the overall document. You can turn this off by setting the document configuration property to false.
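Assuming the configuration mechanism follows the same set pattern shown earlier for targets, turning off document-level emotion might look like the fragment below (the 'document' property name is taken from the description above; verify it against the package's README):

```typescript
// Config fragment only -- assumes emotionDetection was constructed
// as in the earlier example.
emotionDetection.set('document', false);
```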
Static Helper Methods
Since emotion detection returns an object of key/value pairs, the static methods folded into the package are designed to better enable you to parse and rank results.
Here is the listing and explanation from the package's README (using TypeScript syntax to specify the types). In these signatures, nlup is the alias for the IBM Watson NLU package.
getEmotions(result: nlup.EntitiesResult | nlup.KeywordsResult): nlup.EmotionScores
Takes either an EntitiesResult or a KeywordsResult object returned from Watson's NLU and returns an EmotionScores object.
rankEmotionKeys(emotionScores: nlup.EmotionScores): string[]
Takes an EmotionScores object and returns a string array of the emotion keys (i.e., the names of the emotions) in order of relevance.
rankEmotions(emotionScores: nlup.EmotionScores): Emotion[]
Takes an EmotionScores object and returns an Emotion array in order of relevance. The Emotion object pairs an emotion's name with its score.
topEmotion(emotionScores: nlup.EmotionScores): string
Takes an EmotionScores object and returns the name of the emotion that is most relevant.
topEmotionScore(emotionScores: nlup.EmotionScores): Emotion
Takes an EmotionScores object and returns an Emotion object representing the most relevant emotion.
calculateDifference(emotionScores: nlup.EmotionScores, firstEmotion?: string, secondEmotion?: string): number
If called with only the EmotionScores object, it will return the difference between the top two emotions. If the other two parameters are provided, it will return the difference between the two specified emotions.
calculateVariance(emotionScores: nlup.EmotionScores): number
Takes an EmotionScores object and returns the variance across all of the emotion scores.
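To make the ranking semantics concrete, here is a rough sketch of what rankEmotionKeys and topEmotion do, reimplemented over a plain map of emotion names to scores. This is illustrative only and not the package's actual implementation; the sample scores are invented.

```typescript
// Illustrative reimplementation -- not the package's source.
// EmotionScores is modeled as a plain map of emotion name -> score.
type EmotionScores = Record<string, number>;

// Emotion names sorted by score, highest first.
function rankEmotionKeys(scores: EmotionScores): string[] {
  return Object.keys(scores).sort((a, b) => scores[b] - scores[a]);
}

// The single most relevant emotion name.
function topEmotion(scores: EmotionScores): string {
  return rankEmotionKeys(scores)[0];
}

const sample: EmotionScores = {
  anger: 0.05, disgust: 0.02, fear: 0.13, joy: 0.61, sadness: 0.19,
};

console.log(rankEmotionKeys(sample)); // ['joy', 'sadness', 'fear', 'anger', 'disgust']
console.log(topEmotion(sample));      // 'joy'
```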
Since these are static methods, you'll want to call them directly off of the EmotionDetection class. For example, to get the top emotion out of an EmotionScores object returned by Watson, you would use the following:
const emotion: string = EmotionDetection.topEmotion(EMOTIONSCORES_OBJECT);
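To make the arithmetic behind calculateDifference and calculateVariance concrete, here is a rough sketch over a plain score map. Again, this is illustrative only, not the package's actual implementation, and the sample values are invented.

```typescript
// Illustrative reimplementation -- not the package's source.
type EmotionScores = Record<string, number>;

// Difference between the top two scores, or between two named emotions
// when both names are supplied.
function calculateDifference(
  scores: EmotionScores, first?: string, second?: string
): number {
  if (first !== undefined && second !== undefined) {
    return Math.abs(scores[first] - scores[second]);
  }
  const sorted = Object.values(scores).sort((a, b) => b - a);
  return sorted[0] - sorted[1];
}

// Population variance across all emotion scores.
function calculateVariance(scores: EmotionScores): number {
  const values = Object.values(scores);
  const mean = values.reduce((s, v) => s + v, 0) / values.length;
  return values.reduce((s, v) => s + (v - mean) ** 2, 0) / values.length;
}

const sample: EmotionScores = { anger: 0.25, joy: 0.75, sadness: 0.5 };
console.log(calculateDifference(sample));                 // 0.75 - 0.5 = 0.25
console.log(calculateDifference(sample, 'anger', 'joy')); // |0.25 - 0.75| = 0.5
```

A small difference between the top two emotions suggests an ambiguous reading, while a low variance across all five suggests no single emotion dominates -- which is exactly the kind of check a Rogerian bot needs before reflecting a feeling back to the user.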