Virtual Agent (VA) Designer and Natural Language Understanding (NLU) Model Builder at a glance
Do you think that 10 minutes from now you'll have the skills to build an intelligent and engaging conversation for a chatbot, one that naturally understands people using simple, everyday language? In this demo I'm going to show you exactly how ServiceNow's Virtual Agent Designer and Natural Language Understanding (NLU) Model Builder deliver that ability.

But first: this presentation may contain forward-looking statements that involve various risks and uncertainties. Actual results may differ materially and should not be relied upon in making purchasing decisions.

Together, these tools enable you to start quickly with pre-built conversation templates and reusable topics for the most common IT, customer service, and HR requests. You can also design and build entirely new conversations using a visual canvas with drag-and-drop functionality, so you can create without writing any code. And you can visually design and tune NLU models so employees and customers can use that simple, everyday language in their requests to Virtual Agent.

We're going to start in Virtual Agent Designer. Before we dive into editing, we're looking at a list of existing conversation topics. I've loaded a bunch of examples here, but you'll find the most common IT, customer service, and HR service delivery topics are all available out of the box. You can enable these out-of-the-box conversations in a few minutes or quickly customize them for your own organization's needs; either way, you're not starting from scratch or a blank sheet of paper.

As you start to work with conversations, you'll find that a lot of them reuse the same logic for common actions, things like searching across a knowledge base or transferring to a live agent. So we have the concept of topic blocks: stackable pieces of conversation that can be inserted into another topic. You build the logic just once and then call that topic block from other topics, which helps ensure consistency for your users and also makes conversations a lot easier to manage.

Now let's look at how you can edit a conversation topic. I have a sample ready that enables a user to search for a specific type of car. The Properties page lets you define how the topic works with NLU and when it should be available, but the flow of the conversation lives here in the editor. The center of the canvas shows a visual flow representing all of the logic for how the conversation will work. On the left side you have the building pieces: user inputs, which are the different types of prompts for the user; bot responses that Virtual Agent can send back to the user; and utilities for looking up or creating records, making decisions, adding multiple paths to the conversation, or even running scripts of your own. There's also a table view of the conversation flow if you prefer to build that way; it's completely up to the designer. And while you're working on a conversation, you can quickly test it and preview what users will see, so you can build and iterate as fast as you want to go.

Hopefully you noticed we have a bunch of these blue user input prompts on the canvas asking for the color, model, and year. But in the test you just saw, natural language understanding extracted all those details from my request and plugged them in as answers without ever prompting me. That behavior is driven by these options in the lower-right corner to enable NLU input and to skip confirmation from the user. Very cool stuff.

So let's make an adjustment to our conversation and drag in a Yes/No prompt that will be shown after the results. We'll use it to ask whether the user would like to speak with an agent about the vehicle that came back, and we'll need to act differently based on their response. Let's rename this answer and specify a condition so the conversation only goes this way if the user says yes. If they do say yes, we'll call one of those reusable topic blocks that includes all of the logic needed to route to a live agent. The last thing we need to do is finish the other branch: if the user doesn't want to talk to an agent, we'll simply end the conversation, which we can do by dragging our connection point over to the end. Then we can zoom back out to get a view of the entire flow. All right, that's all there is to editing a conversation.

Let's switch over to NLU Model Builder, where the configuration for the natural language understanding piece happens. Building an NLU model sounds complex, and even a bit intimidating if you've never done it before, but let me show you how easy it is for anyone, because it's a completely visual experience. Models are trained by entering a few sample phrases of what you expect users to say and then highlighting the different parts that can be used to answer questions without having to prompt the user, like we saw earlier in our test. It's a technical term, but these answers, or additional details that support the request, are called entities.

Let's add a new phrase here to help someone find a minivan. Next we'll label the car color and model as entities, because they provide additional detail about what the user wants, and we can connect them to Now Platform data. After adding a new phrase, or utterance, we need to incorporate these changes by training the model before we can test them. Testing is integrated directly into NLU Model Builder, so you can quickly validate the accuracy of your changes before they're published to users. Run a test by simply typing a phrase you think users will send to Virtual Agent, then review the results. In our case, the phrase matches the intent of searching for a vehicle, and the model has also identified that a specific model is included. You can also see that our newly trained model is a better match than the published one currently serving user requests in production. In fact, the published model wouldn't even have recognized that the user wanted to search for a vehicle based on the test phrase we used, but the additional training phrase we added has fixed this.

The last thing I want to show is how NLU Model Builder enables you to deliver consistent, natural experiences for users with its flexible and powerful methods for identifying entities and synonyms. Entities can be automatically recognized through built-in support for common details like dates and locations, they can be manual lists that you create, or you can even use regular expression patterns for advanced matching. And the vocabulary of each model can include synonyms, so you can take a base word that appears in a lot of your training phrases and quickly enable support for different variations of the same term. For example, our training phrases might all say "car," but we know users might say "vehicle" or "automobile" in their own requests, and we want those terms treated the same way as "car."

So let's recap what you just saw. Virtual Agent Designer and NLU Model Builder enable you to start quickly with pre-built conversation templates and reusable topics. You can also completely design and build conversations for Virtual Agent without writing any code. And you can visually create and tune NLU models so employees and customers can use simple, everyday language in their requests. Be sure to check out servicenow.com for more information about how to use Virtual Agent Designer and Natural Language Understanding Model Builder to deliver insanely great user experiences.
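To make the topic-block idea from the demo concrete, here is a minimal sketch, in plain Python rather than ServiceNow code, of why building a piece of conversation logic once and calling it from multiple topics keeps experiences consistent. All function and field names here are hypothetical illustrations, not ServiceNow APIs.

```python
# Illustrative sketch only -- not ServiceNow code. A "topic block" is a
# reusable piece of conversation logic built once and called from many
# topics, so the steps (and any later fixes) stay identical everywhere.

def transfer_to_live_agent(context):
    """The reusable 'topic block': canned steps for routing to a live agent."""
    return [
        f"Okay {context['user']}, let me connect you with a live agent.",
        "You are now in the queue.",
    ]

def car_search_topic(context):
    """One topic that ends by calling the shared block instead of rebuilding it."""
    return ["Here are the cars I found."] + transfer_to_live_agent(context)

def password_reset_topic(context):
    """A completely different topic reusing the exact same block."""
    return ["I couldn't reset that automatically."] + transfer_to_live_agent(context)

# Both topics share the exact same agent-handoff steps.
search_flow = car_search_topic({"user": "Sam"})
reset_flow = password_reset_topic({"user": "Sam"})
```

If the handoff wording or routing ever changes, editing the one block updates every topic that calls it.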
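The "enable NLU input" behavior shown on the canvas, where extracted details are plugged in as answers so the user is never prompted for them, is essentially slot filling. A rough sketch of that idea, again as illustrative Python rather than ServiceNow internals, with all names hypothetical:

```python
# Illustrative sketch only -- not ServiceNow code. Shows the idea behind
# "enable NLU input": a prompt is skipped whenever NLU already extracted
# that detail from the user's original request.

def collect_slots(nlu_entities, prompts):
    """Fill conversation slots, asking only for details NLU did not extract.

    nlu_entities: dict of slot -> value pulled from the user's first utterance.
    prompts: dict of slot -> question to ask when the slot is missing.
    Returns (filled_slots, questions_actually_asked).
    """
    filled, asked = {}, []
    for slot, question in prompts.items():
        if slot in nlu_entities:           # NLU already answered this -> no prompt
            filled[slot] = nlu_entities[slot]
        else:                              # otherwise fall back to prompting
            asked.append(question)
            filled[slot] = None            # would hold the user's typed reply
    return filled, asked

# "I'm looking for a red 2020 sedan" -> NLU extracts color and year, not model,
# so only the model question would actually be shown to the user.
entities = {"color": "red", "year": "2020"}
prompts = {"color": "What color?", "model": "Which model?", "year": "What year?"}
slots, questions = collect_slots(entities, prompts)
```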
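The train-then-test loop in NLU Model Builder, where adding one sample phrase turned a missed intent into a confident match, can be mimicked with a toy scorer. This is emphatically not how ServiceNow's NLU engine works internally; it is a deliberately crude word-overlap score just to show why one extra training phrase can fix a miss:

```python
# Illustrative sketch only -- not ServiceNow's NLU engine. A "model" here is
# just a list of sample phrases per intent; a test utterance is scored by its
# best word overlap with any sample phrase for that intent.

def score(utterance, sample_phrases):
    """Crude intent score in [0, 1]: best Jaccard word overlap with any sample."""
    words = set(utterance.lower().split())
    best = 0.0
    for phrase in sample_phrases:
        sample = set(phrase.lower().split())
        best = max(best, len(words & sample) / len(words | sample))
    return best

# Hypothetical published model vs. a newly trained one with one extra phrase.
published = ["find me a car", "show me a red sedan"]
trained = published + ["help me find a minivan"]

utterance = "help me find a minivan"
old_score = score(utterance, published)   # published model: weak match
new_score = score(utterance, trained)     # trained model: exact phrase match
```

This mirrors the demo result: the newly trained model scores the test phrase higher than the published model currently serving production.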
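Finally, the two entity mechanisms mentioned at the end, regular expression patterns for advanced matching and vocabulary synonyms, can be sketched in a few lines. The pattern and synonym list below are hypothetical examples, not ServiceNow configuration:

```python
import re

# Illustrative sketch only -- not ServiceNow code. Demonstrates the two entity
# ideas from the demo: a regex pattern for advanced matching (a 4-digit model
# year) and synonyms that map "vehicle"/"automobile" back to the base word "car".

SYNONYMS = {"vehicle": "car", "automobile": "car"}
YEAR_PATTERN = re.compile(r"\b(19|20)\d{2}\b")   # matches years 1900-2099

def extract_entities(utterance):
    """Pull a year entity via regex and normalize synonym terms to 'car'."""
    normalized = " ".join(SYNONYMS.get(w, w) for w in utterance.lower().split())
    match = YEAR_PATTERN.search(normalized)
    return {
        "year": match.group(0) if match else None,
        "mentions_car": "car" in normalized.split(),
    }

# "automobile" is treated the same way as "car", and the year is recognized
# by pattern rather than by listing every possible value.
result = extract_entities("I want a 2018 automobile")
```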
https://www.youtube.com/watch?v=dKXTEJhMXAY