Part 2 of a four-part series. In this post, you’ll learn how to configure watsonx Assistant to process simple Q&A. 

In this post, we are going to configure watsonx Assistant to handle simple Q&A for your chatbot, using the sample from this Git project. The sample is a bank’s virtual-agent chatbot with pre-defined dialog flows. 

Step 1. Understanding watsonx Assistant

watsonx Assistant is IBM’s AI product that lets you build, train, and deploy conversational interactions into any application, device, or channel. You train your assistant to recognize the “intents” and “entities” in a user input, and you define a “dialog” flow that incorporates those intents. 

As you add training data, a natural language classifier is automatically added to the skill. The classifier model is trained to understand the types of requests that you teach your assistant to listen for and respond to. 
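Conceptually, the classifier maps free-form user text to the closest trained intent. As a rough illustration only (watsonx Assistant’s actual classifier is a trained natural language model, not simple word overlap), here is a toy bag-of-words sketch; the intent names mirror the sample, but the matching logic is entirely made up for this post:

```python
# Illustrative only: a toy bag-of-words intent matcher.
# watsonx Assistant's real classifier is a trained NLP model.
TRAINING = {
    "#Business_Information-Contact_Us": [
        "Can I email you?",
        "What are your contact details?",
    ],
    "#Help-Greetings": [
        "Hello",
        "Good morning",
    ],
}

def classify(text: str) -> str:
    """Return the intent whose training examples share the most words with the input."""
    words = set(text.lower().split())

    def score(intent: str) -> int:
        return max(len(words & set(ex.lower().split())) for ex in TRAINING[intent])

    return max(TRAINING, key=score)

print(classify("please send me your contact details"))
# -> #Business_Information-Contact_Us
```

The real model generalizes far beyond word overlap, which is why a handful of example utterances per intent is usually enough to get started.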

In this series, we are not going to define complicated dialog flows, but if you are interested, you can refer to the official documentation.

Step 2. Set up watsonx Assistant

In this part, we are going to use this sample from a Git project. We are not going to deploy the sample app here; instead, we will import its intents, entities, and dialog flows. 

  1. Go to the Git project and download the source code via Clone or download.
  2. Open watsonx Assistant from the IBM Cloud dashboard.
  3. Open Skills and click Create skill.
  4. Specify the location of the workspace JSON file in your local copy of the app project and click Import: <project_root>/training/bank_simple_workspace.json
  5. By default, you have an assistant called My first assistant, associated with a default skill called My first skill. Open the assistant from the Assistants menu on the left, then use Swap skill to switch to the imported Banking_Simple_DO_NOT_DELETE skill.
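If you open the workspace JSON file before importing it, you will see the standard watsonx Assistant (V1) workspace shape: intents with example utterances, entities with values and synonyms, and dialog nodes with conditions. A heavily abbreviated sketch follows; the field values here are illustrative, not copied from the sample file:

```json
{
  "name": "Banking_Simple_DO_NOT_DELETE",
  "language": "en",
  "intents": [
    {
      "intent": "Business_Information-Contact_Us",
      "examples": [{ "text": "What are your contact details?" }]
    }
  ],
  "entities": [
    {
      "entity": "contact_type",
      "values": [{ "value": "email", "synonyms": ["e-mail", "mail"] }]
    }
  ],
  "dialog_nodes": [
    {
      "dialog_node": "Contact Us",
      "conditions": "#Business_Information-Contact_Us",
      "output": { "text": "You can reach us by phone, email, or SMS." }
    }
  ]
}
```

Importing this file is what lets you skip hand-authoring each intent, entity, and dialog node in the skill editor.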

Step 3. Understanding the dialog (optional)

For future reference, it is worth seeing how dialog flows, intents, and entities are defined. For instance, an intent named #Business_Information-Contact_Us will be triggered by inputs like the following:

  • Can I email you?
  • Can I talk to someone?
  • I need an SMS number customer service
  • What are your contact details?

Entities act like attributes of intents. In this case, the entity @contact_type refines the user’s intent, acting as a subcategory of #Business_Information-Contact_Us, with the following values:

  • address
  • call
  • email
  • SMS

Now, the dialog defines how Watson behaves depending on inputs. In this example, when #Business_Information-Contact_Us is triggered, Watson checks which @contact_type the user prefers and responds accordingly.
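That branching can be pictured as a simple dispatch on the detected intent and entity. Here is a minimal Python sketch of the logic; the response strings and contact details are invented for illustration and are not taken from the sample workspace:

```python
from typing import Optional

# Illustrative sketch of the dialog's branching: once the assistant has
# detected an intent and (optionally) an entity, a response is chosen.
RESPONSES = {
    "address": "Our branch is at 123 Example Street.",
    "call": "You can call us at 0123-456-789.",
    "email": "You can email us at support@example.com.",
    "SMS": "Text us at 0123-456-789.",
}

def respond(intent: str, contact_type: Optional[str]) -> str:
    if intent != "#Business_Information-Contact_Us":
        return "I didn't understand that."
    if contact_type in RESPONSES:
        return RESPONSES[contact_type]
    # No @contact_type detected: prompt the user to narrow it down,
    # just as a dialog node can ask for a missing entity.
    return "Would you like our address, phone, email, or SMS number?"

print(respond("#Business_Information-Contact_Us", "email"))
```

In the actual skill, this dispatch is expressed as dialog node conditions (e.g., `#Business_Information-Contact_Us && @contact_type:email`) rather than code, but the control flow is the same.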
 

Step 4. Try it

Let’s try it and see how your assistant behaves. Click Try it in the top-right corner and test that it works as expected. In the following image, Watson detects #Business_Information-Contact_Us and @contact_type correctly and responds as expected.

What’s next?

You’ve now configured your assistant to handle sample Q&A. In the next parts of this blog series, we are going to integrate it with Node-RED. You can, of course, train your assistant however you want and connect it with your own app in the future. 

Alternatively, you can add direct integration by clicking Add integration in the assistant menu. 

Disclaimer

IBM is not liable for any damages arising in contract, tort or otherwise from the use of or inability to use this post or any material contained within. All sample code is provided as-is and IBM does not support customization. Do not use the code in production. 
