Intents, FAQs and Utterances automated testing

Instead of manually training and testing intents, utterances, and FAQs, you can auto-generate and test them in bulk. This automation allows you to quickly train the AI-agent on all changes at once, saving time and reducing errors.

The platform will then run a prediction test and display results based on the confidence level. If the confidence level is low, the result will be displayed as a fail.
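The pass/fail decision described above can be sketched as a simple threshold check. This is only an illustration: the platform's actual confidence scale and cutoff are not documented here, so the `0.8` threshold below is an assumption.

```python
def grade_prediction(expected_intent, predicted_intent, confidence, threshold=0.8):
    """Return "success" when the expected intent was predicted with
    high enough confidence, otherwise "fail".

    threshold=0.8 is an assumed cutoff, not the platform's actual value.
    """
    if predicted_intent == expected_intent and confidence >= threshold:
        return "success"
    return "fail"

# A correct, high-confidence prediction passes; a low-confidence one fails.
print(grade_prediction("book_flight", "book_flight", 0.93))  # success
print(grade_prediction("book_flight", "book_flight", 0.42))  # fail
```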

This guide will walk you through the process of testing intents, FAQs, and utterances in bulk via import.

Test intents

You can upload all intents along with their associated utterances to your AI-agent and test them collectively. With a single click, you can train the AI-agent on all the changes at once. Based on the imported entities, the platform will then run a prediction test and display results based on the confidence level.

To test the intents, follow these steps:

  1. Go to Automation > Test.

Import Intents

  2. Click the highlighted option below to import the entities from the Train section.

  3. Click Sync and test intents to test intents that are out of sync with the Train section.

  • This action will import intents from the Train section.

  4. Choose the intent you want to test.

  5. In the Utterances field, type any utterance and click Generate utterances.

  • A list of utterances will be generated.

  6. Select the utterances you need to test and click + Add utterances.

  7. Click Test intent.

  • The platform will run a prediction test and display results based on the confidence level. If the confidence level is low, the status of the utterance is displayed as fail.

  8. To test other intents, click the back arrow (on the top left, next to the intent's name) and click Test intents.

  • This will test the utterances for all intents and display their status.

  9. Click the Download Report button to download the tested utterances and access them offline.

The downloaded report includes the following details:

  • Utterance - The utterances generated for the intents.
  • Intent - The intent expected to be triggered by the utterance.
  • predictedIntent - The intent that was actually triggered for the utterance.
  • predictedConfidence - The confidence percentage with which the intent was predicted.
  • Result - Outcome of the prediction: success for successful predictions, fail for failed ones.
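A quick way to act on the downloaded report is to compute the overall pass rate programmatically. The sketch below assumes the report has been saved or exported as CSV with exactly the column names listed above; adjust the loading step if your export is an Excel file.

```python
import csv
from io import StringIO

def summarize_report(csv_text):
    """Compute (passed, total) from a downloaded test report.

    Assumes CSV input with the columns described above:
    Utterance, Intent, predictedIntent, predictedConfidence, Result.
    """
    rows = list(csv.DictReader(StringIO(csv_text)))
    passed = sum(1 for r in rows if r["Result"] == "success")
    return passed, len(rows)

# Hypothetical report contents for illustration.
sample = """Utterance,Intent,predictedIntent,predictedConfidence,Result
book me a flight,book_flight,book_flight,0.95,success
i want to fly out,book_flight,cancel_flight,0.40,fail
"""
passed, total = summarize_report(sample)
print(f"{passed}/{total} utterances passed")  # 1/2 utterances passed
```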

Upload Utterances

You can upload a list of test utterances and test them.

To upload utterances, follow these steps:

  1. Click the Import entities drop-down and select Upload utterances.

  2. Click the Download icon to download the template.

  3. Fill in the template with the following details:

    • Utterance: The utterance for your intent.

    • Journey: The journey or flow associated with the utterances.

    • Tag: Labels that help recognize or filter important journeys.

  4. Click the Upload file button to upload your template.

  5. Download the report from the Reports section to check the status of the uploaded utterances.

The report includes the following details:

  • Utterance: The utterances that were uploaded.
  • Predicted Journey: The journey to which the utterances were mapped.
  • Journey: The journey specified in the uploaded template.
  • Status: Indicates success or failure, based on the correct mapping of the utterance to the journey.
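The Status column can be reproduced by comparing the two journey columns. The sketch below assumes the report has been saved as CSV with the headers listed above.

```python
import csv
from io import StringIO

def utterance_status(csv_text):
    """Map each uploaded utterance to success/fail by checking whether
    the predicted journey matches the journey given in the template.

    Column names are taken from the report description above.
    """
    out = {}
    for row in csv.DictReader(StringIO(csv_text)):
        ok = row["Predicted Journey"] == row["Journey"]
        out[row["Utterance"]] = "success" if ok else "fail"
    return out

# Hypothetical report contents for illustration.
sample = """Utterance,Predicted Journey,Journey,Status
track my order,order_tracking,order_tracking,success
where is my stuff,returns,order_tracking,fail
"""
print(utterance_status(sample))
# {'track my order': 'success', 'where is my stuff': 'fail'}
```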

Test FAQs

To test FAQs, follow these steps:

  1. Click Test FAQ, then click the highlighted icon below to import the FAQs from the Train section.

  2. Click Sync and test FAQs.

  • This action will import FAQs from the Train section.

  3. Click on any FAQ.

  4. In the variations field, type any question or text and click Generate variations.

  • A list of variations will be generated.

  5. Select the variations you need to test and click + Add variations.

  • This will add the variations.

  6. Click Test FAQ.

  • The platform will run the prediction test and display results according to the confidence level. The result is pass if a variation triggers the same intent with high confidence; otherwise, it is fail.

  7. To test other FAQs, click the back arrow (on the top left, next to the FAQ's name) and click Test FAQ.

  • This will test the FAQs and display their status.

  8. Click the Download Report button to download the tested FAQs and access them offline.

The downloaded report includes the following details:

  • Utterance - The utterances generated for the intents.
  • Intent - The intent expected to be triggered by the utterance.
  • predictedIntent - The intent that was actually triggered for the utterance.
  • predictedConfidence - The confidence percentage with which the intent was predicted.
  • Result - Outcome of the prediction: success for successful predictions, fail for failed ones.
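When the report contains failures, it helps to surface the weakest predictions first so you know which FAQs or intents to retrain. A sketch, again assuming the report is saved as CSV with the headers listed above:

```python
import csv
from io import StringIO

def failing_rows(csv_text):
    """Return (utterance, expected, predicted, confidence) tuples for
    every failed prediction, sorted so the lowest-confidence
    predictions come first.

    Assumes CSV input with the columns described above.
    """
    failures = [
        (r["Utterance"], r["Intent"], r["predictedIntent"],
         float(r["predictedConfidence"]))
        for r in csv.DictReader(StringIO(csv_text))
        if r["Result"] == "fail"
    ]
    return sorted(failures, key=lambda f: f[3])
```

Feeding this the downloaded report gives a ready-made retraining worklist: each row names the utterance, the intent you expected, and the intent the model actually chose.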

Test KB

You can add a list of FAQs and upload it to generate answers based on the document uploaded in the KB section. Note that duplicate questions cannot be created and the answers will not be generated for them. You can include a maximum of 100 questions in an Excel template.
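Since duplicates are rejected and the template caps out at 100 questions, it is worth cleaning the list before filling in the template. A minimal sketch; the case-insensitive duplicate comparison below is an assumption, not the platform's documented rule:

```python
def prepare_kb_questions(questions, limit=100):
    """Drop duplicate questions (the platform will not generate answers
    for duplicates) and cap the list at the 100-question template limit.

    Duplicates are compared case-insensitively here as an assumption.
    """
    seen = set()
    unique = []
    for q in questions:
        key = q.strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(q.strip())
    return unique[:limit]

qs = ["What is the refund policy?",
      "what is the refund policy?",
      "How do I reset my password?"]
print(prepare_kb_questions(qs))
# ['What is the refund policy?', 'How do I reset my password?']
```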

To upload KB FAQs, follow these steps:

  1. Choose Sandbox/Development environment.

  2. Go to Knowledge base.

  3. Click + Add URL. Add your website URL and click Save.


  4. Go to Automation > Test > Test KB.

  5. Click the download icon to download the template.

  6. Fill in the template with the following detail:

    • Questions: Enter the question for which you want to generate the answer.

  7. Click the Upload questions button to upload your template.

  • A report is generated with the percentage of answered questions.

  8. Download the report from the Reports section to check the status of the uploaded FAQs.

The report includes the following details:

  • Questions: The questions that were uploaded.

  • Answers: The answers generated for the uploaded questions.

  • Links: The links in the uploaded document from which the answers were fetched.
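The "percentage of answered questions" shown after upload can be recomputed from this report. A sketch, assuming the report is saved as CSV with the three columns above and that an empty Answers cell means the question went unanswered:

```python
import csv
from io import StringIO

def answered_percentage(csv_text):
    """Percentage of uploaded questions that received an answer.

    Assumes CSV input with Questions/Answers/Links columns; a blank
    Answers cell counts as unanswered.
    """
    rows = list(csv.DictReader(StringIO(csv_text)))
    if not rows:
        return 0.0
    answered = sum(1 for r in rows if r["Answers"].strip())
    return 100.0 * answered / len(rows)

# Hypothetical report contents for illustration.
sample = """Questions,Answers,Links
What is the return window?,30 days from delivery,https://example.com/returns
Do you ship overseas?,,
"""
print(answered_percentage(sample))  # 50.0
```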