This section covers the AI Assertor tool, an LLM-powered message analyzer that uses your LLM provider to determine whether a message satisfies one or more assertions or requirements that you write in natural language. It is most commonly connected to a tool that sends or receives messages in order to verify the content of those messages. Because it affects performance and the ability to generate load, the AI Assertor tool is skipped when used with Load Test.

This tool requires a valid license. Additionally, you must configure your LLM provider account information in the LLM Provider preferences. OpenAI, Azure OpenAI, and other LLM providers with similar functionality are supported. See LLM Provider preferences to determine whether your LLM provider can be used. Be aware that LLMs can return inaccurate information, which can affect this feature; due diligence is recommended.

Understanding the AI Assertor

An AI Assertor is commonly chained to a SOAP Client, REST Client, or Messaging Client as a test output to verify a payload or traffic, but it can also be created as a stand-alone test. The AI Assertor supports complex validation needs by leveraging the adaptability of AI through your LLM provider.

The AI Assertor is designed to support multiple validation assertions to create complex verifications. It can be added via the Add Output wizard or as a stand-alone Standard test.
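For example, an AI Assertor chained to a REST Client evaluates its assertions against the response that client receives. The payload and assertion wording below are hypothetical and only illustrate the relationship between a message and a set of short assertions:

  Response payload (received from the chained REST Client):

    {
      "orderId": "A-1001",
      "status": "shipped",
      "items": [
        { "sku": "W-204", "qty": 1 },
        { "sku": "W-310", "qty": 2 }
      ]
    }

  Assertions:

    Order status is "shipped".
    Cart has two items.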

Configuring AI Assertor

The AI Assertor's tool settings consist of the following tabs:

  • Configuration: This tab is used to create and configure assertions for the LLM.
  • Expected Document: Specifies the expected document, creating a template from which you can select elements. If the AI Assertor receives a valid document (for example, from traffic or from the tool it is attached to), this panel is populated automatically. Alternatively, you can copy a sample message into the tab. Note that the expected document is not saved by default; if you want to save it, enable Save expected document.
  • Input: Specifies either text or a file on which the tool should operate. If the AI Assertor is chained to another tool, this tab does not appear, and the AI Assertor uses the output of that tool as its input.

To configure AI Assertor:

  1. Click Add in the AI Assertor’s Configuration tab. A new assertion will be added.

  2. Assertions are automatically assigned a name, but you can change the name to make it easier to track what each prompt is meant to do.
  3. Enter the assertion you want the AI to validate in the AI Assertor Prompt field.

    Best Practices

    The best results are generally achieved when assertions are short and clear. If you have multiple conditions that you want to validate, it is recommended that you create multiple short assertions rather than one longer one. Not only will this be less likely to confuse the AI, but it will also make targeted changes to your assertions easier.

    In addition, there is no need to request that the AI "check" or "verify" anything. Simply state the expectation, for example, "Cart has two items."
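
    For example, rather than writing one compound assertion, split it into independent assertions that can pass or fail on their own (the wording below is hypothetical):

      Instead of:
        Verify that the cart has two items, the total is under $50, and shipping is free.

      Write:
        Cart has two items.
        Cart total is under $50.
        Shipping is free.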

  4. Save your changes.

Evaluating an Assertion

You can evaluate the AI Assertor's findings to learn more about why an assertion passed or failed. After running your test, open the AI Assertor and choose the assertion you want to evaluate. Click Evaluate and review the report that appears.

You can chain an Edit tool to the results output of the AI Assertor to view all the passing and failing assertions and the analysis reason for each of them in one place.

If you want to change your assertion, you can do so and click Evaluate again, which will query the LLM with the updated assertion against the previous input. In this way, you can fine-tune your assertions until they are functioning the way you want. If you find the AI struggling to fully comprehend your assertion, providing it with an example or two of a passing or failing condition can help.
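For example, a hypothetical assertion that the AI finds ambiguous can often be clarified by stating a sample of each outcome directly in the prompt:

  The error message apologizes to the user.
  Passing example: "Sorry, something went wrong. Please try again."
  Failing example: "Error 500."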
