Overview

You can use an LLM with your LLM provider account to assist you in creating tests from an OpenAPI/Swagger definition in SOAtest, or to power the AI Assistant, which can also help you create tests and data sources. Before you can take advantage of these features, you must configure your LLM provider account in the LLM Provider preferences in SOAtest & Virtualize. OpenAI, Azure OpenAI, and other LLM providers with similar functionality are supported. See LLM Provider preferences to determine whether your LLM provider can be used.

The SOAtest & Virtualize-AI integration uses the LLM provider's REST API over HTTPS/TLS. When creating tests from an OpenAPI/Swagger definition, it collects summarized information from the user-provided OpenAPI document and includes it, along with a user-defined prompt and SOAtest & Virtualize's custom prompts, in the requests sent to the LLM.
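To make the protocol concrete, the sketch below shows the general shape of such a request against an OpenAI-compatible chat completions endpoint. The endpoint URL, model name, prompts, and OpenAPI summary are placeholders for illustration only; the exact payloads that SOAtest & Virtualize assembles are not documented here.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ChatCompletionSketch {
    public static void main(String[] args) throws Exception {
        // Assumed endpoint and credentials: an OpenAI-compatible provider
        // exposes a chat completions resource like this over HTTPS/TLS.
        String endpoint = "https://api.openai.com/v1/chat/completions";
        String apiKey = System.getenv("LLM_API_KEY");

        // Placeholder content standing in for the summarized OpenAPI
        // information and the user-defined and custom prompts.
        String customPrompt = "You are an assistant that generates API test scenarios.";
        String openApiSummary = "POST /pets creates a pet; GET /pets/{id} returns a pet.";
        String userPrompt = "Create a test scenario that adds a pet and verifies it exists.";

        // The custom prompts go into the system message; the OpenAPI summary
        // and the user-defined prompt are combined into the user message.
        String body = """
            {
              "model": "gpt-4o",
              "messages": [
                {"role": "system", "content": "%s"},
                {"role": "user", "content": "OpenAPI summary: %s\\n\\nTask: %s"}
              ]
            }""".formatted(customPrompt, openApiSummary, userPrompt);

        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}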

The AI Assistant includes summarized information from Parasoft documentation, along with a user-defined prompt and SOAtest & Virtualize's custom prompts, in the requests it sends to the LLM. Below are common examples of what is sent to the LLM, depending on what is requested:

The diagram below illustrates how SOAtest & Virtualize communicates with an LLM and what information is exchanged.