

Overview

You can use an LLM from your LLM provider account to assist in creating tests from an OpenAPI/Swagger definition in SOAtest, or to power the AI Assistant, which can also help you create tests and data sources. Before you can take advantage of these features, you must configure your LLM provider account in the LLM Provider preferences in SOAtest & Virtualize. OpenAI, Azure OpenAI, and other LLM providers with similar functionality are supported. See LLM Provider preferences to determine whether your LLM provider can be used.

The SOAtest & Virtualize AI integration uses the LLM provider's REST API over HTTPS/TLS. When creating tests from an OpenAPI/Swagger definition, the integration collects summarized information from the user-provided OpenAPI document and includes it, along with a user-defined prompt and SOAtest & Virtualize's custom prompts, in the requests to the LLM.
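As a rough sketch of this kind of exchange (the endpoint, model name, and payload shape below are assumptions based on a typical OpenAI-style chat-completions API, not SOAtest & Virtualize internals), the request sent over HTTPS/TLS might be assembled like this:

```python
import json

def build_llm_request(custom_prompt, user_prompt, openapi_summary):
    """Assemble an OpenAI-style chat-completions payload (illustrative only).

    Combines a tool-supplied custom prompt, the user-defined prompt,
    and summarized information from the OpenAPI document.
    """
    messages = [
        {"role": "system", "content": custom_prompt},
        {"role": "user",
         "content": f"{user_prompt}\n\nOpenAPI summary:\n{openapi_summary}"},
    ]
    return {"model": "gpt-4o", "messages": messages}

# The payload would be POSTed to the provider's REST endpoint over HTTPS/TLS
# (e.g. an OpenAI or Azure OpenAI chat-completions URL) with an API key header.
body = json.dumps(build_llm_request(
    "You generate API tests.",               # tool-supplied custom prompt (assumed)
    "Create tests for the /pets endpoint.",  # user-defined prompt
    "GET /pets -> 200: list of Pet",         # summarized definition
))
```

Only the summarized definition is included, not the full OpenAPI document, which keeps the request within the model's context limits.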

The AI Assistant collects summarized information from the Parasoft documentation and includes it, along with the user-defined prompt and SOAtest & Virtualize's custom prompts, in the requests to the LLM. Below are common examples of what is sent to the LLM, depending on what is requested:

  • Summarized definition file plus the user prompt for test creation
  • Summarized field names plus the user prompts for parameterization
  • Default Request payload plus the SOAtest prompt for value filling
  • User prompts for data source generation
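The combinations above can be sketched as a small dispatcher. The task names and placeholder prompt strings here are illustrative assumptions, not the product's actual prompts:

```python
def compose_prompt(task, user_prompt="", context=""):
    """Illustrative mapping of request type to what is sent to the LLM."""
    if task == "test_creation":
        # summarized definition file plus the user prompt
        return f"Definition summary:\n{context}\n{user_prompt}"
    if task == "parameterization":
        # summarized field names plus the user prompts
        return f"Fields: {context}\n{user_prompt}"
    if task == "value_filling":
        # default request payload plus the tool's own prompt (assumed wording)
        return f"Payload: {context}\n<SOAtest value-filling prompt>"
    if task == "data_source":
        # user prompts only
        return user_prompt
    raise ValueError(f"unknown task: {task}")
```

In each case only a summary or a single payload accompanies the prompts, rather than the complete project contents.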

The diagram below illustrates what SOAtest & Virtualize sends to an LLM and how the communication works.