Overview
You can use an LLM with your LLM provider account to assist you in creating tests in SOAtest (see Writing User-Defined Prompts below for tips about writing effective prompts) or to power the AI Assistant. Before you can take advantage of these features, you must configure SOAtest & Virtualize to use your LLM provider account in the LLM Provider preferences. OpenAI, Azure OpenAI, and other LLM providers with similar functionality are supported. See LLM Provider preferences to determine whether your LLM provider can be used.
The SOAtest & Virtualize-AI integration uses the LLM provider's REST API over HTTPS/TLS. For the AI Assistant, each request to the LLM combines summarized information from the Parasoft documentation with the user-defined prompt and SOAtest & Virtualize's custom prompts. For test creation, each request combines summarized information from the user-provided OpenAPI document with the user-defined prompt and SOAtest & Virtualize's custom prompts. The diagram below illustrates what information SOAtest & Virtualize sends to the LLM and how.
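To make the request composition concrete, here is a minimal sketch of how such a chat-style request body could be assembled. This assumes an OpenAI-style chat completions payload; the model name, prompt wording, and helper function are hypothetical illustrations, not SOAtest & Virtualize's actual internals.

```python
import json

def build_llm_request(custom_prompt, context_summary, user_prompt):
    """Assemble a chat-style request body combining the tool's custom
    prompt, summarized context (Parasoft documentation or an OpenAPI
    document), and the user-defined prompt."""
    return {
        "model": "gpt-4o",  # hypothetical model name
        "messages": [
            # The tool's custom prompt frames the task for the LLM.
            {"role": "system", "content": custom_prompt},
            # Summarized context and the user-defined prompt are sent together.
            {"role": "user", "content": f"{context_summary}\n\n{user_prompt}"},
        ],
    }

payload = build_llm_request(
    custom_prompt="You generate API test scenarios.",
    context_summary="Summary of the user-provided OpenAPI document...",
    user_prompt="Generate test scenarios for this banking app.",
)
print(json.dumps(payload, indent=2))
```

The request itself would then be sent to the provider's REST endpoint over HTTPS/TLS.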
Writing User-Defined Prompts
When writing a general user-defined prompt for an app, it can be helpful to tell the LLM what kind of app it is. For example:
Generate test scenarios for this banking app.
You can also write more detailed user-defined prompts for more control over the kinds of test scenarios generated. Here is an example prompt for a CRUD workflow:
Generate 5 test scenarios that follow the pattern of:
1) GET request on a collection
2) POST request to add to the collection
3) GET request on the collection
4) GET request on specifically the newly added item to the collection
5) PUT request on specifically the newly added item to the collection
6) GET request on specifically the newly added item to the collection
7) DELETE request on specifically the newly added item to the collection
8) GET request on the collection
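The eight-step pattern in the prompt above corresponds to the following request sequence. This sketch uses a hypothetical /accounts collection as a placeholder; the actual paths depend on your OpenAPI document.

```python
# Illustration of the 8-step CRUD pattern, with {id} standing in
# for the identifier of the newly added item.
CRUD_PATTERN = [
    ("GET", "/accounts"),          # 1) GET the collection
    ("POST", "/accounts"),         # 2) POST to add to the collection
    ("GET", "/accounts"),          # 3) GET the collection again
    ("GET", "/accounts/{id}"),     # 4) GET the newly added item
    ("PUT", "/accounts/{id}"),     # 5) PUT an update to the new item
    ("GET", "/accounts/{id}"),     # 6) GET the updated item
    ("DELETE", "/accounts/{id}"),  # 7) DELETE the new item
    ("GET", "/accounts"),          # 8) GET the collection once more
]

for step, (method, path) in enumerate(CRUD_PATTERN, start=1):
    print(f"{step}) {method} {path}")
```

A scenario shaped this way verifies both the write operations and, through the interleaved GET requests, that each change is actually reflected in the collection.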