You can use an LLM with your OpenAI, Azure OpenAI, or another LLM provider account to generate recommended fixes for static analysis violations. See Generating a Suggested Fix for more information. You must configure dotTEST to use your LLM provider account in the Parasoft Preferences. See Configuring LLM Provider Settings for details.
Note: Only LLM providers with a chat completions endpoint compatible with the OpenAI REST API are supported.
Note:
- LLM integration in Visual Studio is supported in VS 2019 and later.
- The functionality does not support VB.NET.
The dotTEST-LLM integration uses the LLM provider's REST API over HTTPS/TLS.
To generate a static analysis fix, dotTEST sends the source code of the entire method, part of the method, or the single line related to the static analysis violation, along with the documentation for the static analysis rule that was violated. The diagram below illustrates what dotTEST sends to the LLM provider and how the communication takes place.
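To make the exchange concrete, the sketch below builds the general shape of an OpenAI-compatible chat completions request body of the kind such a provider endpoint accepts. This is a minimal illustration, not dotTEST's actual payload: the prompt wording, the model name, and the rule ID are hypothetical assumptions.

```python
import json

# Hypothetical sketch: shows the general shape of an OpenAI-compatible
# chat completions request carrying violating code plus rule documentation.
# The exact prompt format dotTEST uses is not specified here.
def build_fix_request(model, rule_doc, violating_code):
    """Build a chat completions payload asking for a suggested fix."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": "Suggest a fix for the static analysis violation.",
            },
            {
                "role": "user",
                "content": (
                    f"Rule documentation:\n{rule_doc}\n\n"
                    f"Violating code:\n{violating_code}"
                ),
            },
        ],
    }

payload = build_fix_request(
    model="gpt-4o",                                  # assumed model name
    rule_doc="EXAMPLE.RULE: Avoid empty catch blocks.",  # hypothetical rule
    violating_code="try { Run(); } catch (Exception) { }",
)
# The payload is serialized to JSON and sent over HTTPS to the
# provider's chat completions endpoint.
body = json.dumps(payload)
```

A provider with an OpenAI-compatible endpoint would return the suggested fix in the assistant message of the chat completions response.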