You can use an LLM with your OpenAI, Azure OpenAI, or another LLM provider account to generate recommended fixes for static analysis violations and to power the AI Assistant. See Generating a Suggested Fix and The AI Assistant for more information. Before dotTEST can use your LLM provider account, you must configure it in the Parasoft Preferences. See Configuring LLM Provider Settings to determine whether your LLM provider is supported.

Note: The functionality does not support VB.NET.

The dotTEST-LLM integration communicates with the LLM provider through its REST API over HTTPS/TLS. dotTEST sends the following to the LLM provider:

  • For AI-generated static analysis fixes: the source code of the entire method, part of the method, or the single line related to the static analysis violation, along with documentation regarding the static analysis rule that was violated.
  • For the AI Assistant: summarized information from Parasoft documentation along with a user-defined prompt plus dotTEST's custom prompts. 
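For illustration, the request for an AI-generated fix can be thought of as an HTTPS REST payload that combines the violating code with the rule documentation. The function, field names, model name, and prompt wording below are hypothetical assumptions for the sketch (modeled on a generic OpenAI-style chat completion request); dotTEST's actual prompts and payloads are internal to the product.

```python
import json

def build_fix_request(rule_doc: str, violating_code: str) -> dict:
    """Sketch of an OpenAI-style chat completion payload (assumed shape,
    not dotTEST internals): rule documentation plus the code related to
    the violation, sent to the LLM provider over HTTPS/TLS."""
    return {
        "model": "gpt-4o",  # hypothetical; the model is configured in Parasoft Preferences
        "messages": [
            {"role": "system",
             "content": "You suggest fixes for static analysis violations.\n"
                        "Rule documentation:\n" + rule_doc},
            {"role": "user",
             "content": "Suggest a fix for this code:\n" + violating_code},
        ],
    }

payload = build_fix_request(
    rule_doc="Avoid dereferencing a variable that may be null.",
    violating_code="var name = person.Name;  // person may be null",
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat completion endpoint; only the combined code and documentation shown above leave the machine, which is why the scope of what is sent (a line, part of a method, or a whole method) matters.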

The diagram below illustrates what dotTEST sends to the LLM provider and how the communication takes place.

[Diagram: dotTEST communication with the LLM provider]