You can use an LLM with your OpenAI, Azure OpenAI, or another LLM provider account to help generate recommended fixes for static analysis violations, create more robust unit tests during unit test generation, and improve existing unit tests. See Generating a Suggested Fix, Use AI to fix failing generated tests, Working with Recommendations (the Fix with AI option), and Improving Tests with AI for more information. Before Jtest can use your LLM provider account, you must configure it in the Parasoft Preferences. See Configuring LLM Provider Settings for details.
Note: Only LLM providers with a chat completions endpoint compatible with the OpenAI REST API are supported. The Jtest-LLM integration communicates with the LLM provider's REST API over HTTPS/TLS.
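To make the compatibility requirement concrete, the sketch below builds the general shape of an OpenAI-style chat completions request body. It is illustrative only: the model name, prompts, and endpoint path are assumptions for the example, not the actual payload Jtest sends.

```python
import json

def build_chat_completions_request(model, system_prompt, user_content):
    """Build a JSON body for a POST to the provider's
    /v1/chat/completions endpoint (sent over HTTPS/TLS)."""
    return json.dumps({
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_content},
        ],
    })

# Hypothetical values for illustration only.
body = build_chat_completions_request(
    "gpt-4o",
    "You fix static analysis violations in Java code.",
    "Rule violated at the reported line; suggest a fix.",
)
```

Any provider whose endpoint accepts a body of this shape, and returns the standard chat completions response, satisfies the compatibility requirement stated above.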
The data Jtest sends depends on the feature being used:

- AI-generated static analysis fixes: the source code of the entire method, part of the method, or the single line related to the static analysis violation, along with documentation for the static analysis rule that was violated.
- The Improve [method_name] with AI action: the source code of the test method to improve, the source code of the method under test, and additional information about the Jtest UTA preference settings, such as the configured JUnit version or the parameterization library in use.
- The Fix with AI and Use AI to fix failing generated tests options: the failing test method's source code, the error stack trace of the failed test, and either the entire class under test (if it fits within the token limit) or only the method under test (if the whole class does not fit).

The diagram below illustrates what Jtest sends to the LLM provider and how it communicates with it.
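The class-versus-method fallback described above can be sketched as follows. This is a hypothetical illustration of the decision, not Jtest's actual implementation: the function names and the rough characters-per-token estimate are assumptions.

```python
def estimate_tokens(source: str) -> int:
    # Rough heuristic: roughly 4 characters per token of source code.
    # Real token counts depend on the provider's tokenizer.
    return len(source) // 4

def select_context(class_source: str, method_source: str, token_limit: int) -> str:
    """Prefer sending the whole class under test; fall back to only
    the method under test when the class exceeds the token limit."""
    if estimate_tokens(class_source) <= token_limit:
        return class_source
    return method_source
```

For example, with a token limit of 50, a 100-character class fits and is sent whole, while a 400-character class does not, so only the method under test would be sent.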