
You can use an LLM with your OpenAI, Azure OpenAI, or another LLM provider account to assist you in generating recommended fixes for static analysis violations, creating more robust unit tests during unit test generation, generating improvements for existing unit tests, and powering the AI Assistant.

You must configure your LLM provider account in the Parasoft Preferences before Jtest can use it. See Configuring LLM Provider Settings for details.

Note: Only LLM providers with a chat completions endpoint compatible with the OpenAI REST API are supported. The Jtest-LLM integration communicates with the LLM provider's REST API over HTTPS/TLS. Jtest sends the following to the LLM:

  • For AI-generated static analysis fixes: the source code of the entire method, part of the method, or the single line related to the static analysis violation, along with documentation regarding the static analysis rule that was violated.
  • For the Improve [method_name] with AI action: the source code of the test method to improve, the source code of the method under test, and some additional information about the Jtest UTA preference settings, such as the configured version of JUnit or the parameterization library being used.
  • For the Fix with AI action: the failing test method's source code, the error stack trace for the failed test, and either the entire class under test (if it fits within the token limit) or just the method under test (if the whole class does not fit within the token limit).
  • When the Use AI to enhance generated tests option is enabled while generating a test suite: the types and variable names of parameters for the methods being tested, the names of the classes those parameter variables come from, and calculated constraints or regular expressions that the values must satisfy. When created tests fail during the test creation process, the same data is sent as described for Fix with AI above.
  • For the AI Assistant: summarized information from the Parasoft documentation, combined with the user-defined prompt and Jtest's custom prompts.
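To make the data flow above concrete, the sketch below assembles the kind of OpenAI-compatible chat completions request body such an integration would send. This is a minimal illustration, not Jtest's actual implementation: the model name, prompt contents, and helper names are hypothetical, and real requests would be sent over HTTPS with an Authorization header.

```java
/**
 * Minimal sketch of an OpenAI-compatible chat completions payload.
 * All model names and prompt contents below are hypothetical examples.
 */
public class ChatCompletionsSketch {

    /** Escapes backslashes, quotes, and newlines so the strings are valid JSON. */
    static String esc(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n");
    }

    /** Builds a chat completions request body with one system and one user message. */
    static String buildPayload(String model, String systemPrompt, String userPrompt) {
        return "{"
            + "\"model\":\"" + esc(model) + "\","
            + "\"messages\":["
            + "{\"role\":\"system\",\"content\":\"" + esc(systemPrompt) + "\"},"
            + "{\"role\":\"user\",\"content\":\"" + esc(userPrompt) + "\"}"
            + "]}";
    }

    public static void main(String[] args) {
        // For an AI-generated static analysis fix, the user message would carry
        // the violating code and the rule documentation, as listed above.
        String payload = buildPayload(
            "gpt-4o",  // hypothetical model name
            "You suggest fixes for static analysis violations.",
            "Rule: avoid null dereference.\nCode: user.getName().length();");
        System.out.println(payload);
        // The request body would then be POSTed to the provider's
        // chat completions endpoint over HTTPS/TLS.
    }
}
```

The same request shape applies to the other actions listed above; only the prompt contents change (test source plus stack trace for Fix with AI, parameter types and constraints for enhanced test generation, and so on).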

The diagram below illustrates how Jtest communicates with the LLM provider and what data is exchanged.




