...

You must configure your LLM provider account in the Parasoft Preferences before Jtest can use it. See Configuring OpenAI Settings for details.

Note: Only LLM providers that expose a chat completions endpoint compatible with the OpenAI REST API are supported. The Jtest-LLM integration communicates with the provider's REST API over HTTPS/TLS. Jtest sends the following to the LLM:

  • For AI-generated static analysis fixes: the source code related to the static analysis violation (the entire method, part of the method, or the single affected line), along with documentation for the static analysis rule that was violated.

...