The LLM Provider panel lets you configure the application to use your LLM account to generate recommended fixes for static analysis violations and failing unit tests, suggest improvements for unit tests, and interact with the AI Assistant. This functionality has been tested with GPT-4o, GPT-4o-mini, and GPT-4.1 from OpenAI.
Prerequisites:
LLM providers are supported if they:
Follow these steps to enable the LLM functionality:
In your IDE, click Parasoft in the menu bar and choose Preferences.
Enable: Turns on generating AI-recommended fixes for static analysis violations and failing unit tests, suggesting improvements for unit tests, and interacting with the AI Assistant.
LLMs may produce inaccurate information, which may affect product features that depend on LLMs.
Eclipse users: If you connect to your LLM account through a proxy, your Eclipse proxy settings are used, not your Parasoft proxy settings.
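Before entering your credentials in the panel, it can help to confirm that your API key and endpoint are valid outside the IDE. The following is a minimal sketch, assuming an OpenAI-compatible chat completions API; the endpoint URL, model name, and key placeholder are illustrative, not values prescribed by the product:

```python
import json

# Illustrative values -- substitute the base URL and model from your LLM account.
API_BASE = "https://api.openai.com/v1"
MODEL = "gpt-4o-mini"

def build_chat_request(prompt: str, api_key: str) -> tuple[str, dict, bytes]:
    """Assemble the URL, headers, and JSON body for a chat completions call."""
    url = f"{API_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # key from your provider account
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, headers, body

url, headers, body = build_chat_request("ping", "YOUR_API_KEY")
print(url)  # https://api.openai.com/v1/chat/completions
```

Sending this request with any HTTP client and receiving a successful response indicates the same credentials should work when entered in the LLM Provider panel. Remember that if you go through a proxy, the IDE may apply different proxy settings than your shell does.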