The LLM Provider panel allows you to configure the application to use your LLM account to generate recommended fixes for static analysis violations and failing unit tests, generate improvements for unit tests, and interact with the AI Assistant. The functionality has been tested with GPT-4o, GPT-4o-mini, and GPT-4.1 from OpenAI.

Prerequisites:

Follow these steps to enable the LLM functionality:

  1. In your IDE, click Parasoft in the menu bar and choose Preferences.
  2. Select LLM Provider.
  3. Set the following preferences:
LLMs may produce inaccurate information, which may affect product features that depend on them.
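
Before entering these values in the panel, it can help to confirm that your API key and chosen model respond outside the IDE. The following is a minimal sketch against the public OpenAI chat completions endpoint; the OPENAI_API_KEY environment variable and the gpt-4o-mini model name are assumptions, so substitute your own key and the model you intend to select.

```python
# Minimal connectivity check for an OpenAI-compatible chat completions endpoint.
# Assumes the API key is exported as OPENAI_API_KEY (an assumption, not a
# Parasoft setting) and that MODEL matches the model you plan to configure.
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"
MODEL = "gpt-4o-mini"  # e.g. gpt-4o, gpt-4o-mini, gpt-4.1

payload = json.dumps({
    "model": MODEL,
    "messages": [{"role": "user", "content": "Reply with the single word: ok"}],
}).encode("utf-8")

request = urllib.request.Request(
    API_URL,
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer " + os.environ["OPENAI_API_KEY"],
    },
)

with urllib.request.urlopen(request) as response:
    reply = json.load(response)
    # A successful response confirms the key is valid and the model is available.
    print(reply["choices"][0]["message"]["content"])
```

If this request succeeds, the same key and model name should work when entered in the LLM Provider panel.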

Eclipse users: If you are going through a proxy to connect to your LLM account, your Eclipse proxy settings are used, not your Parasoft proxy settings.
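
If you are unsure whether the LLM host is reachable through that proxy, you can route a request through it manually. The sketch below uses a placeholder proxy URL (proxy.example.com:8080) and the public OpenAI host as assumptions; substitute the proxy host and port configured in your Eclipse preferences.

```python
# Rough check that the LLM host answers when requests go through your proxy.
# PROXY_URL is a placeholder; replace it with your actual proxy host and port.
import urllib.error
import urllib.request

PROXY_URL = "http://proxy.example.com:8080"   # placeholder proxy
API_URL = "https://api.openai.com/v1/models"  # any URL on the LLM provider's host

opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY_URL, "https": PROXY_URL})
)

try:
    with opener.open(API_URL, timeout=10) as response:
        print(f"Reached the LLM host through the proxy (HTTP {response.status})")
except urllib.error.HTTPError as err:
    # Any HTTP status (even 401 without a key) means the proxy forwarded the request.
    print(f"Proxy works; the LLM host answered with HTTP {err.code}")
except OSError as err:
    # DNS failure, refused connection, proxy authentication, timeout, etc.
    print(f"Could not reach the LLM host through the proxy: {err}")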