The LLM Provider panel allows you to configure the application to use your LLM provider account to interact with the AI Assistant. This functionality has been tested with the GPT-4o, GPT-4o-mini, and GPT-4.1 models from OpenAI.

Prerequisites:

  • The "LLM Integration" feature must be enabled in your license.
  • Access to your OpenAI, Azure OpenAI, or other LLM provider account is required.
  • LLM providers are supported if they:

    • Have a chat completions endpoint compatible with the OpenAI REST API.
    • Provide an embedding endpoint that is compatible with the OpenAI REST API (an embedding model with at least an 8,000 token limit is recommended for best performance).
    • Offer a non-reasoning model (reasoning models are not supported).
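To illustrate what "compatible with the OpenAI REST API" means in the requirements above, the following sketch builds the request shapes that an OpenAI-compatible chat completions and embeddings endpoint each expect. The base URL and model names are placeholders for illustration, not actual Parasoft settings.

```python
import json

# Hypothetical base URL for an OpenAI-compatible provider (assumption,
# not a real Parasoft or provider setting).
BASE_URL = "https://api.example.com/v1"

def chat_completions_request(model, messages):
    """Build the payload for an OpenAI-compatible chat completions
    endpoint (POST {BASE_URL}/chat/completions)."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "body": {"model": model, "messages": messages},
    }

def embeddings_request(model, text):
    """Build the payload for an OpenAI-compatible embeddings
    endpoint (POST {BASE_URL}/embeddings)."""
    return {
        "url": f"{BASE_URL}/embeddings",
        "body": {"model": model, "input": text},
    }

chat = chat_completions_request(
    "gpt-4o", [{"role": "user", "content": "Hello"}]
)
emb = embeddings_request("text-embedding-3-small", "Hello")
print(json.dumps(chat, indent=2))
```

A provider whose endpoints accept these request shapes (and return the corresponding OpenAI-style responses) meets the compatibility requirement; the embedding model should additionally support at least an 8,000 token input limit for best performance.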

...

  1. In your IDE, click Parasoft in the menu bar and choose Options (Visual Studio) or Preferences (Eclipse).

  2. Select LLM Provider.
  3. Set the following preferences:

...

Note
LLMs may produce inaccurate information, which may affect product features dependent on LLMs.
Note: Proxy Settings

Eclipse users: If you are going through a proxy to connect to your LLM account, your Eclipse proxy settings are used, not your Parasoft proxy settings.