The LLM Provider panel allows you to configure the application to use your LLM account to generate recommended fixes for static analysis violations and failing unit tests, as well as improvements for unit tests. Functionality has been tested on GPT-4o and GPT-4o-mini from OpenAI.

Prerequisites:

  • The "LLM Integration" feature must be enabled in your license.
  • Access to your OpenAI, Azure OpenAI, or other LLM provider account is required.
    Note: Only LLM providers with a chat completions endpoint compatible with the OpenAI REST API are supported.
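To illustrate what "compatible with the OpenAI REST API" means, the sketch below builds the kind of request a compatible chat completions endpoint must accept. The base URL, model name, and helper function are placeholders for illustration, not part of the product:

```python
import json

# Example base URL only; an "Other" provider would substitute its own.
BASE_URL = "https://api.openai.com/v1"

def build_chat_request(model, user_message):
    """Build the endpoint URL and JSON body for an OpenAI-style
    chat completions call (hypothetical helper for illustration)."""
    url = f"{BASE_URL}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, json.dumps(body)

url, body = build_chat_request("gpt-4o", "Hello")
print(url)   # .../chat/completions
print(body)  # {"model": "gpt-4o", "messages": [...]}
```

A provider whose chat endpoint accepts this URL shape and JSON body (and returns the matching response format) is what the note above refers to.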

Follow these steps to enable the LLM functionality:

  1. In your IDE, click Parasoft in the menu bar and choose Preferences.

  2. Select LLM Provider.
  3. Set the following preferences:
  • Enable: Enables generating AI-recommended fixes for static analysis violations and failing unit tests, as well as improvements for unit tests.

  • Provider: Choose OpenAI, Azure OpenAI, or Other.
  • Configuration (OpenAI)
    • API key: Your OpenAI token.
    • Organization ID: (Optional) The Organization ID you want to use when accessing the OpenAI client. If no Organization ID is supplied, the default organization ID for your account will be used.
    • Test Connection: Click to test the connection to OpenAI.
    • Model > Chat: The name of your OpenAI model, for example gpt-4o.
    • Model > Embedding: (Optional) The name of your OpenAI embedding model.
  • Configuration (Azure OpenAI)
    • Resource Name: The Azure resource that contains your deployed Azure OpenAI model.
    • API key: Your Azure OpenAI token.
    • Deployment ID > Chat: The deployment name of your Azure OpenAI model.
    • Deployment ID > Embedding: (Optional) The deployment name of your Azure OpenAI embedding model.
    • Test Connection: Click to test the connection to Azure OpenAI.
  • Configuration (Other)
    • Base URL: The URL of the location where your LLM model is deployed.
    • API key / Access token: Your LLM provider token.
    • Model > Chat: The name of your LLM model.
    • Model > Embedding: (Optional) The name of your LLM embedding model.
    • Test Connection: Click to test the connection to the LLM provider.
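The three configurations above map onto different endpoint URLs and authentication headers. The following sketch shows that mapping under common conventions: OpenAI and OpenAI-compatible providers take a Bearer token, while Azure OpenAI uses an api-key header and a deployment-based URL. All resource, deployment, key, and api-version values are placeholder examples; check your provider's documentation for the exact values to enter in the panel:

```python
def endpoint_and_headers(provider, cfg):
    """Return the chat completions URL and auth headers for a provider
    configuration (hypothetical helper for illustration)."""
    if provider == "openai":
        url = "https://api.openai.com/v1/chat/completions"
        headers = {"Authorization": f"Bearer {cfg['api_key']}"}
        if cfg.get("organization_id"):
            # Optional Organization ID travels in its own header.
            headers["OpenAI-Organization"] = cfg["organization_id"]
    elif provider == "azure":
        # Azure OpenAI addresses a named deployment on your resource and
        # requires an api-version query parameter (example value shown).
        url = (f"https://{cfg['resource_name']}.openai.azure.com/openai/"
               f"deployments/{cfg['deployment_id']}/chat/completions"
               f"?api-version={cfg['api_version']}")
        headers = {"api-key": cfg["api_key"]}
    else:  # "other": any OpenAI-compatible endpoint at a custom base URL
        url = f"{cfg['base_url'].rstrip('/')}/chat/completions"
        headers = {"Authorization": f"Bearer {cfg['api_key']}"}
    return url, headers

url, headers = endpoint_and_headers("azure", {
    "resource_name": "my-resource", "deployment_id": "my-gpt4o",
    "api_version": "2024-02-01", "api_key": "<key>"})
print(url)
```

Test Connection performs a check against the endpoint resolved from these settings, which is why Resource Name and Deployment ID matter for Azure OpenAI while Base URL matters for Other.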
Note: LLMs may produce inaccurate information, which may affect product features that depend on them.

Proxy Settings

Eclipse users: If you are going through a proxy to connect to your LLM account, your Eclipse proxy settings are used, not your Parasoft proxy settings.
