The OpenAI panel allows you to configure the application to use your OpenAI or Azure OpenAI account to generate recommended fixes for static analysis violations and improvements for unit tests. Functionality has been tested with GPT-3.5 Turbo and GPT-4.

Prerequisites

  • The "LLM Integration" feature must be enabled in your license.
  • Access to your OpenAI or Azure OpenAI account is required.

Follow these steps to enable the OpenAI functionality:

  1. In your IDE, click Parasoft in the menu bar and choose Preferences.
  2. Select OpenAI.
  3. Set the following preferences:
    • Enable: Enables generating AI-recommended fixes for static analysis violations and improvements for unit tests.
    • Provider: Choose between OpenAI and Azure OpenAI.
    • Configuration (OpenAI) (see the sketch after this list)
      • API key: The API key for your OpenAI account.
      • Organization ID: Optional. The organization ID to use when accessing the OpenAI API. If no organization ID is supplied, the default organization ID for your account is used.
      • Test Connection: Click to test the connection to OpenAI.
      • Model > Name: The name of the OpenAI model you are using, for example gpt-3.5-turbo-16k or gpt-4.
    • Configuration (Azure OpenAI) (see the sketch after this list)
      • Resource Name: The Azure resource that contains your deployed Azure OpenAI model.
      • Deployment ID: The deployment name of your Azure OpenAI model.
      • API key: The API key for your Azure OpenAI resource.
      • Test Connection: Click to test the connection to Azure OpenAI.
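
For reference, the sketch below shows what the OpenAI preferences correspond to when calling the OpenAI API directly with the official Python SDK. It is an illustration only, not how the plugin itself connects; the key, organization ID, and model name are placeholders.

    # Minimal sketch mapping the OpenAI panel preferences to SDK parameters.
    # All values are placeholders.
    from openai import OpenAI

    client = OpenAI(
        api_key="sk-...",          # API key preference
        organization="org-...",    # Organization ID preference (optional)
    )

    # Model > Name preference, e.g. gpt-3.5-turbo-16k or gpt-4
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "ping"}],
        max_tokens=1,
    )
    print(response.choices[0].message.content)  # rough equivalent of Test Connection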

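The same mapping for the Azure OpenAI preferences is sketched below, again for illustration only. The Resource Name, Deployment ID, and key are placeholders, and the api_version shown is an assumption that may differ in your Azure environment.

    # Minimal sketch mapping the Azure OpenAI panel preferences to SDK parameters.
    # All values are placeholders.
    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_key="<azure-openai-key>",                               # API key preference
        azure_endpoint="https://<resource-name>.openai.azure.com",  # Resource Name preference
        api_version="2024-02-01",                                   # assumed API version
    )

    # With Azure OpenAI, the deployment name is passed where a model name would go.
    response = client.chat.completions.create(
        model="<deployment-id>",  # Deployment ID preference
        messages=[{"role": "user", "content": "ping"}],
        max_tokens=1,
    )
    print(response.choices[0].message.content)
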
Note: OpenAI may produce inaccurate information, which can affect product features that depend on it.

Proxy Settings

Eclipse users: If you connect to your OpenAI or Azure OpenAI account through a proxy, your Eclipse proxy settings are used, not your Parasoft proxy settings.
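
If you want to verify proxy connectivity outside the IDE, the sketch below shows one way to do so with the OpenAI Python SDK, whose default HTTP client honors the standard proxy environment variables. This is an illustration only and does not change how the plugin resolves proxies inside Eclipse; the proxy address and key are placeholders.

    # Illustration only: route a direct SDK call through a proxy by setting the
    # standard HTTPS_PROXY environment variable (honored by the SDK's default
    # HTTP client). The proxy URL and API key are placeholders.
    import os
    from openai import OpenAI

    os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"
    client = OpenAI(api_key="sk-...")
    client.models.list()  # simple call to confirm the proxied connection works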
