The LLM Provider panel allows you to configure the application to use your own LLM account to interact with the AI Assistant. This functionality has been tested with the GPT-4o, GPT-4o-mini, and GPT-4.1 models from OpenAI.
Prerequisites:
- The "LLM Integration" feature must be enabled in your license.
- Access to an OpenAI, Azure OpenAI, or other LLM provider account is required.
LLM providers are supported if they:
- Have a chat completions endpoint compatible with the OpenAI REST API.
- Provide an embedding endpoint that is compatible with the OpenAI REST API (an embedding model with at least an 8,000 token limit is recommended for best performance).
- Provide a chat model that is not a reasoning model (reasoning models are not supported).
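To make the compatibility requirements concrete, a provider must accept requests shaped like the OpenAI chat completions and embeddings calls. The following sketch builds (but does not send) both request shapes; the base URL, key, and model names are placeholders for illustration, not values from this product:

```python
import json

# Placeholder values for illustration only.
BASE_URL = "https://api.example.com/v1"
API_KEY = "your-api-key"

def chat_request(model, messages):
    """Request shape expected by an OpenAI-compatible chat completions endpoint."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

def embedding_request(model, text):
    """Request shape expected by an OpenAI-compatible embeddings endpoint."""
    return {
        "url": f"{BASE_URL}/embeddings",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "input": text}),
    }

req = chat_request("gpt-4o", [{"role": "user", "content": "Hello"}])
print(req["url"])
```

A provider that accepts both request shapes at its own base URL satisfies the endpoint requirements above.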
Follow these steps to enable the LLM functionality:
- In your IDE, click Parasoft in the menu bar and choose Preferences.
- Select LLM Provider.
- Set the following preferences:
- Enable: Enables interaction with the AI Assistant.
- Provider: Choose OpenAI, Azure OpenAI, or Other.
- Configuration (OpenAI)
- API key: Your OpenAI token.
- Organization ID: (Optional) The organization ID to use when accessing the OpenAI API. If no organization ID is supplied, the default organization for your account is used.
- Test Connection: Click to test the connection to OpenAI.
- Model > Chat: The name of your OpenAI model, for example gpt-4o.
- Model > Embedding: (Optional) The name of your OpenAI embedding model.
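For reference, OpenAI transmits the organization ID as a separate HTTP header alongside the API key, which is why the field is optional. A minimal sketch (the header names are OpenAI's; the key and organization values are placeholders):

```python
def openai_headers(api_key, organization_id=None):
    """HTTP headers for OpenAI requests; the organization header is optional."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    if organization_id:
        # Omitting this header makes OpenAI use the account's default organization.
        headers["OpenAI-Organization"] = organization_id
    return headers

print(sorted(openai_headers("sk-placeholder", "org-placeholder")))
```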
- Configuration (Azure OpenAI)
- Resource Name: The Azure resource that contains your deployed Azure OpenAI model.
- API key: Your Azure OpenAI token.
- Deployment ID > Chat: The deployment name of your Azure OpenAI model.
- Deployment ID > Embedding: (Optional) The deployment name of your Azure OpenAI embedding model.
- Test Connection: Click to test the connection to Azure OpenAI.
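For context, Azure OpenAI derives the request URL from the resource name and deployment ID, which is why both settings are required. A sketch of that URL shape (the api-version value is an assumption; use the version your deployment supports):

```python
def azure_chat_url(resource_name, deployment_id, api_version="2024-02-01"):
    """Build an Azure OpenAI chat completions URL from resource and deployment.

    The api-version default is an assumption for illustration.
    """
    return (
        f"https://{resource_name}.openai.azure.com"
        f"/openai/deployments/{deployment_id}/chat/completions"
        f"?api-version={api_version}"
    )

print(azure_chat_url("my-resource", "my-gpt4o-deployment"))
```

Note that Azure OpenAI authenticates with an `api-key` header rather than the `Authorization: Bearer` header used by OpenAI.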
- Configuration (Other)
- Base URL: The base URL of the endpoint where your LLM is deployed.
- API key / Access token: Your LLM provider token.
- Model > Chat: The name of your LLM model.
- Model > Embedding: (Optional) The name of your LLM embedding model.
- Test Connection: Click to test the connection to the LLM provider.
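A connection test for a custom provider can be pictured as a minimal chat completion sent against the configured base URL. The sketch below builds such a request without sending it; the URL, token, and model name are placeholders, and this is an illustration of the general idea, not this product's implementation:

```python
import json
import urllib.request

def build_test_request(base_url, api_key, model):
    """Build a minimal one-token chat completion request for a connection check."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_test_request("https://llm.example.com/v1", "placeholder-token", "my-model")
print(req.get_method(), req.full_url)
```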
LLMs may produce inaccurate information, which can affect product features that depend on them.
Proxy Settings
If you connect to your LLM account through a proxy, your Eclipse proxy settings are used, not your Parasoft proxy settings.