The LLM Provider panel allows you to configure the application to use your LLM account to interact with the AI Assistant. The functionality has been tested on GPT-4o, GPT-4o-mini and GPT-4.1 from OpenAI.
Prerequisites:
- The "LLM Integration" feature must be enabled in your license.
- Access to your OpenAI, Azure OpenAI or another LLM provider account is required.
LLM providers are supported if they:
- Have a chat completions endpoint compatible with the OpenAI REST API.
- Provide an embedding endpoint that is compatible with the OpenAI REST API (an embedding model with at least an 8,000 token limit is recommended for best performance).
- Offer a model that is not a reasoning model (reasoning models are not supported).
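To illustrate what "compatible with the OpenAI REST API" means in practice, the sketch below builds the minimal request bodies a compatible provider must accept at its `/chat/completions` and `/embeddings` endpoints. The base URL and model names are placeholders; substitute your own provider's values.

```python
import json

# Hypothetical base URL; replace with your provider's endpoint.
BASE_URL = "https://api.openai.com/v1"

# Minimal OpenAI-style chat completions payload that a compatible
# provider must accept at POST {BASE_URL}/chat/completions.
chat_request = {
    "model": "gpt-4o-mini",  # example model name; yours may differ
    "messages": [{"role": "user", "content": "Hello"}],
}

# Minimal OpenAI-style embeddings payload that a compatible
# provider must accept at POST {BASE_URL}/embeddings.
embedding_request = {
    "model": "text-embedding-3-small",  # example embedding model
    "input": "Hello",
}

print(json.dumps(chat_request))
print(json.dumps(embedding_request))
```

If both requests succeed against your provider (with a valid API key in the `Authorization` header), the provider meets the compatibility requirements listed above.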
...
- In your IDE, click Parasoft in the menu bar and choose Options (Visual Studio) or Preferences (Eclipse).
- Select LLM Provider.
- Set the following preferences:
...
Note: LLMs may produce inaccurate information, which may affect product features that depend on LLMs.
Note: Eclipse users: If you connect to your LLM account through a proxy, your Eclipse proxy settings are used, not your Parasoft proxy settings.