If you have a license for the complete Parasoft Development Testing Platform (DTP) functionality, your installation includes the components listed in the following table (see also Upgrading DTP from Standard to Enterprise Edition). Contact your Parasoft representative if you are interested in upgrading your license.
Component | Description |
---|---|
Report Center | Provides end-to-end SDLC process visibility and control. |
Project Center | Facilitates project planning, tracking, and analysis. |
Policy Center | Sets management expectations and monitors compliance. See Connecting to Policy Center. |
License Server | Allows administrators to add and manage licenses. |
Team Server | Enables centralized administration and sharing of team artifacts. |
User Administration | Allows administrators to grant and manage user permissions. |
You can access Report Center, Project Center, and Policy Center from the Centers menu.
If you have administration permissions, you can access the servers and administration pages from the Administration menu.
Additional Components
This documentation focuses on Report Center and the other server-based components of DTP, but Parasoft also provides code analysis and test execution components for the DTP ecosystem, as well as desktop plugins that communicate with DTP.
DTP Engines
The DTP Engines for C/C++, Java, and .NET analyze code, execute tests, measure coverage, and perform other quality tasks. Extensions for using open source analyzers and tools are available in the Parasoft Marketplace: http://marketplace.parasoft.com.
The DTP Engines execute code analysis as part of the build process. The engines and third-party analyzers generate local HTML/XML files and publish details to DTP for aggregation, reporting, and analysis.
DTP IDE Plugins
Parasoft provides plugins for popular IDEs, such as Visual Studio, Eclipse, NetBeans, and IntelliJ. The plugins enable you to integrate DTP Engines and analyzers from the Marketplace into the IDE for local GUI-based code analysis. They also enable you to retrieve findings that have been processed by DTP and import them into the IDE; the DTP Engines do not need to be installed in the IDE for you to download and import DTP findings. Additionally, you can configure the IDE plugins to download and import only findings with specific prioritization or metadata.
The DTP Workflow
You can integrate DTP into your own development processes, but the following workflow describes the typical implementation.
Integrating DTP Engines with the Build
The DTP Engines ship with plugins for integration with your build tools (e.g., Maven, Ant, Gradle, MSBuild, make). These integrations allow you to analyze code and send data to DTP automatically as part of automated builds and continuous integration (CI).
The engines also ship with plugins and integrations for CI infrastructure (e.g., Jenkins, TeamCity). These integrations are also available in the Parasoft Marketplace (http://marketplace.parasoft.com/).
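As a rough illustration of how such an integration can hook into a CI job, the following Python sketch shells out to a placeholder engine command line and fails the build step if the analysis does not complete. The executable name, flags, and settings file are assumptions for illustration only; consult the documentation for your specific engine and build plugin for the actual invocation.

```python
"""Minimal CI wrapper sketch (hypothetical names throughout)."""
import subprocess
import sys

# Placeholder invocation: DTP Engines are typically driven by a test
# configuration and a settings file that points at your DTP server.
ENGINE_CMD = [
    "dtp_engine_cli",                          # hypothetical executable name
    "-config", "builtin://Recommended Rules",  # hypothetical analysis configuration
    "-settings", "dtp.properties",             # hypothetical settings file (DTP host, project, etc.)
    "-report", "reports/",                     # local HTML/XML report output directory
    "-publish",                                # hypothetical flag: send results to DTP
]

result = subprocess.run(ENGINE_CMD)
if result.returncode != 0:
    # Surface the failure so the CI server (Jenkins, TeamCity, etc.) marks the build as failed.
    sys.exit(f"Code analysis step failed with exit code {result.returncode}")
```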
Capturing Observations
When DTP Engines execute analysis, they capture large amounts of detailed data about the code, called “observations.” Observations include code quality data, such as static analysis violations and unit test failures, as well as logistical information about the code, such as authorship, scope, and source control location.
Converting Data into Findings
When observations are sent to DTP, they are converted into “findings” and stored in the database. Findings are observations that have been analyzed, normalized, and aggregated into actionable data.
If the DTP Engines are configured to include source code information, DTP retrieves the source code and presents it to the user when viewing the normalized findings in the browser-based reporting interface (e.g., Prioritization View, Tests Explorer, Metrics Explorer).
The DTP Engines can also send copies of analyzed source code to the DTP server for display if DTP is unable to access the code from the source control system. Reasons for enabling access to source files using this approach include:
- Security or networking constraints
- The code is generated and not stored in source control
Working with Findings and Applying Additional Metadata
The reporting interface enables you to review, navigate, and filter findings. You can also apply additional metadata; for example, you can:
- Assign static analysis and flow analysis violations to team members for remediation
- Set due dates for remediation
- Set references to external systems, such as a defect tracking system
- Change the priority level of findings
- Set Risk/Impact categorizations
You can also leverage the DTP REST API to extract details about findings for integration with external systems (see the sketch after this list) and apply analysis flows (slices) configured within the Process Intelligence Engine (PIE). PIE slices can be triggered on demand or through event-based triggers. Examples of PIE slice applications include:
- Generating derived data, such as risk in the application, using data available within DTP and other systems of record
- Triggering workflows in external systems, such as creating work items or defects (e.g., in JIRA)
- Automatically applying metadata based on heuristics defined in your policy
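The following Python sketch illustrates the kind of REST extraction described above. The endpoint path, query parameters, and response fields are assumptions made for the example, not the documented DTP API contract; check the API documentation shipped with your DTP version for the actual endpoints.

```python
"""Hypothetical sketch: pull finding details from a DTP REST endpoint."""
import requests

DTP_URL = "https://dtp.example.com:8443"   # placeholder DTP server address
AUTH = ("api-user", "api-password")        # placeholder credentials

# Placeholder endpoint and filters: findings for one project at the highest severity.
response = requests.get(
    f"{DTP_URL}/api/findings",             # hypothetical endpoint path
    params={"project": "MyProject", "severity": 1},
    auth=AUTH,
)
response.raise_for_status()

for finding in response.json():
    # Field names are illustrative; map them to your defect-tracking or reporting schema.
    print(finding.get("id"), finding.get("rule"), finding.get("file"))
```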
Importing DTP Findings to the Developer Desktop
After findings are processed in DTP, developers can import them directly into their IDEs for remediation using a DTP IDE Plugin. Findings should be prioritized and filtered so that only tasks relevant to the developer are imported. If the developers also have DTP Engines installed and licensed, they can address the findings and reanalyze the code locally before committing to source control.
Continuing the Cycle
When developers check their code back into source control, the continuous integration process picks up the change and the workflow repeats. This ensures that defects are detected early and prevented from becoming software bugs later in the development process, when the cost of remediation is much higher. As a result, Parasoft DTP facilitates continuous testing, enabling you to accelerate the SDLC while ensuring the safety, security, and reliability of your applications.