Introduction
CodeBeamer ALM is a popular browser-based platform for managing requirements. Parasoft DTP integrates with codeBeamer ALM, providing the following functionality:
- Ability to manually create bugs and tasks in codeBeamer ALM from the Violations Explorer view.
- Ability to manually create bugs and tasks in codeBeamer ALM from the Test Explorer view.
- Ability to send, view, and update Parasoft test results in codeBeamer work items (Sending Test Data to CodeBeamer).
- Traceability from codeBeamer requirements to tests, static analysis results, and code reviews (see Viewing the Traceability Report).
Requirements
The following requirements are only applicable if you are going to send test results to codeBeamer.
- Tests executed by the following Parasoft tools are supported:
- C/C++test Professional, dotTEST, or Jtest 10.4.3 or later
- Selenic 2020.1 or later
- SOAtest 9.10.8 or later
- You should have already created requirements in codeBeamer.
Configuration
The configuration is performed by the Parasoft administrator and only needs to be set up once. Developers, testers, and other DTP end users should review the Usage section for instructions on how to use Parasoft with codeBeamer ALM.
Connecting DTP to CodeBeamer ALM Server
- Choose Report Center Settings from the settings (gear icon) drop-down menu.
- Choose External System and choose codeBeamer from the System type drop-down menu.
- Enable the Enabled option.
- Enter a name for your instance of codeBeamer ALM in the Name field. The name is required, but it does not affect the connection settings and is not rendered in any other interfaces.
- Enter the URL for the codeBeamer server in the Application URL field. The URL should include the protocol, host, and port number. Do not include paths or parameters.
- The Display URL field is rendered in DTP interfaces when links to your codeBeamer system are created. This URL should include additional paths that may be necessary to access codeBeamer in a browser.
- Enter login credentials in the Username and Password/API token fields. The login must have sufficient privileges to create issues in the codeBeamer projects specified in the Project Associations section.
- Click Test Connection to verify your settings and click Save.
Associating Parasoft Projects with CodeBeamer ALM Projects
Associating a Parasoft project with a codeBeamer project enables you to create defects from the Violations or Test Explorer views that are linked to the correct project in codeBeamer. The association is also important when using the Sending Test Data to External System flow. You can associate multiple DTP projects with a project in codeBeamer, but you cannot associate the same DTP project with more than one codeBeamer project.
- Click Create Project Association and choose a project from the DTP Project drop-down menu in the overlay.
- Enter the name of a codeBeamer project in the External Project field and click Create to save the association.
Click the trash icon to remove a project association. Deleting the project association does not remove links from DTP explorer views to defects in codeBeamer. If an association is deleted and recreated later, existing links between violations and codeBeamer issues will be reactivated.
You can reconfigure an existing association between DTP and codeBeamer projects:
- Click the pencil icon and choose either a different DTP project from the drop-down menu or specify the name of a different codeBeamer project in the External Project field.
- Click Save.
Enabling the Requirements Traceability Report
You can configure DTP to generate widgets and reports that help you demonstrate traceability between the requirements stored in codeBeamer and the test, static analysis, and build review data sent to DTP from Parasoft tools (C/C++test, dotTEST, Jtest).
If you want the Traceability Report to include code review and static analysis information, you must associate your source code files with requirements stored in codeBeamer. See Associating Requirements with Files for instructions on enabling this optional feature.
DTP interfaces that display and track traceability are enabled by deploying the External System Traceability Report artifact shipped with the Traceability Pack. The Traceability Pack also includes the Sending Test Data to External System flow, which automates part of the requirements traceability workflow. Refer to the Traceability Pack documentation for instructions on how to install the pack.
Use DTP Extension Designer to deploy the External System Traceability Report and the Sending Test Data to External System flow to your environment. Verify that DTP is connected to codeBeamer as described in the Connecting DTP to CodeBeamer ALM Server section before deploying the artifacts.
Installing the Traceability Pack
The first step is to install the Traceability Pack artifact. The artifact is a collection of configuration files and assets that enable traceability.
- Choose Extension Designer from the settings menu (gear icon).
- Click the Configuration tab to open Artifact Manager.
- Click Upload Artifact and browse for the traceability-pack-<version>.zip archive (also see Downloading and Installing Artifacts).
- Click Install. A collection of assets and configuration files for enabling traceability will be installed.
Deploying the Traceability Report
Deploy the report components to your DTP environment after installing the Traceability Pack.
- Open Extension Designer and click on the Services tab.
- Choose an existing service to deploy the artifact or create a new service in the DTP Workflows category. Refer to Working with Services for additional information on organizing services and artifacts.
- If you are adding the artifact to an existing service, add a new Flow tab (see Working with Flows) and choose Import from the ellipses menu.
- Choose Library > Workflows > Traceability Pack > External System Traceability Report and click Import.
- Click inside the Flow tab to drop the nodes into the service and click Deploy.
Deploying the External System Traceability Report adds new widgets to Report Center, as well as a drill-down report. See Viewing the Traceability Report for instructions on adding the widgets and viewing the report.
Deploying the Sending Test Data to External System Flow
This artifact sends test data to codeBeamer when DTP Data Collector retrieves test results from a Parasoft tool. This artifact ships with the Traceability Pack, which must be installed as described in Installing the Traceability Pack before deploying the flow.
- Open Extension Designer and click on the Services tab.
- Choose an existing service to deploy the artifact or create a new service in the DTP Workflows category. Refer to Working with Services for additional information on organizing services and artifacts.
- If you are adding the artifact to an existing service, add a new Flow tab (see Working with Flows) and choose Import from the ellipses menu.
- Choose Library > Workflows > Traceability Pack > Sending Test Data to External System and click Import.
- Click inside the Flow tab to drop the nodes into the service and click Deploy.
Advanced Configuration
You can modify the ExternalSystemSettings.properties configuration file located in the <DTP_DATA_DIR>/conf directory to change the default behavior of the integration. DTP's out-of-the-box integration with codeBeamer is configured to use default or commonly-used fields and work item types. If you customized your codeBeamer system, however, then you can configure the following settings to align data in DTP with your custom configuration.
Property | Description
---|---
`codeBeamer.import.chunkSize` | Specifies the maximum number of test case results sent to codeBeamer in a single run. Default:
`codeBeamer.tracker.requirements` | Specifies the name of a tracker in codeBeamer that contains requirements. This enables you to associate a custom requirements tracker you may have configured in codeBeamer with test results in DTP. The configured tracker also determines which requirements are presented in the Traceability report. "Information" and "Folder" requirement types are ignored in DTP reports. Default:
`codeBeamer.tracker.test` | Specifies the name of a tracker in codeBeamer that contains test cases. This enables you to associate custom test case trackers you may have configured in codeBeamer with test results in DTP. Parasoft creates a Verifies relationship between a test case and a requirement in codeBeamer, so make sure that the Verifies option is enabled for the specific test case and requirements trackers in your codeBeamer configuration. Default:
`codeBeamer.workItemType.bug.status` | Specifies the status of bugs created in codeBeamer when creating work items in the DTP Violations Explorer and Test Explorer views. Default:
`codeBeamer.workItemType.bug` | Specifies the type of work item to create in codeBeamer when creating new bugs from the DTP Violations Explorer and Test Explorer. This enables you to associate custom bug trackers you may have configured in codeBeamer with work items created from DTP. By default, the property is not set; as a result, bug work items created in DTP are associated with bug work items in codeBeamer.
`codeBeamer.workItemType.task.status` | Specifies the status of tasks created in codeBeamer when creating work items in the DTP Violations Explorer and Test Explorer views. Default:
`codeBeamer.workItemType.task` | Specifies the type of work item to create in codeBeamer when creating new tasks from the DTP Violations Explorer and Test Explorer. This enables you to associate custom task trackers you may have configured in codeBeamer with work items created from DTP. By default, the property is not set; as a result, task work items created in DTP are associated with task work items in codeBeamer.
`codeBeamerIssueUrl` | Specifies the URL template for linking work items created in the DTP Violations Explorer and Test Explorer to work items in codeBeamer. Default:
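For reference, a minimal ExternalSystemSettings.properties fragment is sketched below. The property names come from the table above; the values are illustrative assumptions (common codeBeamer tracker names and statuses), not the shipped defaults, so adjust them to match your codeBeamer configuration.

```properties
# Illustrative values only -- replace with the trackers, types, and
# statuses configured in your codeBeamer instance.
codeBeamer.import.chunkSize=100
codeBeamer.tracker.requirements=System Requirement Specifications
codeBeamer.tracker.test=Test Cases
codeBeamer.workItemType.bug=Bugs
codeBeamer.workItemType.bug.status=New
codeBeamer.workItemType.task=Tasks
codeBeamer.workItemType.task.status=New
```

Restart DTP services after editing the file so the changes take effect, as is typical for `<DTP_DATA_DIR>/conf` settings.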
Usage
After configuring the integration with codeBeamer ALM, developers, testers, and other users can leverage the functionality enabled by the integration.
Manually Creating Bugs and Tasks in CodeBeamer ALM
The Test Explorer and Violations Explorer views enable you to create bugs and tasks for any test and violation, respectively, regardless of status. Refer to the following sections for details on creating codeBeamer assets in explorer views:
- See Creating an Issue in a Third-party System for instructions on how to manually create bugs and tasks in codeBeamer ALM from the Violations Explorer view.
- See Creating an Issue in a Third-party System for instructions on how to manually create bugs and tasks in codeBeamer ALM from the Test Explorer view.
Sending Test Data to CodeBeamer
The following diagram shows how you could implement an automated infrastructure for integrating Parasoft DTP and Parasoft test execution tools into your codeBeamer ALM environment:
- Create items in codeBeamer trackers. The items will be associated with tests executed by Parasoft tools. You can create requirements (i.e., items in a System Requirement Specifications tracker) or test cases (i.e., items in a Test Cases tracker), for example.
- In your test file, add the codeBeamer test case or requirement IDs using the `@test` or `@req` annotation. Refer to your Parasoft tool documentation for details on adding annotations.
  - Use the `@test <codeBeamer test ID>` annotation to associate tests with items in a codeBeamer Test Cases tracker.
  - Use the `@req <codeBeamer System Requirements Specification ID>` annotation to associate tests with items in a codeBeamer System Requirements Specifications tracker.
  - You can get the work item ID from various parts of the codeBeamer interface, such as the URL:
- Execute your tests as part of the CI process. You can also manually execute the tests from the IDE.
- As part of the test execution, Parasoft test execution tools will tag the results with the filter and build IDs and send the data to DTP. You can verify the results in DTP by adding Test Widgets to your DTP dashboard and setting the filter and build ID. Developers can download the test execution data from DTP into their IDEs so that they can address any failed tests.
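The annotation step above can be sketched in a Java test file. This is a minimal illustration, assuming Jtest-style `@test`/`@req` tags placed in the Javadoc comment above each test method; the IDs (12345, 67890) and the method names are placeholders, not real codeBeamer work items.

```java
// Sketch of traceability annotations in a Java test file (Jtest-style).
// The @test / @req tags go in the Javadoc comment above each test method;
// the IDs below are placeholders for real codeBeamer work item IDs.
public class BagTest {

    // Trivial method under test, included so the sketch is self-contained.
    static int sum(int a, int b) {
        return a + b;
    }

    /**
     * Linked to an item in a codeBeamer Test Cases tracker.
     * @test 12345
     */
    public void testBagSumAdd() {
        if (sum(2, 3) != 5) throw new AssertionError("sum failed");
    }

    /**
     * Linked to an item in a codeBeamer System Requirement Specifications tracker.
     * @req 67890
     */
    public void testBagSimpleAdd() {
        if (sum(1, 1) != 2) throw new AssertionError("sum failed");
    }

    public static void main(String[] args) {
        BagTest t = new BagTest();
        t.testBagSumAdd();
        t.testBagSimpleAdd();
    }
}
```

When the tool executes these tests and reports to DTP, the annotations are what allow DTP to match each result to the corresponding codeBeamer work item.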
If you deployed the Sending Test Data to External System flow (see Deploying the Sending Test Data to External System Flow), then unit and functional testing results will automatically be sent to codeBeamer when Data Collector receives the data from the Parasoft tool. By default, the flow forwards unit and functional test results that were received by Data Collector for any project, but you can configure the flow to only send data for a specific project (see Sending Results from a Specific DTP Project).
You can also manually send a POST request to the DTP REST API endpoint to send results from the DTP database to codeBeamer. Pass the DTP filter and build IDs as URL parameters in the API call:

```
curl -X POST -u <username>:<password> "http://<host>:<port>/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=<filterID>&buildId=<buildID>"
```
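The same request can be issued programmatically. Below is a minimal Java sketch of the call; the host, port, credentials, filter ID, and build ID are all placeholders you would replace with your own values.

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch: issue the syncTestCases POST from Java instead of curl.
public class SyncTestCases {

    // Builds the endpoint URL documented above from its parts.
    static String buildSyncUrl(String host, int port, int filterId, String buildId) {
        return "http://" + host + ":" + port
                + "/grs/api/v1.7/linkedApps/configurations/1/syncTestCases"
                + "?filterId=" + filterId + "&buildId=" + buildId;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- substitute your DTP host and credentials.
        String url = buildSyncUrl("dtp.example.com", 8080, 101, "nightly-2024-06-01");
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        String credentials = Base64.getEncoder()
                .encodeToString("username:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + credentials);
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```

The response body contains the created/updated/ignored summary shown later in this section.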
The filter and build IDs are available in the Test Explorer URL:
DTP will locate the test results that match the `filterId` and `buildId` parameters and send the data to the codeBeamer work items.
- When DTP locates results with an `@test <ID>` annotation, it will search for items in Test Cases trackers with a matching ID in codeBeamer and update the item. No action will be taken if the unit test case IDs do not exist in codeBeamer.
- When DTP locates results with an `@req <ID>` annotation, it will search for items in System Requirements Specifications trackers with a matching ID in codeBeamer. If a match is found, Test Runs will be added to the Test Case associated with the requirement. If there are no Test Cases for the requirement ID, then a Test Case will be created and a Test Run will be added.
- An external-app-sync.log file will also be written to the <DTP_INSTALL>/logs directory. This log file contains progress information about sending test results from DTP to codeBeamer.
After DTP processes the report and sends results to codeBeamer, you should expect a response similar to the following:
```json
{
  "createdTestSession": "DTPP-521",
  "created": [
    "DTPP-519, testName = testBagSumAdd"
  ],
  "updated": [
    "DTPP-519, testName = testBagSumAdd",
    "DTPP-518, testName = testBagSimpleAdd"
  ],
  "ignored": [
    "MAGD-567, testName = testBagNegate",
    "QAP-512, testName = testTryThis3",
    "QAP-512, testName = testTryThis4",
    "MAGD-567, testName = testBagMultiply"
  ]
}
```
Sending Results from a Specific DTP Project
If you are using the Sending Test Data to External System flow to forward unit and functional test results, data will be sent to codeBeamer for all DTP projects by default. As a result, work items will be updated to include the tests collected for any DTP project that contain annotations matching codeBeamer IDs. You can configure the flow, however, to only send data for a specific project.
- Open Extension Designer and open the service where the Sending Test Data to External System flow is deployed.
- Drag a new switch node to the workspace.
- Select and delete the connector line between the "DataCollectorProcessedEvent" node and the "Is dynamic analysis" node.
- Drag a new connector from the "DataCollectorProcessedEvent" node to the switch node and from the switch node to the "Is dynamic analysis" node.
- Double-click the switch node and specify the following string in the Property field: `event.message.resultsSession.project`
- Specify the name of the DTP project in the string field.
- (Optional) Provide a more descriptive name for the node.
- Click Done to finish configuring the node and click Deploy.
When the flow executes, only test results for the specified DTP project will be sent to codeBeamer.
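The switch node's behavior amounts to a simple predicate on the incoming event payload. The following Java sketch illustrates the logic (the event structure is paraphrased from the `event.message.resultsSession.project` property above; the project name "ATM" is a hypothetical example):

```java
import java.util.Map;

// Minimal sketch of the switch-node filter: forward only events whose
// resultsSession.project matches the configured DTP project name.
public class ProjectFilter {

    static boolean accepts(Map<String, Object> resultsSession, String wantedProject) {
        return wantedProject.equals(resultsSession.get("project"));
    }

    public static void main(String[] args) {
        // Events for the configured project pass through; others are dropped.
        System.out.println(accepts(Map.of("project", "ATM"), "ATM"));   // prints true
        System.out.println(accepts(Map.of("project", "Other"), "ATM")); // prints false
    }
}
```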
Viewing Results in CodeBeamer
You will be able to view results in codeBeamer after sending the test data. The following image shows a System Requirement Specifications tracker in codeBeamer. The tracker contains several test cases.
You can drill down into a test case to view details, such as test runs for the test case.
Click on a test run to view execution details, including test file, build ID, and test author.
Viewing the Traceability Report
If the External System Traceability Report has been deployed to your system (see Enabling the Requirements Traceability Report), you can add widgets to your dashboard to monitor traceability from requirements to tests, static analysis, and code reviews for your project. The widgets also drill down to a report that includes additional details.
Adding and Configuring the Widgets
The widgets will appear in a separate Traceability category when adding widgets to your DTP dashboard. See Adding Widgets for general instructions on adding widgets.
You can configure the following settings:
Setting | Description
---|---
Title | You can enter a new title to replace the default title that appears on the dashboard.
Filter | Choose Dashboard Settings to use the dashboard filter, or choose a filter from the drop-down menu. See Creating and Managing Filters for additional information about filters.
Target Build | Set this to the build ID under which you executed the tests and code analysis. You can use the build specified in the dashboard settings, the latest build, or a specific build from the drop-down menu. Also see Configuring Dashboard Settings.
Type | Pie widget only. Choose Tests, Violations, or Reviews from the drop-down menu to show a pie chart detailing the status by type. Add an instance of the widget configured for each type for a complete overview in your dashboard.
Project | Choose a codeBeamer project from the drop-down menu.
Requirements Widget
This widget shows the number of requirements from the specified codeBeamer project.
Click on the widget to open the Requirement Traceability report.
Test Coverage Widget
This widget shows the percentage of requirements covered by tests against all requirements in the project.
Click the center of the widget to open the main Requirement Traceability report.
The colored-in segment represents the requirements covered by tests. Click on the segment to open the Requirement Traceability report filtered to the With Tests category.
Pie Widget
Unit testing, functional testing, static analysis, and peer reviews are common activities for verifying that requirements have been properly and thoroughly implemented. This widget shows the overall status of the project requirements in the context of those software quality activities. You can add a widget for each type of quality activity (tests, static analysis violations, reviews) to monitor the progress of requirements implementation for the project.
Mouse over a section of the chart to view details about quality activity type status. Click on the widget to open the Requirement Traceability report filtered by the selected type.
Requirements Implementation Status by Tests
Requirements Implementation Status by Violations
Requirements Implementation by Reviews
Understanding the Requirement Traceability Report
The report lists the codeBeamer requirements and data associated with them.
You can perform the following actions:
- Disable or enable the Show files/reviews option if you want to hide the Files and Reviews columns in the report. The Files and Reviews columns will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). Disabling the Files and Reviews columns on this screen hides the related tabs in the Requirement Details report.
- Click on a link in the Key column to view the tracker in codeBeamer ALM.
- Click on a link in the Summary column or one of the Test columns to view the test-related information associated with the tracker in the codeBeamer Requirement Details Report.
- Click on a link in one of the Files columns to view the static analysis-related information associated with the tracker in the codeBeamer Requirement Details Report.
- Click on a link in one of the Reviews columns to view the change review-related information associated with the tracker in the codeBeamer Requirement Details Report.
Requirement Traceability Report by Type
Clicking on a section of the codeBeamer Requirements - Pie widget opens a version of the report that includes only the quality activity type selected in the widget. You can use the drop-down menus to switch type and status. You can also disable or enable the Show files/reviews option if you want to hide the Files and Reviews columns in the report. The Files and Reviews columns will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). Disabling the Files and Reviews columns on this screen hides the related tabs in the Requirement Details report.
Understanding the Requirement Details Report
The codeBeamer Requirement Details report provides additional information about the files, static analysis findings, and tests associated with a specific codeBeamer requirement. You can open this report by clicking on a requirement in the codeBeamer Requirement Traceability report.
The first tab shows the results of the tests that were executed to verify the specific work item.
You can click on the View results in Test Explorer link to view all of the tests associated with the work item in the Test Explorer.
You can also click on individual test names in the table to view each test in the Test Explorer.
The second tab shows the files associated with the specific requirement, as well as the static analysis violations detected in the files. You can click the link in the Violations column to view the violations in the Violations Explorer, which provides additional details about the violations.
This tab will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). If you did not map requirements to files, you can hide this tab by disabling the Show files/reviews option on the main traceability report page and reloading the details report.
If the files include any change reviews or review findings, they will be shown in the third tab with links to view them in the Change Explorer.
This tab will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). If you did not map requirements to files, you can hide this tab by disabling the Show files/reviews option on the main traceability report page and reloading the details report.