CodeBeamer ALM is a popular browser-based platform for managing requirements. Parasoft DTP integrates with codeBeamer ALM, enabling you to create defects in codeBeamer from DTP explorer views and to report traceability between codeBeamer requirements and Parasoft test results.
The following requirements are only applicable if you are going to send test results to codeBeamer.
The configuration is performed by the Parasoft administrator and only needs to be set up once. Developers, testers, and other DTP end users should review the Usage section for instructions on how to use Parasoft with codeBeamer ALM.
Creating links between Parasoft and codeBeamer at the project level enables defects created in the Violations or Test Explorer views to be filed in the correct project in codeBeamer.
Click the trash icon to remove a project association. Deleting the project association does not remove links in DTP explorer views to defects in codeBeamer. If an association is deleted and recreated later, existing links between violations and codeBeamer issues will be reactivated.
You can associate multiple projects in DTP with a project in codeBeamer, but you cannot associate the same DTP project with more than one codeBeamer project.
You can configure DTP to generate widgets and reports that help you demonstrate traceability between the requirements stored in codeBeamer and the test, static analysis, and build review data sent to DTP from Parasoft tools (C/C++test, dotTEST, Jtest).
Your source code files must be associated with the requirements stored in codeBeamer. See Associating Requirements with Files for instructions.
DTP interfaces that display and track traceability are enabled by deploying the External Application Traceability Report artifact shipped with the Traceability Pack. The Traceability Pack also includes the Sending Test Data to External Application flow, which automates part of the requirements traceability workflow. Refer to the Traceability Pack documentation for instructions on how to install the pack.
Use DTP Extension Designer to deploy the External Application Traceability Report and the Sending Test Data to External Application flow to your environment. Verify that DTP is connected to codeBeamer as described in the Connecting DTP to CodeBeamer ALM Server section before deploying the artifacts.
The first step is to install the Traceability Pack artifact. The artifact is a collection of configuration files and assets that enable traceability.
Deploy the report components to your DTP environment after installing the Traceability Pack.
Deploying the External Application Traceability Report adds new widgets to Report Center, as well as a drill-down report. See Viewing the Traceability Report for instructions on adding the widgets and viewing the report.
This artifact sends test data to codeBeamer when DTP Data Collector retrieves test results from a Parasoft tool. This artifact ships with the Traceability Pack, which must be installed as described in Installing the Traceability Pack before deploying the flow. You should also verify that the Enterprise Pack connection to DTP is configured with the host name of the server running DTP. By default, Enterprise Pack points to localhost. See Server Settings for additional information.
After configuring the integration with codeBeamer ALM, developers, testers, and other users can leverage the functionality enabled by the integration.
The Test Explorer and Violations Explorer views enable you to create bugs and tasks for any test and violation, respectively, regardless of status. Refer to the following sections for details on creating codeBeamer assets in explorer views:
The following diagram shows how you could implement an automated infrastructure for integrating Parasoft DTP and Parasoft test execution tools into your codeBeamer ALM environment:
Add the @test or @req annotation to your tests. See the C/C++test, dotTEST, or Jtest documentation for details on adding annotations. Use the @test <codeBeamer test ID> annotation to associate tests with items in a codeBeamer Test Cases tracker. Use the @req <codeBeamer System Requirements Specification ID> annotation to associate tests with items in a codeBeamer System Requirements Specifications tracker.
If you deployed the Sending Test Data to External Application flow (see Deploying the Sending Test Data to External Application Flow), unit and functional testing results will automatically be sent to codeBeamer when Data Collector receives the data from the Parasoft tool.
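As a sketch, the annotations are plain-text tags placed in test code comments and read by the Parasoft tool; refer to your tool's documentation for the exact placement. The Java example below is purely illustrative: the class, method names, and the IDs CB-1024 and CB-2048 are hypothetical, not real codeBeamer items.

```java
// Hypothetical Jtest-style example of @test/@req annotations in comments.
// The tags do not affect compilation or execution; the Parasoft tool picks
// them up when reporting results. All IDs and names here are placeholders.
public class BagTest {

    /**
     * @test CB-1024  (associates this test with a codeBeamer Test Cases item)
     */
    public static boolean testBagSumAdd() {
        return 2 + 3 == 5;  // stand-in for real assertion logic
    }

    /**
     * @req CB-2048  (associates this test with a System Requirements Specifications item)
     */
    public static boolean testBagSimpleAdd() {
        return 1 + 1 == 2;  // stand-in for real assertion logic
    }

    public static void main(String[] args) {
        System.out.println("testBagSumAdd=" + testBagSumAdd());
        System.out.println("testBagSimpleAdd=" + testBagSimpleAdd());
    }
}
```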
You can also manually send a POST request to the DTP REST API endpoint to send results from the DTP database to codeBeamer. Pass the DTP filter and build IDs as URL parameters in the API call:
curl -X POST -u <username>:<password> "https://<host>:<port>/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=<filterID>&buildId=<buildID>"
The filter and build IDs are available in the Test Explorer URL:
DTP will locate the test results that match the filterId and buildId parameters and send the data to the codeBeamer work items.
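If you prefer to script the call from the JVM rather than curl, the same request can be assembled with the JDK's built-in HTTP client. This is a minimal sketch under assumed values: the host dtp.example.com, port 8443, the credentials, filter ID 42, and build ID MYPROJ-2024-01-15 are placeholders for your own environment, not defaults.

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.Base64;

public class SyncTestCases {
    public static void main(String[] args) {
        // Hypothetical placeholders: substitute your DTP server, credentials,
        // and the filter/build IDs taken from the Test Explorer URL.
        String host = "dtp.example.com";
        int port = 8443;
        String filterId = "42";
        String buildId = "MYPROJ-2024-01-15";
        String auth = Base64.getEncoder()
                .encodeToString("user:password".getBytes());

        // Same endpoint as the curl command above, with basic authentication.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://" + host + ":" + port
                        + "/grs/api/v1.7/linkedApps/configurations/1/syncTestCases"
                        + "?filterId=" + filterId + "&buildId=" + buildId))
                .header("Authorization", "Basic " + auth)
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();

        // Printed instead of sent so the sketch runs without a live DTP server;
        // in practice, send it with HttpClient.newHttpClient().send(request, ...).
        System.out.println(request.method() + " " + request.uri());
    }
}
```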
If a test is annotated with @test <ID>, DTP will search for items in Test Case trackers with a matching ID in codeBeamer and update the item. No action will be taken if the unit test case IDs do not exist in codeBeamer.
If a test is annotated with @req <ID>, DTP will search for items in System Requirements Specifications trackers with a matching ID in codeBeamer. If a match is found, Test Runs will be added to the Test Case associated with the requirement. If there are no Test Cases for the requirement ID, a Test Case will be created and a Test Run will be added.
After DTP processes the report and sends results to codeBeamer, you should expect a response similar to the following:
{
  "createdTestSession": "DTPP-521",
  "created": [
    "DTPP-519, testName = testBagSumAdd"
  ],
  "updated": [
    "DTPP-519, testName = testBagSumAdd",
    "DTPP-518, testName = testBagSimpleAdd"
  ],
  "ignored": [
    "MAGD-567, testName = testBagNegate",
    "QAP-512, testName = testTryThis3",
    "QAP-512, testName = testTryThis4",
    "MAGD-567, testName = testBagMultiply"
  ]
}
You will be able to view results in codeBeamer after sending the test data. The following image shows a System Requirements Specifications tracker in codeBeamer. The tracker contains several test cases.
You can drill down into a test case to view details, such as test runs for the test case.
Click on a test run to view execution details, including test file, build ID, and test author.
If the External Application Traceability Report has been deployed to your system (see Enabling the Requirements Traceability Report), you can add widgets to your dashboard to monitor traceability from requirements to tests, static analysis, and code reviews for your project. The widgets also drill down to a report that includes additional details.
The widgets will appear in a separate Traceability category when adding widgets to your DTP dashboard. See Adding Widgets for general instructions on adding widgets.
You can configure the following settings:
Setting | Description
---|---
Title | Enter a new title to replace the default title that appears on the dashboard.
Filter | Choose Dashboard Settings to use the dashboard filter, or choose a filter from the drop-down menu. See Creating and Managing Filters for additional information about filters.
Target Build | Set this to the build ID under which you executed the tests and code analysis. You can use the build specified in the dashboard settings, the latest build, or a specific build from the drop-down menu. Also see Configuring Dashboard Settings.
Type | Pie widget only. Choose Tests, Violations, or Reviews from the drop-down menu to show a pie chart detailing the status by type. Add an instance of the widget configured for each type for a complete overview in your dashboard.
Project | Choose a codeBeamer project from the drop-down menu.
This widget shows the number of requirements from the specified codeBeamer project.
Click on the widget to open the Requirement Traceability report.
Unit testing, functional testing, static analysis, and peer reviews are common activities for verifying that requirements have been properly and thoroughly implemented. This widget shows the overall status of the project requirements in the context of those software quality activities. You can add a widget for each type of quality activity (tests, static analysis violations, reviews) to monitor the progress of requirements implementation for the project.
Mouse over a section of the chart to view details about quality activity type status. Click on the widget to open the Requirement Traceability report filtered by the selected type.
The report lists the codeBeamer requirements and data associated with them.
You can perform the following actions:
Clicking on a section of the codeBeamer Requirements - Pie widget opens a version of the report that includes only the quality activity type selected in the widget. You can use the drop-down menus to switch type and status.
The codeBeamer Requirement Details report provides additional information about the files, static analysis findings, and tests associated with a specific codeBeamer requirement. You can open this report by clicking on a requirement in the codeBeamer Requirement Traceability report.
The first tab shows the results of the tests that were executed to verify the specific requirement. Click on a test name to view the test in the Test Explorer.
The second tab shows the files associated with the specific requirement, as well as the static analysis violations detected in the files. You can click the link in the Violations column to view the violations in the Violations Explorer, which provides additional details about the violations.
If the files include any change reviews or review findings, they will be shown in the third tab with links to view them in the Change Explorer.