The following diagram shows how you could implement an automated infrastructure for integrating Parasoft DTP and Parasoft test execution tools into your Codebeamer ALM environment:
Annotate your tests with the @test or @req annotation. Refer to your Parasoft tool documentation for details on adding annotations; a sketch is shown below.

- Use the @test <Codebeamer test ID> annotation to associate tests with items in a Codebeamer Test Cases tracker.
- Use the @req <Codebeamer System Requirements Specification ID> annotation to associate tests with items in a Codebeamer System Requirements Specifications tracker. Parasoft creates a Verifies relationship between a test case and requirement in Codebeamer, so make sure that the Verifies option is enabled for the specific test case and requirements trackers in your Codebeamer configuration.

If you deployed the Sending Test Data to External System flow (see Deploying the Sending Test Data to External System Flow), unit and functional testing results will automatically be sent to Codebeamer when Data Collector receives the data from the Parasoft tool. By default, the flow forwards unit and functional test results received by Data Collector for any project, but you can configure the flow to only send data for a specific project (see Sending Results from a Specific DTP Project).
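For reference, annotated JUnit tests might look like the following. This is a minimal sketch that assumes Javadoc-style comment tags; the exact annotation placement and syntax depend on your Parasoft tool, so refer to its documentation. The Codebeamer item IDs shown are illustrative placeholders.

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class BagTest {

    /**
     * Illustrative only: associates this test with item DTPP-518 in a
     * Codebeamer Test Cases tracker.
     *
     * @test DTPP-518
     */
    @Test
    public void testBagSimpleAdd() {
        assertEquals(4, 2 + 2);
    }

    /**
     * Illustrative only: associates this test with a hypothetical item in a
     * Codebeamer System Requirements Specifications tracker.
     *
     * @req DTPP-480
     */
    @Test
    public void testBagSumAdd() {
        assertEquals(5, 2 + 3);
    }
}
```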
You can also manually send results from the DTP database to Codebeamer by making a POST request to the DTP REST API endpoint. Pass the DTP filter and build IDs as URL parameters in the API call:
curl -X POST -u <username>:<password> "http://<host>:<port>/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=<filterID>&buildId=<buildID>"
The following table describes the endpoint parameters:
Parameter | Value | Description | Required |
---|---|---|---|
filterId | integer | Specifies the filter ID containing the test data. The filter ID is an integer value and should not be confused with the filter name. | Required |
buildId | string | Specifies the build ID containing the test data. | Required |
groupResultsBySOAtestTST | boolean | Setting to true groups multiple test results associated with one SOAtest .tst file into a single test run in Codebeamer; setting to false creates an individual test run for each result. Default is false. | Optional |
The filter and build IDs are available in the Test Explorer URL:
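If you prefer to trigger the sync from a script or CI job instead of curl, the call can be made with any HTTP client. The following is a minimal Java sketch using the JDK's built-in HTTP client (Java 11+) with basic authentication; the host, port, credentials, filter ID, and build ID are placeholder values.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class SyncTestCases {
    public static void main(String[] args) throws Exception {
        // Placeholder values; take the real filter and build IDs from the Test Explorer URL.
        String host = "dtp.example.com";
        int port = 8080;
        String filterId = "123";
        String buildId = "my-build-2023-05-01";
        String credentials = Base64.getEncoder()
                .encodeToString("username:password".getBytes(StandardCharsets.UTF_8));

        String url = String.format(
                "http://%s:%d/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=%s&buildId=%s",
                host, port, filterId, buildId);

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + credentials)
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // DTP responds with the created, updated, and ignored Codebeamer items.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```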
DTP will locate the test results that match the filterId and buildId parameters and send the data to the Codebeamer work items:
- If a test was annotated with @test <ID>, DTP will search for items in Test Case trackers with a matching ID in Codebeamer and update the item. No action will be taken if the unit test case IDs do not exist in Codebeamer. Use the @test annotation to associate tests with test executions in Codebeamer, if necessary.
- If a test was annotated with @req <ID>, DTP will search for items in System Requirements Specifications trackers with a matching ID in Codebeamer. If a match is found, Test Runs will be added to the Test Case associated with the requirement. If there are no Test Cases for the requirement ID, a Test Case will be created and a Test Run will be added.

You can check the log file in the <DTP_INSTALL>/logs directory, which contains progress information about sending test results from DTP to Codebeamer. After DTP processes the report and sends results to Codebeamer, you should expect a response similar to the following:
{ "createdTestSession": "DTPP-521", "created": [ "DTPP-519, testName = testBagSumAdd" ], "updated": [ "DTPP-519, testName = testBagSumAdd", "DTPP-518, testName = testBagSimpleAdd" ], "ignored": [ "MAGD-567, testName = testBagNegate", "QAP-512, testName = testTryThis3", "QAP-512, testName = testTryThis4", "MAGD-567, testName = testBagMultiply" ] } |
If you are using the Sending Test Data to External System flow to forward unit and functional test results, data will be sent to Codebeamer for all DTP projects by default. As a result, work items will be updated to include the tests collected for any DTP project that contain annotations matching Codebeamer IDs. You can configure the flow, however, to only send data for a specific project.
Set the applicable property in the flow to true. This toggles on the ability to only send data to specific projects. Then double-click the Configure Projects node. The logic in this node specifies which DTP projects should have their data sent to Codebeamer during the sync operation. These project configurations are stored in a JSON array.
There is a sample project configuration already defined as an example:
{ project: 'abc' }
To add more project configurations, add a comma to the end of the last project configuration and add a new entry below it. For example:
msg.configuredProjects = [ { project: 'abc' }, { project: 'foobar' } ];
When the flow executes, only test results for the specified DTP projects will be sent to Codebeamer.
After sending the test data, you will be able to view results in Codebeamer. The following image shows a System Requirements Specifications tracker in Codebeamer that contains several test cases.
You can drill down into a test case to view details, such as test runs for the test case.
Click on a test run to view execution details, including test file, build ID, and test author.