
Introduction

CodeBeamer ALM is a popular browser-based platform for managing requirements. Parasoft DTP integrates with codeBeamer ALM, providing the following functionality:

  • Ability to manually create bugs and tasks in codeBeamer ALM from the Violations Explorer view.
  • Ability to manually create bugs and tasks in codeBeamer ALM from the Test Explorer view.
  • Ability to send, view, and update Parasoft test results in codeBeamer work items (Sending Test Data to CodeBeamer).
    • This is for users who plan to track traceability using traceability reports in codeBeamer; to do so, you need to first send test results from Parasoft DTP to codeBeamer.
  • Traceability from codeBeamer requirements to tests, static analysis results, and code reviews (see Viewing the Traceability Report).
    • This is for users who plan to track traceability using traceability reports in Parasoft DTP; to do so, use the @req annotation to associate your automated tests in Parasoft with codeBeamer requirements.

Requirements

The following requirements are only applicable if you are going to send test results to codeBeamer.

  • Tests executed by the following Parasoft tools are supported:
    • C/C++test Professional, dotTEST, or Jtest 10.4.3 +
    • Selenic 2020.1 +
    • SOAtest 9.10.8 +
  • You should have already created requirements in codeBeamer. 

Configuration

The configuration is performed by the Parasoft administrator and only needs to be set up once. Developers, testers, and other DTP end users should review the Usage section for instructions on how to use Parasoft with codeBeamer ALM. 

Connecting DTP to CodeBeamer ALM Server

  1. Choose Report Center Settings from the settings (gear icon) drop-down menu.
  2. Choose External System, click Edit Settings, and choose codeBeamer from the System type drop-down menu.
  3. Enable the Enabled option.
  4. Enter a name for your instance of codeBeamer ALM in the Name field. The name is required, but it does not affect the connection settings and is not displayed in any other interfaces.
  5. Enter the URL for the codeBeamer server in the Application URL field. The URL should include the protocol, host, and port number. Do not include paths or parameters.
  6. The Display URL field defines the URL displayed in Parasoft DTP pages when links to your codeBeamer system are presented in a web browser. Typically, this is the same as the Application URL field above. It may differ, however, in a reverse proxy environment where links to codeBeamer from the user's local web browser are different than links from the Parasoft DTP server.
  7. Enter login credentials in the Username and Password/API token fields. The login must have sufficient privileges to create issues in the codeBeamer projects specified in the Project Associations section.
  8. Click Test Connection to verify your settings and click Confirm.
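
For reference, the value entered in the Application URL field (step 5) typically looks like the following; the host and port shown are hypothetical:

https://codebeamer.example.com:8080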

Associating Parasoft Projects with CodeBeamer ALM Projects

Associating a Parasoft project with a codeBeamer project enables you to create defects from the Violations or Test Explorer views that are linked to the correct project in codeBeamer. The association is also important when using the Sending Test Data to External System flow. You can associate multiple projects in DTP with a project in codeBeamer, but you cannot associate the same DTP project with more than one codeBeamer project.

  1. Click Create Project Association and choose a project from the DTP Project drop-down menu in the overlay. 
  2. Enter the name of a codeBeamer project in the External Project field and click Create to save the association.

Click the trash icon to remove a project association. Deleting the project association does not remove links from DTP explorer views to defects in codeBeamer. If an association is deleted and recreated later, existing links between violations and codeBeamer issues will be reactivated. 

You can reconfigure an existing association between DTP and codeBeamer projects:

  1. Click the pencil icon and choose either a different DTP project from the drop-down menu or specify the name of a different codeBeamer project in the External Project field.
  2. Click Save.

Enabling the Requirements Traceability Report

You can configure DTP to generate widgets and reports that help you demonstrate traceability between the requirements stored in codeBeamer and the test, static analysis, and build review data sent to DTP from Parasoft tools (C/C++test, dotTEST, Jtest).

If you want the Traceability Report to include code review and static analysis information, you must associate your source code files with requirements stored in codeBeamer. See Associating Requirements with Files for instructions on enabling this optional feature.

DTP interfaces that display and track traceability are enabled by deploying the External System Traceability Report artifact shipped with the Traceability Pack. The Traceability Pack also includes the Sending Test Data to External System flow, which automates part of the requirements traceability workflow. Refer to the Traceability Pack documentation for instructions on how to install the pack.  

Use DTP Extension Designer to deploy the External System Traceability Report and the Sending Test Data to External System flow to your environment. Verify that DTP is connected to codeBeamer as described in the Connecting DTP to CodeBeamer ALM Server section before deploying the artifacts.

Installing the Traceability Pack

The first step is to install the Traceability Pack artifact. The artifact is a collection of configuration files and assets that enable traceability.  

  1. Choose Extension Designer from the settings menu (gear icon).
  2. Click the Configuration tab to open Artifact Manager.
  3. Click Upload Artifact and browse for the traceability-pack-<version>.zip archive (also see Downloading and Installing Artifacts).
  4. Click Install and a collection of assets and configuration files for enabling traceability will be installed.

Deploying the Traceability Report

Deploy the report components to your DTP environment after installing the Traceability Pack. 

  1. Open Extension Designer and click on the Services tab.
  2. Choose an existing service to deploy the artifact or create a new service in the DTP Workflows category. Refer to Working with Services for additional information on organizing services and artifacts.
  3. If you are adding the artifact to an existing service, add a new Flow tab (see Working with Flows) and choose Import from the ellipses menu.
  4. Choose Local > Flows > Workflows > Traceability Pack > External System Traceability Report and click Import.
  5. Click inside the Flow tab to drop the nodes into the service and click Deploy.

Deploying the External System Traceability Report adds new widgets to Report Center, as well as a drill-down report. See Viewing the Traceability Report for instructions on adding the widgets and viewing the report.  

Deploying the Sending Test Data to External System Flow

This artifact sends test data to codeBeamer when DTP Data Collector retrieves test results from a Parasoft tool. This artifact ships with the Traceability Pack, which must be installed as described in Installing the Traceability Pack before deploying the flow. 

  1. Open Extension Designer and click on the Services tab.
  2. Choose an existing service to deploy the artifact or create a new service in the DTP Workflows category. Refer to Working with Services for additional information on organizing services and artifacts.
  3. If you are adding the artifact to an existing service, add a new Flow tab (see Working with Flows) and choose Import from the ellipses menu.
  4. Choose Library > Workflows > Traceability Pack > Sending Test Data to External System and click Import.
  5. Click inside the Flow tab to drop the nodes into the service and click Deploy.

Advanced Configuration

You can modify the ExternalSystemSettings.properties configuration file located in the <DTP_DATA_DIR>/conf directory to change the default behavior of the integration. DTP's out-of-the-box integration with codeBeamer is configured to use default or commonly-used fields and work item types. If you customized your codeBeamer system, however, then you can configure the following settings to align data in DTP with your custom configuration.  
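
For reference, a fragment of ExternalSystemSettings.properties that sets several of the options described below might look like this; the values shown are the documented defaults:

# Maximum number of test case results sent to codeBeamer in a single run
codeBeamer.import.chunkSize=1000

# Tracker in codeBeamer that contains requirements
codeBeamer.tracker.requirements=System Requirement Specifications

# Tracker(s) in codeBeamer that contain test cases; separate multiple trackers with a semi-colon
codeBeamer.tracker.tests=Test Cases

# Status assigned to bugs and tasks created from the Violations Explorer and Test Explorer views
codeBeamer.workItemType.bug.status=Open
codeBeamer.workItemType.task.status=Open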

codeBeamer.import.chunkSize

Specifies the maximum number of test case results sent to codeBeamer in a single run.

Default:

codeBeamer.import.chunkSize=1000 

codeBeamer.tracker.requirements

Specifies the name of a tracker in codeBeamer that contains requirements. This enables you to associate a custom requirements tracker you may have configured in codeBeamer with test results in DTP. The configured tracker also determines which requirements are presented in the Traceability report.

Default:

codeBeamer.tracker.requirements=System Requirement Specifications 

"Information" and "Folder" requirement types are ignored in DTP reports. 

Note: This setting is global by default, but it can be used for specific codeBeamer projects by appending the codeBeamer project key to the end of the setting. For example:

codeBeamer.tracker.requirements.Project-A=System Requirement Specifications

Where "Project-A" is the codeBeamer project key, this setting would only apply to that codeBeamer project. Multiple project-specific versions of this setting can be defined using different keys. Project-specific settings take precedence over global settings.

codeBeamer.tracker.tests

Specifies the names of trackers in codeBeamer that contain test cases which you would like to associate with test results in Parasoft DTP. Separate multiple test trackers with a semi-colon ( ; ).

Default:

codeBeamer.tracker.tests=Test Cases

Example:

codeBeamer.tracker.tests=Test Cases;Integration Tests

This setting is used to search for test cases in codeBeamer by the following Parasoft DTP functionality:

    • Sending test results from DTP to codeBeamer using the /syncTestCases API.
      • Searches for existing test definitions in codeBeamer are based on the @test T_ID defined in Parasoft tools. When the T_ID is not found in codeBeamer, it is ignored.
      • Searches for existing test definitions related to requirements (found by the Parasoft @req annotation) in codeBeamer.
        • Note: When the /syncTestCases API does not find a specific Parasoft test definition (corresponding to a Parasoft test annotated with @req) in codeBeamer, it creates a new test definition in codeBeamer. The test definition is created in the first test tracker defined in the codeBeamer.tracker.tests list.
    • Providing the requirements and tests hierarchy presented in the "Requirements View" in C/C++test.

Note: This setting is global by default, but it can be used for specific codeBeamer projects by appending the codeBeamer project key to the end of the setting. For example:

codeBeamer.tracker.tests.Project-A=<TCT1>;<TCT2>;<TCT3>

Where "Project-A" is the codeBeamer project key, this setting would only apply to that codeBeamer project. Multiple project-specific versions of this setting can be defined using different keys. Project-specific settings take precedence over global settings.

Note about codeBeamer configuration:

When a Parasoft test is annotated with @req, an equivalent test case is created in codeBeamer and a Verifies relationship between the test case and the requirement is created. To ensure this works correctly, make sure that the "Verifies" option is enabled for the test case and requirements trackers in your codeBeamer configuration.

codeBeamer.tracker.testRuns

Specifies the name of test run trackers in codeBeamer where DTP should create test runs when sending test results to codeBeamer. This setting is used when sending test results from DTP to codeBeamer using the /syncTestCases API.

Typical use case: Single Test Run Tracker

In most cases, there is only one test run tracker in codeBeamer and you can leave this setting empty. This is the default. When this setting is empty, DTP creates test runs in the first (and, in this case, the only) test run tracker in codeBeamer.

Advanced use case: Multiple Test Run Trackers

If your codeBeamer project contains multiple test run trackers into which you want to insert test runs from DTP, you can define them in this setting. Separate multiple test run trackers with a semi-colon ( ; ).

Example:

codeBeamer.tracker.testRuns=<TRT1>;<TRT2>;<TRT3>

When you define multiple test run trackers, you can specify a mapping between codeBeamer test case trackers (see codeBeamer.tracker.tests above) and test run trackers in a given project to control where the test runs will be created. For example, let's say you configure the following settings:

codeBeamer.tracker.tests=<TCT1>;<TCT2>;<TCT3>;<TCT4>

codeBeamer.tracker.testRuns=<TRT1>;<TRT2>;<TRT3>

In that situation, when DTP executes the /syncTestCases API and, for a specific test case in DTP, finds a matching test case in codeBeamer (for example, in TCT1), the test run will be created in the corresponding test run tracker (in this case, TRT1). The mapping is based on the respective positions in the settings, so in this example TCT2 would map to TRT2 and TCT3 would map to TRT3.

If DTP finds a test case in codeBeamer defined in TCT4, the test run will be created in TRT1 (the first item in the configured test run tracker list), since there is no TRT4 tracker configured in DTP (or, if one is configured in DTP, it does not exist in codeBeamer).
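
For illustration, assuming hypothetical tracker names, the positional mapping described above could be configured as follows. Test cases found in the "Unit Tests" tracker would get test runs in "Unit Test Runs", "Integration Tests" would map to "Integration Test Runs", "System Tests" to "System Test Runs", and "Manual Tests" would fall back to "Unit Test Runs" because it has no counterpart at the same position:

codeBeamer.tracker.tests=Unit Tests;Integration Tests;System Tests;Manual Tests
codeBeamer.tracker.testRuns=Unit Test Runs;Integration Test Runs;System Test Runs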

Note: This setting is global by default, but it can be used for specific codeBeamer projects by appending the codeBeamer project key to the end of the setting. For example:

codeBeamer.tracker.testRuns.Project-A=<TRT1>;<TRT2>;<TRT3>

Where "Project-A" is the codeBeamer project key, this setting would only apply to that codeBeamer project. Multiple project-specific versions of this setting can be defined using different keys. Project-specific settings take precedence over global settings.

codeBeamer.workItemType.bug.status

Specifies the status of bugs that are created in codeBeamer when creating work items in the DTP Violations Explorer and Test Explorer views.

Default: Open 

codeBeamer.workItemType.bug

Specifies the type of work item to create in codeBeamer when creating new bugs from the DTP Violation Explorer and Test Explorer. This enables you to associate custom bug trackers you may have configured in codeBeamer with work items created from DTP. 

By default, the property is not set. As a result, bug work items created in DTP are associated with bug work items in codeBeamer.

codeBeamer.workItemType.task.status

Specifies the status of tasks that are created in codeBeamer when creating work items in the DTP Violations Explorer and Test Explorer views.

Default: Open 

codeBeamer.workItemType.task

Specifies the type of work item to create in codeBeamer when creating new tasks from the DTP Violation Explorer and Test Explorer. This enables you to associate custom task trackers you may have configured in codeBeamer with work items created from DTP.

By default, the property is not set. As a result, task work items created in DTP are associated with task work items in codeBeamer.

codeBeamerIssueUrl

Specifies the URL template for linking work items created in the DTP Violation Explorer and Test Explorer to work items in codeBeamer.

Default:

codeBeamerIssueUrl=<CODEBEAMER_URL>/cb/issue/<ID> 
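
With a hypothetical codeBeamer host, the resolved link for a work item with ID 4711 would look like the following:

https://codebeamer.example.com:8080/cb/issue/4711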

Usage

After configuring the integration with codeBeamer ALM, developers, testers, and other users can leverage the functionality enabled by the integration.

Manually Creating Bugs and Tasks in CodeBeamer ALM

The Test Explorer and Violations Explorer views enable you to create bugs and tasks for any test and violation, respectively, regardless of status. Refer to the Test Explorer and Violations Explorer documentation for details on creating codeBeamer assets in those views.

Sending Test Data to CodeBeamer 

The following diagram shows how you could implement an automated infrastructure for integrating Parasoft DTP and Parasoft test execution tools into your codeBeamer ALM environment:

  1. Create items in codeBeamer trackers. The items will be associated with tests executed by Parasoft tools. You can create requirements (i.e., items in a System Requirement Specifications tracker) or test cases (i.e., items in a Test Cases tracker), for example.
  2. In your test file, add the codeBeamer test case or requirement IDs using the @test or @req annotation (a minimal annotated example is shown at the end of this section). Refer to your Parasoft tool documentation for details on adding annotations.
    • Use the @test <codeBeamer test ID> annotation to associate tests with items in a codeBeamer Test Cases tracker.
    • Use the @req <codeBeamer System Requirements Specification ID> annotation to associate tests with items in a codeBeamer System Requirements Specifications tracker. Parasoft creates a Verifies relationship between a test case and requirement in codeBeamer, so make sure that the Verifies option is enabled for the specific test case and requirements trackers in your codeBeamer configuration.  
    • You can get the work item ID from various parts of the codeBeamer interface, such as the URL:
       
  3. Execute your tests as part of the CI process. You can also manually execute the tests from the IDE.
  4. As part of the test execution, Parasoft test execution tools will tag the results with the filter and build IDs and send the data to DTP. You can verify the results in DTP by adding Test Widgets to your DTP dashboard and setting the filter and build ID. Developers can download the test execution data from DTP into their IDEs so that they can address any failed tests.
  5. If you deployed the Sending Test Data to External System flow (see Deploying the Sending Test Data to External System Flow), then unit and functional testing results will automatically be sent to codeBeamer when Data Collector receives the data from the Parasoft tool. By default, the flow forwards unit and functional test results that were received by Data Collector for any project, but you can configure the flow to only send data for a specific project (see Sending Results from a Specific DTP Project). 
    You can also manually send a POST request to the DTP REST API endpoint to send results from the DTP database to codeBeamer. Pass the DTP filter and build IDs as URL parameters in the API call:

    curl -X POST -u <username>:<password> "http://<host>:<port>/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=<filterID>&buildId=<buildID>"

    The following list describes the endpoint parameters:

    • filterId (integer, required): Specifies the filter ID containing the test data. The filter ID is an integer value and should not be confused with the filter name.
    • buildId (string, required): Specifies the build ID containing the test data.
    • groupResultsBySOAtestTST (boolean, optional): Setting to true groups SOAtest results by .tst file so that one .tst file is associated with one codeBeamer test. Setting to false associates each test step within a SOAtest .tst file with a codeBeamer test. Default is false.

    The filter and build IDs are available in the Test Explorer URL:

     

  6. DTP will locate the test results that match the filterId and buildId parameters and send the data to the codeBeamer work items. 

    • When DTP locates results with an @test <ID>, it will search for items in Test Case trackers with a matching ID in codeBeamer and update the item. No action will be taken if the unit test case IDs do not exist in codeBeamer. Use the @test annotation to associate tests with test executions in codeBeamer, if necessary.
    • When DTP locates results with an @req <ID>, it will search for items in System Requirements Specifications trackers with a matching ID in codeBeamer. If a match is found, Test Runs will be added to the Test Case associated with the requirement. If there are no Test Cases for the requirement ID, a Test Case will be created and a Test Run will be added.
    • An external-app-sync.log file will also be written to the <DTP_INSTALL>/logs directory. This log file contains progress information about sending test results from DTP to codeBeamer.

After DTP processes the report and sends results to codeBeamer, you should expect a response similar to the following:

{
   "createdTestSession": "DTPP-521",
    "created": [
        "DTPP-519, testName = testBagSumAdd"
    ],
    "updated": [
        "DTPP-519, testName = testBagSumAdd",
        "DTPP-518, testName = testBagSimpleAdd"
    ],
    "ignored": [
        "MAGD-567, testName = testBagNegate",
        "QAP-512, testName = testTryThis3",
        "QAP-512, testName = testTryThis4",
        "MAGD-567, testName = testBagMultiply"
    ]
}
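
The following is a minimal sketch of step 2 above for a Java test executed with Jtest, assuming the Javadoc-style comment form of the @test and @req annotations; refer to your Parasoft tool documentation for the exact syntax supported by your tool and version. The class name, test names, and codeBeamer item IDs are hypothetical:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class BagTest {

    /**
     * Associates this test with item 4711 in a codeBeamer System Requirement
     * Specifications tracker; when results are synchronized, DTP can create a
     * test case and a Verifies relationship for this requirement in codeBeamer.
     *
     * @req 4711
     */
    @Test
    public void testBagSumAdd() {
        assertEquals(5, 2 + 3);
    }

    /**
     * Associates this test with an existing item 4712 in a codeBeamer Test Cases tracker.
     *
     * @test 4712
     */
    @Test
    public void testBagSimpleAdd() {
        assertEquals(4, 2 + 2);
    }
}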

Sending Results from a Specific DTP Project

If you are using the Sending Test Data to External System flow to forward unit and functional test results, data will be sent to codeBeamer for all DTP projects by default. As a result, work items will be updated to include the tests collected for any DTP project that contains annotations matching codeBeamer IDs. You can configure the flow, however, to only send data for a specific project.

  1. Open Extension Designer and open the service where the Sending Test Data to External System flow is deployed.
  2. Drag a new switch node to the workspace.
  3. Select and delete the connector line between the "DataCollectorProcessedEvent" node and the "Is dynamic analysis" node.
  4. Drag a new connector from the "DataCollectorProcessedEvent" node to the switch node and from the switch node to the "Is dynamic analysis" node.
     
  5. Double-click the node and specify the following string in the Property field:

     event.message.resultsSession.project
  6. Specify the name of the DTP project in the string field.
  7. (Optional) Provide a more descriptive name for the node.
  8. Click Done to finish configuring the node and click Deploy.

When the flow executes, only test results for the specified DTP project will be sent to codeBeamer. 

Viewing Results in CodeBeamer

You will be able to view results in codeBeamer after sending the test data. The following image shows a System Requirement Specifications tracker in codeBeamer. The tracker contains several test cases.

You can drill down into a test case to view details, such as test runs for the test case.

Click on a test run to view execution details, including test file, build ID, and test author.

 

Viewing the Traceability Report

If the External System Traceability Report has been deployed to your system (see Enabling the Requirements Traceability Report), you can add widgets to your dashboard to monitor traceability from requirements to tests, static analysis, and code reviews for your project. The widgets also drill down to a report that includes additional details.

Adding and Configuring the Widgets

The widgets will appear in a separate Traceability category when adding widgets to your DTP dashboard. See Adding Widgets for general instructions on adding widgets.

You can configure the following settings:

  • Title: You can enter a new title to replace the default title that appears on the dashboard.
  • Filter: Choose Dashboard Settings to use the dashboard filter or choose a filter from the drop-down menu. See Creating and Managing Filters for additional information about filters.
  • Target Build: Set this to the build ID under which you executed the tests and code analysis. You can use the build specified in the dashboard settings, the latest build, or a specific build from the drop-down menu. Also see Configuring Dashboard Settings.
  • Type: Pie widget only. Choose Tests, Violations, or Reviews from the drop-down menu to show a pie chart detailing the status by type. Add an instance of the widget configured for each type for a complete overview in your dashboard.
  • Project: Choose a codeBeamer project from the drop-down menu.

Requirements Widget

This widget shows the number of requirements from the specified codeBeamer project.

Click on the widget to open the Requirement Traceability report.

Test Coverage Widget

This widget shows the percentage of requirements covered by tests against all requirements in the project.

Click the center of the widget to open the main Requirement Traceability report.

The colored-in segment represents the requirements covered by tests. Click on the segment to open the Requirement Traceability report filtered to the With Tests category.

Pie Widget

Unit testing, functional testing, static analysis, and peer reviews are common activities for verifying that requirements have been properly and thoroughly implemented. This widget shows the overall status of the project requirements in the context of those software quality activities. You can add a widget for each type of quality activity (tests, static analysis violations, reviews) to monitor the progress of requirements implementation for the project.

Mouse over a section of the chart to view details about quality activity type status. Click on the widget to open the Requirement Traceability report filtered by the selected type.

Requirements Implementation Status by Tests

Requirements Implementation Status by Violations

Requirements Implementation by Reviews

Understanding the Requirement Traceability Report

The report lists the codeBeamer requirements and data associated with them.

You can perform the following actions:

  • Disable or enable the Show files/reviews option if you want to hide the Files and Reviews columns in the report. The Files and Reviews columns will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). Disabling the Files and Reviews columns on this screen hides the related tabs in the Requirement Details report.
  • Click on a link in the Key column to view the tracker in codeBeamer ALM.
  • Click on a link in the Summary column or one of the Test columns to view the test-related information associated with the tracker in the codeBeamer Requirement Details Report.
  • Click on a link in one of the Files columns to view the static analysis-related information associated with the tracker in the codeBeamer Requirement Details Report.
  • Click on a link in one of the Reviews columns to view the change review-related information associated with the tracker in the codeBeamer Requirement Details Report.

Requirement Traceability Report by Type

Clicking on a section of the codeBeamer Requirements - Pie widget opens a version of the report that includes only the quality activity type selected in the widget. You can use the drop-down menus to switch type and status. You can also disable or enable the Show files/reviews option if you want to hide the Files and Reviews columns in the report. The Files and Reviews columns will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). Disabling the Files and Reviews columns on this screen hides the related tabs in the Requirement Details report.

Understanding the Requirement Details Report

The codeBeamer Requirement Details report provides additional information about the files, static analysis findings, and tests associated with a specific codeBeamer requirement. You can open this report by clicking on a requirement in the codeBeamer Requirement Traceability report.

The first tab shows the results of the tests that were executed to verify the specific work item.

You can click on the View results in Test Explorer link to view all of the tests associated with the work item in the Test Explorer.

You can also click on individual test names in the table to view each test in the Test Explorer.

The second tab shows the files associated with the specific requirement, as well as the static analysis violations detected in the files. You can click the link in the Violations column to view the violations in the Violations Explorer, which provides additional details about the violations.

This tab will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). If you did not map requirements to files, you can hide this tab by disabling the Show files/reviews option on the main traceability report page and reloading the details report.

If the files include any change reviews or review findings, they will be shown in the third tab with links to view them in the Change Explorer.

This tab will only contain data if the requirements have been mapped to source files (see Enabling the Requirements Traceability Report). If you did not map requirements to files, you can hide this tab by disabling the Show files/reviews option on the main traceability report page and reloading the details report.
