In this guide:
Requirements describe functional aspects of an application, such as the effect of clicking a button in the UI, as well as non-functional characteristics, such as the application's resilience to cyber attacks. Requirements are commonly created and tracked in application lifecycle management (ALM) or requirement management systems (RMS). In some ALM/RMSs, "requirements" are just one type of "work item" for tracking the application under development.
Developers and QA engineers create unit tests and functional tests to verify and validate that the software meets the requirements. Tracing tests to the requirements they are intended to verify is challenging, however, due to the complexity, speed, and scale of modern applications. Organizations that commit to tracing requirements to tests, and vice versa, often rely on Excel spreadsheets with manually populated test execution results. In some industries, such as defense, automotive, and medical devices, any code that affects the safety and security of the application must be tested according to industry standards and traced to the requirements that define the safety and security levels.
Parasoft's traceability functionality solves these challenges by enabling you to automate test and requirements tracing as part of your test execution workflow. Parasoft DTP also enables you to quickly generate traceability reports that satisfy regulatory compliance requirements. This guide describes how to configure Parasoft products to achieve requirements traceability.
This guide is intended for people who perform the following roles:
Most of the functionality that enables traceability is provided by Parasoft DTP. The Traceability Pack must be installed in DTP to generate the traceability dashboard widgets and reports.
The following table describes which Parasoft tools work with the supported ALM/RMSs.
| codeBeamer | Jama Connect | Jira 1 | Polarion 2 | TeamForge | VersionOne |
---|---|---|---|---|---|---|
C/C++test Professional | X 3 | X | X 3 | X 3 | X | X |
dotTEST | X | X | X | X | X | X |
Jtest | X | X | X | X | X | X |
Selenic | X | X | X | X | X | X |
SOAtest | X 3 | X | X 3 | X 3 | X | X |
1 The Xray extension is required for full functionality.
2 Additional integration files must be installed in DTP. Refer to the DTP documentation for details.
3 The Requirements View, which is an additional interface for interacting with ALM/RMS requirements in the desktop, is only supported for these systems.
In this guide, we assume that your Parasoft test execution tool is configured to send results to DTP. The following procedure provides an overview of the traceability workflow:
Refer to the documentation for your ALM/RMS for instructions on how to create artifacts that you want to link to your tests.
Parasoft's out-of-the-box integrations are configured to work with default or commonly-used fields and work item types, but you can configure the ExternalSystemSettings.properties configuration file located in the <DTP_DATA_DIR>/conf directory to change the default behavior of the integration. This enables you to map tests and issues created in the ALM/RMS from the DTP interface to custom work items. Refer to the documentation for each integration for additional details.
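As an illustrative sketch, an excerpt from ExternalSystemSettings.properties for a Jira integration might map requirements to a custom issue type. The jira.issueType.requirement key is used by the Jira integration, but the value shown here is an assumption you would replace with the work item type defined in your own Jira project:

```properties
# Map DTP "requirements" to a custom Jira issue type.
# The value "Story" is only an example; use the issue type
# configured in your Jira project.
jira.issueType.requirement=Story
```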
This guide covers the default configuration.
Create a link between your tests and the requirements stored in the ALM/RMS. You can create the association by adding special annotations to your unit tests and specifying the requirement or work item IDs that you want to link. You can use the dedicated GUI in SOAtest to create the link between requirements/work items and functional tests. The dedicated GUI for creating test associations is also available in C/C++test Professional.
Several tags are available for annotating tests, but only the @req and @test tags are supported for requirements and test traceability.
Before you can use the annotations, you must enable the report.associations property in the <tool>.properties configuration file:
report.associations=true |
Open your test files and create the associations using the following syntax:
@<tag> <REQUIREMENT_ID> |
In the following example, the test class is associated with a requirement in the ALM/RMS with the ID PROJECT-123 and a test within the class is associated with work item ID PROJECT-456:
```java
/**
 * @req PROJECT-123
 */
public class Test {

    testSomething1() { ... }

    /**
     * @test PROJECT-456
     */
    testSomething2() { ... }
}
```
You can use the same association mechanism and syntax as Jtest and dotTEST to manually associate test suites with requirements. You can also use the Requirements View to automatically import the requirements from supported ALM/RMSs into the IDE so that you can use the C/C++test GUI to associate tests (see Supported Environments). Refer to your C/C++test documentation for complete information about the Requirements View.
Depending on your RMS, a requirement may include one or more sub-modules called "test definitions". All work items (requirements and their test definitions, if available) are arranged as nodes in a tree, which can be collapsed or expanded.
Right-click a requirement in the Requirements View and choose Copy ID to copy its ID to your clipboard. This helps you correlate a requirement or test definition with test cases by enabling you to paste the ID in your code to create an annotation as described in Jtest and dotTEST.
You can use the Requirements View to automatically import the requirements from supported ALM/RMSs into the IDE (see Supported Environments), or to import requirements from a ReqIF file. Alternatively, the Requirements and Notes tab of SOAtest's test suite configuration panel enables you to annotate tests using the same syntax as described in Jtest and dotTEST.
The Parasoft Annotations dependency must be added to your project so that you can add annotations to your Selenium tests and associate them with work items (requirements) stored in your ALM/RMS. If you used Selenic's project creation functionality, then the dependency will already be added. For existing projects, add the following dependency to your project pom.xml file:
```xml
<dependency>
    <groupId>com.parasoft</groupId>
    <artifactId>parasoft-annotations</artifactId>
    <scope>test</scope>
    <version>1.0</version>
</dependency>
```
For full details about the dependency, refer to the Parasoft Annotations project on GitHub, which includes the JavaDoc API documentation.
Use the @WorkItem annotation on your test methods and classes, with the following syntax, to correlate the test with work items in your ALM:
@WorkItem(type=<type>, id="<ID>", url="<URL>") |
Annotations include the ID of the work item, type of work item (requirements, in this guide), and a URL to the work item in the ALM (optional).
In the following example, the function is associated with a requirement ID REQ-123:
```java
@WorkItem(type=REQ, id="REQ-123", url="https://server.myALM.com/")
public void myFunction() {
    . . .
}
```
A connection to the ALM/RMS must be configured in DTP in order to send and retrieve data from the system. To generate traceability reports and other visualizations, the Traceability Pack must also be installed in DTP and the visualization tools deployed to the DTP environment. In this guide, we assume that you have correctly configured filters in DTP to match the incoming data from your Parasoft tools.
The work item type used for requirements in Jira is controlled by the jira.issueType.requirement setting (see Creating Requirements and Work Items).

Configuring the project association enables you to manually create issues from the DTP Violations or Test Explorer views that are linked to the ALM/RMS project. It also enables DTP to send data to the correct project when syncing test results (see Sending Results to ALM/RMS).
You can associate multiple projects in DTP with an ALM/RMS project, but you cannot associate the same DTP project with more than one external project.
You can remove an association by clicking the trash icon. If an association is deleted and recreated later, existing links between violations and external issues will be reactivated.
You can reconfigure an existing association between DTP and external projects:
The Parasoft Traceability Pack is a set of artifacts for your DTP infrastructure that help you establish and demonstrate traceability between tests, code analysis, and peer reviews.
The Traceability Pack artifacts are uploaded to DTP when you install the pack, but you must deploy them to your DTP environment before you can use them.
Deploying the External System Traceability Report adds new widgets to Report Center, as well as a drill-down report.
This artifact automatically sends test data to your ALM/RMS when DTP Data Collector retrieves test results from a Parasoft tool. If you prefer to manually trigger this process or to automate it as part of your CI/CD pipeline, you can call the REST API endpoint as described in Sending Results to ALM/RMS.
If you want to see static analysis and code review data in the traceability reports generated by DTP, you can also link analyzed source code files with requirements/work items. Doing so will add static analysis and code review tabs to the traceability details report, providing additional software quality or security information.
The following workflow describes how to enable requirements traceability for a specific build ID. You must repeat these steps for each additional build ID you want to include in the traceability report.
Create a CSV file that maps the source code files for your project to the requirement IDs, e.g.:
File | Associated Requirement ID |
---|---|
Project-A/src/foo/goo.java | REQ-A |
Project-A/src/foo2/goo.c | REQ-A, REQ-B, REQ-C |
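The raw CSV corresponding to the table above might look like the following sketch. The header row and the quoting of multi-value cells are assumptions, so check the format expected by the fileReqAssoc.groovy script:

```
File,Associated Requirement ID
Project-A/src/foo/goo.java,REQ-A
Project-A/src/foo2/goo.c,"REQ-A, REQ-B, REQ-C"
```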
Scan the CSV file by running the fileReqAssoc.groovy script shipped with DTP in the <DTP_INSTALL>/grs/extras/traceability directory, or send the mapping to the artifactsTypes DTP REST API endpoint. Refer to the DTP API documentation for usage of the endpoint; you can access the API documentation from the DTP help link. In both cases, the build ID and filter ID must be specified.

To run the script, specify the CSV file, build ID, DTP host and protocol, and DTP login credentials:
groovy fileReqAssoc.groovy -csv <CSV_FILE_NAME> -build <BUILD_ID> -dtp <DTP_HOST_INC_PROTOCOL> -user <DTP_USERNAME> -password <DTP_PASSWORD> |
After the script finishes, the traceability data for the specific filter ID and build ID will be stored in the database.
Execute the tests that you associated with the requirements (see Associating Tests and Code with Requirements). Refer to the documentation for your tool for details on how to execute tests. When the results are reported to DTP, you can use the special dashboard widgets and reports to monitor requirements coverage.
If you deployed the Sending Test Data to External System flow to DTP, then this step will be performed automatically when DTP receives the test results. Otherwise, you can manually call the REST API endpoint in DTP to send test data to your ALM/RMS.
Resource | /syncTestCases |
---|---|
URL | <PROTOCOL>://<DTP_HOST>:<DTP_PORT>/grs/api/v1.7/linkedApps/configurations/1/syncTestCases |
Method | POST |
Parameters | filterId, buildId |
curl -X POST -u <username>:<password> "http://<host>:<port>/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=<filterID>&buildId=<buildID>" |
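Because results must be synced per build, a small shell wrapper around the same endpoint can loop over several build IDs. This is a sketch under assumptions: the host, credentials, filter ID, and build IDs are placeholders you would replace with values for your environment; only the endpoint path is taken from the table above.

```shell
#!/bin/sh
# Call the DTP syncTestCases endpoint once per build ID (sketch).
# Replace the placeholder values below with your environment's settings.

DTP_BASE="http://dtp.example.com:8080"  # assumed DTP host and port
DTP_USER="admin"                        # assumed DTP credentials
DTP_PASS="password"
FILTER_ID="123"                         # assumed DTP filter ID

for BUILD_ID in build-101 build-102; do # assumed build IDs
    URL="${DTP_BASE}/grs/api/v1.7/linkedApps/configurations/1/syncTestCases?filterId=${FILTER_ID}&buildId=${BUILD_ID}"
    echo "Syncing ${BUILD_ID}"
    # Tolerate failures so one bad build does not stop the loop.
    curl -fsS -X POST -u "${DTP_USER}:${DTP_PASS}" "$URL" \
        || echo "warning: sync failed for ${BUILD_ID}" >&2
done
```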
You will be able to view results in your ALM/RMS after sending the test data. Refer to the documentation for details about viewing work items in your system. The following example shows a Jira story that contains several tests:
In most systems, you will be able to click a test and drill down to additional test execution details that include build and authorship information, as well as links back to the test or violation in DTP. The following example shows test details in TeamForge:
Add the traceability widgets to your dashboard. The widgets provide requirements coverage information, as well as links to the traceability report. The following widgets are available:
Clicking on any widget opens the traceability report (see Traceability Report).
The traceability report provides an overview of your project's requirements coverage and serves as the requirements traceability matrix (RTM) for auditing purposes. Click any traceability widget to open the report. Some widgets can be configured to filter the data; in that case, clicking the widget opens a pre-filtered report matching the data presented in the widget. Refer to the Parasoft DTP documentation for your ALM/RMS integration for additional information about viewing the traceability reports.
The following example shows the main requirement traceability report in an instance of DTP connected to Jira:
You can click on a link in the Key column to drill down to the details report for individual requirements. The following example shows a details report for a requirement stored in VersionOne:
The details report shows the results of the tests that were executed to verify the specific work item. If you associated files with requirements/work items (see Including Static Analysis and Code Reviews), then the details report will include tabs for static analysis and code reviews. The following examples show static analysis and code reviews as part of a codeBeamer traceability details report.
The Requirements View, which is available in the C/C++test Professional and SOAtest desktops, enables you to import requirements/work items stored in the ALM/RMS into your IDE so that you can readily associate the requirements with tests in your workspace. The tool imports the data from Parasoft DTP, which pulls the information from the ALM/RMS. The Requirements View is not currently available in Parasoft dotTEST, Jtest, or Selenic.
Refer to the following sections for instructions on importing requirements for your tool:
After importing requirements, click the refresh icon on the Requirements View toolbar to scan for test cases as you work on your project.
You can also enable the Auto detect test cases option from the Requirements View toolbar menu to enable automatic detection mode, in which the tool automatically searches for correlations as you make updates.
When the scanning process completes, the detected test cases are matched with the corresponding work item in the Requirements View. You can right-click a test case and choose one of the following options:
Follow the procedures described in the following sections to cover the requirements with any new tests you create for the project: