In this section:
Overview
The Test Explorer provides detailed information about unit and functional tests, such as test traces, history of runs, and coverage information. Click on a unit or functional test widget from the Report Center dashboard to access the Test Explorer (see Widgets). You can also access the Test Explorer from the Coverage Explorer (see Coverage Explorer). The Test Explorer is divided into four panels:
- Search panel - see Using the Search Panel
- Search results table - see Viewing Search Results
- Source panel - see Viewing Source Code
- Actions panel - see Resolving Test Failures
Using the Search Panel
The most common workflow is to open the Test Explorer by clicking on a test-related widget in your DTP dashboard, which will populate based on the parameters configured in the widget. You can also modify the search parameters to home in on specific sets of tests.
Click Change Search to open the search dialog and configure your search criteria. The following search criteria are available:
Criterion | Description
---|---
Filter / Target Build | A filter and build ID are the minimum criteria for searching tests. By default, the latest build is selected when you change the filter, but you can choose a different build from the menu. See the following sections for additional information.
Baseline Build / State | A baseline build is any historical build used for comparison with another build. Choose a baseline build from the menu to search for tests reported between the baseline build and the build selected with the filter. You can search for tests with the following states:
Status | You can search for tests based on the following statuses:
Coverage Image | If one or more coverage images have been configured for the filter, you can choose an image from the menu to search for tests that have coverage associated with them. A coverage image is a unique identifier used to aggregate coverage data from runs with the same build ID. See DTP Concepts for additional information about coverage images.
Analysis Type | You can choose one or more of the following test types to include in the search:
Test Environment | You can specify a test environment tag to search for tests that were executed in a specific environment (also see Viewing Tests Executed in Different Environments). Refer to the test execution tool documentation for details about configuring test environment tags.
Priority | You can choose one or more of the following priority levels to include in the search: You can use the REST API to customize priorities.
Include File Pattern / Exclude File Pattern | You can specify Ant-style patterns to narrow or broaden the scope of your search (see the example below this table). See Using File Patterns in DTP for more information about configuring file patterns.
Risk/Impact | You can search by one or more risk/impact values. Risk/impact is the extent to which a test impacts the business. Risk/impact values can be customized through the REST API.
Reference Number | You can restrict your search to a specific reference number. Reference numbers can be added manually or automatically through the REST API.
Author | You can search by one or more code authors. Authorship is determined from the settings in the code analysis tool.
Action | You can search by one or more assigned actions. Actions can be customized through the REST API.
Assignee | You can search by one or more assignees.
Artifact | If DTP is integrated with a third-party ALM system and you have deployed the Traceability Pack to your environment, you can access a traceability report that links tests sent to DTP with work items tracked in the ALM system. Clicking a link to the Test Explorer from the traceability report includes the artifact (work item) ID as a search parameter. The ID appears for informational purposes only, and the field does not appear unless this traceability path has been taken. You can clear the ID in this field, but you cannot use it to search for tests based on artifact ID. See Integrating with External Systems and Traceability Pack for details.
Limit | Specify a limit to the number of tests returned by the search.
Some states disable the limit on test cases returned
The Pass → Fail and Fail → Pass state parameters disable the limit on number of test cases returned in the search results area.
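For reference, Ant-style patterns use * to match any part of a single path segment and ** to match any number of segments. The paths below are placeholders chosen only for illustration and are not part of any particular project:

```
# Hypothetical include patterns: limit the search to Java sources under src/main
src/main/**/*.java
# Hypothetical exclude patterns: drop generated code and anything under a test directory
**/generated/**
**/test/**
```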
Viewing Search Results
The search results panel shows tests based on the parameters entered in the search overlay.
You can perform the following actions:
- Click on a row to view additional information in the actions panel; see Resolving Test Failures.
- Click on a column header to sort results.
- Click and drag a column header to the area labeled "Drag a column header and drop it here to group by that column" to create grouped views of the search results. By default, the table is grouped by test file name.
Viewing Tests Executed in Different Environments
If your tests execute on different machines, you can add the Test Environment column to the search results table so that you can view the results for each environment. See Navigating Explorer Views for additional information on filtering the results table.
Tests in the build that have unique configuration identifiers (such as session tags) appear in the table under the test environment tag that was applied during test execution.
Refer to DTP Concepts for additional information about session tags, build IDs, and other metadata that enable DTP to organize and aggregate data.
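As a rough illustration, Parasoft test execution tools typically pick up these identifiers from their run settings. The property names and values below are examples only; confirm the exact settings, including how your tool reports a test environment tag, in the tool documentation:

```
# Illustrative run settings; verify the exact property names in your tool's documentation.
build.id=nightly-2024-10-15       # groups results from the same build so DTP can aggregate them
session.tag=WindowsServer2019     # distinguishes runs executed on different machines/environments
```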
Resolving Test Failures
The actions panel provides information about the tested file and its tests to help you understand defects in your software. The panel also provides an interface for assigning test metadata to help you resolve test failures. Click on a tab in the actions panel to view information or to assign metadata to tests and place them into a remediation workflow.
Prioritization Tab
Click on the Prioritization tab to access several actions to help you remediate unit and functional test failures.
You can perform the following actions:
- Add comments.
- Assign a user by entering a username in the Assigned To field.
- Set the priority by choosing an option from the Priority menu.
- Define an action for the assignee by choosing an option from the Action menu.
- Actions are strings of metadata that you can use to define how you choose to remediate test failures. DTP ships with a set of predefined actions: None, Fix, Reassign, Review, Suppress, and Other. You can edit or remove the predefined action types (except for the None type) using the API; see the sketch below the list for what a metadata update call might look like. For details on configuring actions, go to Help > API Documentation in the Report Center navigation bar.
- Associate a business risk or impact to the test by choosing an option in the Risk/Impact menu.
- Assign a due date by entering a date in the Due Date field or by using the date picker.
- Assign a reference number to the test by entering a value in the Reference # field.
Click Apply to save the test metadata.
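If you prefer to update test metadata programmatically, the REST API exposes the same fields. The sketch below is only an outline of what such a call might look like: the endpoint path, payload field names, test ID, server URL, and credentials are all hypothetical placeholders; consult Help > API Documentation in Report Center for the actual resources.

```python
# Hedged sketch: updating test metadata through the DTP REST API.
# The endpoint path, payload fields, and the test ID below are HYPOTHETICAL placeholders;
# see Help > API Documentation in Report Center for the real resource definitions.
import requests

DTP_API = "https://dtp.example.com/grs/api"   # assumed DTP base URL
AUTH = ("dtp_user", "dtp_password")           # assumed basic authentication

metadata = {
    "priority": "High",        # a value from the Priority menu
    "action": "Fix",           # one of the predefined actions (None, Fix, Reassign, ...)
    "assignee": "jdoe",
    "riskImpact": "Medium",
    "referenceId": "TC-1042",
    "comment": "Investigate the failure introduced in the latest build",
}

# "12345" stands in for a test identifier returned by an earlier search.
response = requests.put(f"{DTP_API}/tests/12345/metadata", json=metadata, auth=AUTH, timeout=30)
response.raise_for_status()
```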
Permissions Must be Configured to Update Test Metadata and View Sources
Users must have permissions to set unit and functional test metadata (prioritize), as well as view sources. Permission can be granted for all tests or limited to tests owned by the user. The following table describes a project membership scenario and how permissions may be assigned (see Built-in Permissions and Groups for additional information):
User Type | Additional Permission | Access Granted
---|---|---
Admin | |
Leader | |
Member | |
Non-member 1 | No access |
Non-member 2 | project |
Non-member 3 | project, prioritizeOwner |
Non-member 4 | project, viewSources |
Creating an Issue in Third-party Systems
You can connect a project in DTP to a project in one of the following requirements/issue tracking systems:
- Azure DevOps - see Integrating with Azure DevOps
- Codebeamer ALM - see Integrating with Codebeamer ALM
- Jama Connect - see Integrating with Jama Connect
- Jira - see Integrating with Jira
- Polarion ALM - see Integrating with Polarion ALM
The integration enables you to create issues in the connected ALM system from tests in the Prioritization panel.
- Select a test in the search results area and click the Prioritization tab in the actions panel.
- Click Create and specify information about the work item you are creating:
  - Project: The name of the ALM project in which the new issue will be created appears in the Project field. The association between a DTP project and the external ALM project is defined by your DTP administrator. See the following sections for details.
  - Type: Choose the type of item to create from the menu. Terminology varies across ALMs, but DTP supports the following types of work items by default:
    - For Azure DevOps, Jira, or Codebeamer, choose Bug or Task.
    - For Jama Connect, choose Defect.
    - For Polarion ALM, choose Issue or Task.
  - Summary: By default, the test name is prepended with "Review Test" and used as the summary, but you can make any necessary changes.
  - Description: This field populates the description field in the third-party system. It will include a link back to DTP based on the Display URL field setting in the External Application configuration page.
- Click Create.
An issue will be created in your external system that links back to the test in DTP. Additionally, a link to the issue will appear in the Prioritization tab, creating a bi-directional path between DTP and your external system.
Unlinking a Created Issue
Once an issue has been created, it will be linked to the test. You can unlink the issue from the test by clicking the "X" icon beside it.
Only the link will be removed; the issue in the third-party system will not be affected.
Modification History Tab
Click on the Modification History tab to show when test metadata was updated.
If you want to filter for comments, you can enable the Only show comments option.
Test History Tab
Click on the Test History tab to show when tests were run and the status of each run. You can click on a column header to sort the contents of the table.
Traces Tab
When viewing unit test data, the Traces tab shows stack traces for failed and incomplete tests. If you are viewing functional or manual tests, the tab shows error messages for test failures.
Unit Test View
Click on an entry under an error message to show the test in the source code in the source panel; see Viewing Source Code.
Functional and Manual Test View
The failed test is highlighted in the source panel. The messages associated with the test failure appear in the Traces tab.
Click Show Traffic for the desired message to view the data requested during the test and the application’s response.
Viewing Test Coverage Information
The coverage panel shows a hierarchical view of the files associated with the tests. Each node in the hierarchy shows, in brackets, how much of the file or directory is covered. Coverage information is not currently available for functional tests.
Click on a disclosure triangle to navigate the tree. When an item is loaded that does not have any siblings, its children are automatically loaded and displayed. This behavior will cascade until the tree reaches a branch with multiple items at the same level (a decision point) or until the branch cannot be expanded any further. The coverage tree panel will not automatically expand nodes again if it has been previously expanded and collapsed unless the search criteria has changed, or the page has been reloaded.
- Click on a file node to view it in the source code panel. See Viewing Source Code for more information.
Details Tab
Click the Details tab to view information about the test.
You can click an associated artifact to view it in the external system integrated with DTP (note: this functionality is not currently supported for DOORS Next). You can also add some details shown in the tab to the search results table. See Navigating Explorer Views for additional information.
Viewing Source Code
If DTP is integrated with your source control system, then you can view the tested source code in an explorer view. If integration with your source control system has not been configured, you can still view sources tested by Parasoft code analysis and test execution tools (C/C++test, dotTEST, Jtest) by setting the report.dtp.publish.src setting to full or min when configuring the tool. This instructs the tool to transfer client source code to DTP when generating the report. See the documentation for your tools for additional information.
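For example, in the tool's settings file the relevant entries might look like the following. Only report.dtp.publish.src and its full/min values come from this page; the remaining keys and values are illustrative assumptions and may differ in your environment:

```
# Illustrative settings; only report.dtp.publish.src (full|min) is taken from this page.
dtp.url=https://dtp.example.com   # assumed DTP server location
report.dtp.publish=true           # assumed key for publishing reports to DTP
report.dtp.publish.src=full       # or "min"; sends tested sources along with the report
```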
Users must also have permissions to view source code. See the note above about permissions for additional information.
When viewing functional or manual tests, the source code panel shows a hierarchical view of the test suite data collected. No additional permissions are required to view functional test data. See Viewing Functional and Manual Test Sources for information on viewing functional test data.
Viewing Unit Test Source Code
Click on a file link in the Traces or Coverage tab to load the contents of a source file into the sources panel. This enables you to view test and coverage information in the context of the code. Displayed sources are marked with color-coded flags for at-a-glance coverage information after clicking links in the Coverage tab of the actions panel. Lines with green flags are covered. Lines with red flags are uncovered.
Clicking on links in the Traces tab also loads sources into the sources panel. Lines of code where the failure occurred are flagged so you can easily find the test in the trace stack.
When you make a selection in the tests table, the file name and the component that opened the file appear in the code panel.
Viewing Functional and Manual Test Sources
Click on a test in the search results panel to view functional tests performed with SOAtest 9.x or later in the Test Explorer. SOAtest 9.9.2 or later is required to view test authorship and parameterized tests. Data Collector parses the XML report file generated by SOAtest and displays the .tst file data, so the .tst file does not need to be published to DTP or stored in source control.
The selected test suite is highlighted in the panel. You can click on the disclosure triangles to navigate the test data. Each node shows how much of the test suite was executed successfully. Failed test suites are displayed in red.