The Test Explorer provides detailed information about unit and functional tests, such as test traces, run history, and coverage information. Click on a unit or functional test widget on the Report Center dashboard to access the Test Explorer (see Widgets). You can also access the Test Explorer from the Coverage Explorer (see Coverage Explorer). The Test Explorer is divided into four panels:

  1. Search panel; see Using the Search Panel.
  2. Search results table; see Viewing Search Results.
  3. Source panel; see Viewing Source Code.
  4. Actions panel; see Resolving Test Failures.

Using the Search Panel

The most common workflow is to open the Test Explorer by clicking on a test-related widget in your DTP dashboard, which populates the explorer based on the parameters configured in the widget. You can also modify the search parameters to home in on specific tests.

Click Change Search to open the search dialog and configure your search criteria. The following search criteria are available:


Target Build

A filter and build ID are the minimum criteria for searching for tests. By default, the latest build is selected when you change the filter, but you can choose a different build from the drop-down menu.

Baseline Build


A baseline build is any historical build used for comparison with another build. Choose a baseline build from the drop-down menu to search for tests reported from the baseline build to the build selected with the filter.

You can search for tests with the following states:

  • Pass → Fail - These are tests that passed in the baseline build, but failed in the target build.
  • Fail → Pass - These are tests that failed in the baseline build, but passed in the target build.
  • New - These are tests that did not appear in the baseline build, but have been added in the target build.
  • Missing - These are tests that appeared in the baseline build, but did not appear in the target build.
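The baseline comparison is essentially a set comparison between the two builds' test results. The following sketch illustrates the classification logic only; the data shapes are assumptions, not DTP's actual data model:

```python
# Classify tests by comparing a baseline build with a target build.
# The dict shapes here are illustrative, not DTP's internal model.

def classify(baseline, target):
    """baseline/target map test name -> "pass" or "fail"."""
    states = {"Pass -> Fail": [], "Fail -> Pass": [], "New": [], "Missing": []}
    for name, status in target.items():
        if name not in baseline:
            states["New"].append(name)
        elif baseline[name] == "pass" and status == "fail":
            states["Pass -> Fail"].append(name)
        elif baseline[name] == "fail" and status == "pass":
            states["Fail -> Pass"].append(name)
    # Tests present in the baseline but absent from the target build.
    states["Missing"] = [n for n in baseline if n not in target]
    return states

baseline = {"testA": "pass", "testB": "fail", "testC": "pass"}
target = {"testA": "fail", "testB": "pass", "testD": "pass"}
print(classify(baseline, target))
```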

You can search for tests based on the following statuses:

  • Pass 
  • Fail 
  • Incomplete 

Coverage Image

If one or more coverage images have been configured for the filter, you can choose an image from the drop-down menu to search for tests that have associated coverage.

A coverage image is a unique identifier used to aggregate coverage data from runs with the same build ID. See DTP Concepts for additional information about coverage images.

Analysis Type

You can choose one or more of the following test types to include in the search:

  • Unit Test
  • Functional Test
  • Manual Test
  • Other - Refer to the test execution documentation for information about defining your own test types.

Test Environment

You can specify a test environment tag to search for tests that were executed in a specific environment (also see Viewing Tests Executed in Different Environments).

Refer to the test execution tool documentation for details about configuring test environment tags.


Priority

You can choose one or more of the following priority levels to include in the search:

  • Critical 
  • High 
  • Medium 
  • Low 
  • Do not show
  • Not defined

You can use the REST API to customize priorities.
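For illustration only, a priority-customization request could be assembled as follows. The endpoint path and payload fields below are hypothetical, not DTP's documented REST API; choose API Documentation from the Help menu in DTP for the real endpoints:

```python
# Hypothetical sketch: the endpoint path and payload shape are
# assumptions, not DTP's documented REST API.
import json
from urllib import request

def build_priority_request(base_url, priorities):
    """Build (but do not send) a PUT request updating priority values."""
    payload = json.dumps({"priorities": priorities}).encode("utf-8")
    return request.Request(
        base_url + "/api/testPriorities",  # hypothetical path
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

req = build_priority_request(
    "https://dtp.example.com/grs", ["Critical", "High", "Medium", "Low"]
)
print(req.get_method(), req.full_url)
```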

Include File Pattern

Exclude File Pattern

You can specify Ant patterns to narrow or broaden the scope of your search. 
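For reference, Ant patterns use `*` to match any characters within a single path segment and `**` to match any number of directories. For example:

```
**/*Test*     matches any file with "Test" in its name, in any directory
src/main/**   matches everything under src/main
**/util/*     matches files directly inside any directory named util
```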
Risk/Impact

You can search by one or more risk/impact values. Risk/impact is the extent to which a test impacts the business. Risk/impact values can be customized through the REST API.

Reference Number

You can restrict your search to a specific reference number. Reference numbers can be added manually or through the REST API.

Author

You can search by one or more code authors. Authorship is determined from the settings in the code analysis tool.

Action

You can search by one or more assigned actions. Actions can be customized through the REST API.

Assignee

You can search by one or more assignees.

If DTP is integrated with a third-party ALM system and you have deployed the Traceability Pack in your environment, you can access a traceability report that links tests sent to DTP with work items tracked in the ALM system. Clicking a link to the Test Explorer from the traceability report includes the artifact (work item) ID as a search parameter. The ID appears for informational purposes only, and the field only appears when you have arrived via this traceability path. You can clear the ID in this field, but you cannot use it to search for tests by artifact ID.

See Integrating with ALM Tools and Traceability Pack for details.

Limit

Specify a limit on the number of tests returned by the search.

Some states disable the limit on test cases returned

The Pass → Fail and Fail → Pass state parameters disable the limit on number of test cases returned in the search results area.

Viewing Search Results

The search results panel shows tests based on the parameters entered in the search overlay.

You can perform the following actions:

  • Click on a row to view additional information in the actions panel; see Resolving Test Failures.
  • Click on a column header to sort the results.
  • Drag a column header to the area labeled "Drag a column header and drop it here to group by that column" to create grouped views of the search results. By default, the table is grouped by test file name.

Viewing Tests Executed in Different Environments

If your tests execute on different machines, you can add a Test Environment column to the search results table so that you can view the results for each environment. See Navigating Explorer Views for additional information on filtering the results table.

Tests in the build that have unique configuration identifiers (such as session tags) will appear in the table with the test environment tag that was applied during test execution.

Refer to DTP Concepts for additional information about session tags, build IDs, and other metadata that enable DTP to organize and aggregate data.

Resolving Test Failures

The actions panel provides information about the tested file and its tests to help you understand defects in your software. The panel also provides an interface for assigning test metadata to help you resolve test failures. Click on a tab in the actions panel to view information or assign metadata to tests to place them into a remediation workflow.

Prioritization Tab

Click on the Prioritization tab to access several actions to help you remediate unit and functional test failures.

You can perform the following actions:

  • Add comments.
  • Assign a user by entering a user name in the Assigned To field.
  • Set the priority by choosing an option from the Priority drop-down menu.
  • Define an action for the assignee by choosing an option from the Action drop-down menu.
    • Actions are strings of metadata that you can use to define how you choose to remediate test failures. DTP ships with a set of predefined actions: None, Fix, Reassign, Review, Suppress, and Other. You can edit or remove the predefined action types (except for the None type) using the API. For details on configuring actions, choose API Documentation from the Help drop-down menu in the Report Center navigation bar.
  • Associate a business risk or impact with the test by choosing an option from the Risk/Impact drop-down menu.
  • Assign a due date by entering a date in the Due Date field or by using the date picker.
  • Assign a reference number to the test by entering a value in the Reference # field.

Click Apply to save the test metadata.

Permissions Must be Configured to Update Test Metadata and View Sources

Users must have permissions to set unit and functional test metadata (prioritize), as well as to view sources. Permissions can be granted for all tests or limited to tests owned by the user. The following table describes a project membership scenario and how permissions may be assigned (see Permissions for additional information):

User Type       Additional Permission       Access Granted
                                            View sources; Prioritize all
                                            View sources; Prioritize all
                                            View sources; Prioritize own violations
Non-member 1                                No access
Non-member 2    project                     View project data; cannot view sources; cannot prioritize
Non-member 3    project, prioritizeOwner    Cannot view sources; Prioritize own violations
Non-member 4    project, viewSources        View sources; cannot prioritize

Creating an Issue in Third-party Systems

You can connect a project in DTP to a project in a supported requirements/issue tracking system, such as codeBeamer, Jira, Polarion ALM, TeamForge, or VersionOne.

The integration enables you to create issues in the connected ALM system from tests in the Prioritization panel.

  1. Select a test in the search results area and click the Prioritization tab in the actions panel.
  2. Click the Create button and specify information about the work item you are creating.


    The name of the ALM project in which the new issue will be created appears in the Project field. The association between a DTP project and the external ALM project is defined by your DTP administrator.


    Choose the type of item to create from the drop-down menu. Terminology varies across ALMs, but DTP supports the following types of work items by default:

    • For codeBeamer and Jira, choose Bug or Task.
    • For Polarion ALM, choose Issue or Task.
    • For TeamForge, choose Defect or Task.
    • For VersionOne, choose Defect or Issue.
    Title/Summary: By default, the test name is prepended with "Review Test" and used as the issue title (TeamForge, VersionOne) or summary (codeBeamer ALM, Jira, Polarion ALM), but you can make any necessary changes.
    Description: This field populates the description field in the third-party system. It includes a link back to DTP based on the Display URL field setting in the External Application configuration page.
  3. Click Create.

An issue will be created in your external system that links back to the test in DTP. Additionally, a link to the issue will appear in the Prioritization tab, creating a bi-directional path between DTP and your external system.

Modification History Tab

Click on the Modification History tab to show when test metadata was updated. 

If you want to filter for comments, you can enable the Only show comments option.

Test History Tab

Click on the Test History tab to show when tests were run and the status of each run. You can click on a column header to sort the contents of the table.

Traces Tab

When viewing unit test data, the Traces tab shows stack traces for failed and incomplete tests. If you are viewing functional or manual tests, the tab shows error messages for test failures.

Unit Test View

Click on an entry under an error message to show the test in the source code in the source panel; see Viewing Source Code.

Functional and Manual Test View

The message associated with the selected failure displays in the tab. The specific test failure is highlighted in the source panel.

Click the Show Traffic button to view the data requested during the test and the application’s response.

Viewing Test Coverage Information

The coverage panel shows a hierarchical view of the files associated with the tests. Each node in the hierarchy shows, in brackets, how much of the file or directory is covered. Coverage information is not currently available for functional tests.

  • Click on a disclosure triangle to navigate the tree. When an item that does not have any siblings is loaded, its children are automatically loaded and displayed. This behavior cascades until the tree reaches a branch with multiple items at the same level (a decision point) or until the branch cannot be expanded any further. The coverage tree panel will not automatically expand nodes again after they have been expanded and collapsed unless the search criteria have changed or the page has been reloaded.

  • Click on a file node to view it in the source code panel; see Viewing Source Code.
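The cascading auto-expansion described above can be sketched as a simple loop. This is illustrative only; the tuple-based node shape is an assumption, not DTP's implementation:

```python
# Sketch of the cascading expansion rule: keep auto-expanding while a
# node has exactly one child, stopping at a decision point (multiple
# children at the same level) or a leaf.

def auto_expand(node):
    """node is (name, [children]); returns node names expanded in order."""
    expanded = []
    name, children = node
    expanded.append(name)
    # Cascade only while there is no decision point.
    while len(children) == 1:
        name, children = children[0]
        expanded.append(name)
    return expanded

# A chain of single-child directories ending at a decision point.
tree = ("src", [("com", [("example", [("a.c", []), ("b.c", [])])])])
print(auto_expand(tree))
```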

Details Tab

Click the Details tab to view information about the test.

You can also add some details shown in the tab to the search results table. See Navigating Explorer Views for additional information.

Viewing Source Code 

If DTP is integrated with your source control system, then you can view the tested source code in an explorer view. If integration with your source control system has not been configured, you can still view sources tested by Parasoft code analysis and test execution tools (C/C++test, dotTEST, Jtest) by setting the report.dtp.publish.src setting to full or min when configuring the tool. This instructs the tool to transfer client source code to DTP when generating the report. See the documentation for your tools for additional information.
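For example, the setting can be added to the tool's settings (.properties) file. The property name and values come from this page; the exact settings file location depends on the tool:

```
# Publish tested sources to DTP along with the report.
# Use "min" instead of "full" to publish minimal source data.
report.dtp.publish.src=full
```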

Users must also have permissions to view source code. See the note above about permissions for additional information. 

When viewing functional or manual tests, the source code panel shows a hierarchical view of the test suite data collected. No additional permissions are required to view functional test data. See Viewing Functional and Manual Test Sources for information on viewing functional test data.

Viewing Unit Test Source Code

Click on a file link in the Traces or Coverage tab to load the contents of the source file into the sources panel. This enables you to view test and coverage information in the context of the code. After clicking links in the Coverage tab of the actions panel, displayed sources are marked with color-coded flags for at-a-glance coverage information: lines with green flags are covered; lines with red flags are uncovered.

Clicking on links in the Traces tab also loads sources into the sources panel. Lines of code where the failure occurred are flagged so you can easily find the test in the stack trace.

When you make a selection in the tests table, the file name and the component that opened the file appear in the code panel.

Viewing Functional and Manual Test Sources

Click on a test in the search results panel to view functional tests performed with SOAtest 9.x or later in the Test Explorer. SOAtest 9.9.2 or later is required to view test authorship and parameterized tests. Data Collector parses the XML report file generated by SOAtest and displays the .tst file data, so the .tst file does not need to be published to DTP or stored in source control.

The selected test suite is highlighted in the panel. You can click on the disclosure triangles to navigate the test data. Each node shows how much of the test suite was executed successfully. Failed test suites are displayed in red.
