Introduction
If, after you have created Smart API tests from traffic and configured Smart API test generation, you are still having trouble with your tests, the information below may prove useful in troubleshooting the problem.
Several of the troubleshooting steps involve configuring settings in the tst_creation.properties file located in the /TestAssets/test_templates directory of your workspace. The Parasoft360 training course “SOAtest Essentials - Recorder - SMART” has a detailed walkthrough of configuring these settings.
Authentication / Authorization
Smart API Test Generator may not properly configure test steps related to authentication/authorization. Authorization headers often require a user’s attention and configuration after the test suite has been created by Smart API Test Generator. If the test suite fails early in its steps and you are seeing “401 Unauthorized” responses, this points to a problem with how the test suite is configured to handle authentication/authorization.
Does the application redirect to an external login page?
This may point to OAuth2, or a similar mechanism, where an external identity provider is handling authentication/authorization for your application under test. Generally speaking, this means a token is produced and sent back to your application on redirect, and the application then uses that token when it makes API calls.
You will want to research the technical details (usually HTTP traffic) for how your application handles login. There may be a combination of missing requests or missing/misconfigured parameterizations that result in the test client's login not including the correct data to be authenticated. See the following Troubleshooting Steps section for detailed steps to try.
If your application is indeed protected with OAuth2, it is important to research what grant types are supported with the frontend APIs.
For details about setting up OAuth2 Authorization in SOAtest, see OAuth Authentication.
If this authentication is common to many of your test scenarios, you can consider using Smart Test Templates to auto-configure the proper OAuth2 steps with test suites produced by Smart API Test Generator. See Using Smart Test Templates in OAuth 2.0 Client Credential Flows for more information.
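For reference, a client credentials token request in recorded traffic often looks similar to the following sketch (the host, path, and credential placeholders are hypothetical and will differ for your identity provider):

  POST /oauth/token HTTP/1.1
  Host: auth.example.com
  Content-Type: application/x-www-form-urlencoded

  grant_type=client_credentials&client_id=<your-client-id>&client_secret=<your-client-secret>

The response typically contains an access token that subsequent API calls send in an Authorization: Bearer <token> header. If your test suite is not reproducing an exchange like this, the downstream API calls are likely to fail with 401 responses.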
Other
If the login page is integrated within the application itself, there may still be cases where Smart API Test Generator does not configure the test suite appropriately. Researching how the browser authenticates to the frontend APIs, whether through cookies, an Authorization header, or another mechanism, plays an important role in discovering what configuration changes the test suite requires so that it properly replays the login traffic performed by the UI.
Cookies
Many applications require specific cookies to maintain session information, user preferences, or authentication status. If the initial HTTP requests do not retrieve the necessary cookies, it can lead to failures in subsequent calls. An early failure in a test suite often points to a missing HTTP request that SOAtest is not sending, but your application UI is. See the following Troubleshooting Steps section for detailed steps to try.
Missing/Incorrect HTTP Header
If your application APIs expect a specific security header from API clients, there might be a combination of missing requests or missing/misconfigured parameterizations. Knowing the expected HTTP traffic that results in authenticated API clients is crucial in troubleshooting these cases. Analyzing the raw traffic file from recording is a good first step, as it contains the exact HTTP traffic sent by your browser while exercising the end-to-end flow via the UI. See the following Troubleshooting Steps section for detailed steps to try.
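For example, you might find that every authenticated request in the raw traffic file carries headers similar to the following (the header names and values here are purely illustrative):

  GET /api/orders/1234 HTTP/1.1
  Host: app.example.com
  Authorization: Bearer eyJhbGciOi...
  X-Session-Token: a1b2c3d4e5

If the corresponding test client in the generated test suite is missing one of these headers, or is sending a stale hard-coded value instead of a data-banked one, that is a likely cause of the authentication failures.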
Test Failures / Bad Outcome
If your test suite is getting past the authentication/authorization steps and your application is successfully returning data from its functional APIs to your test clients, then that eliminates one class of common configuration problems. However, you may still be experiencing failing test steps, or the desired outcome of the test may not be manifesting in the application when you go check after running the test suite.
There could be several issues occurring simultaneously. You might have an important request missing from the test suite, or data banks failing to extract data or extracting the wrong data, or there could be hardcoded data that did not get data banked. These problems can lead to cascading issues in subsequent test steps. See the following Troubleshooting Steps section for detailed steps to try.
Test Suite Was Working - Now Failing / Bad Outcome
Has there been a new build deployed in the test environment? It is likely that the business logic of the scenario you are testing has changed in some way at the API layer. You may need to re-record the UI flow of the use case to capture new API traffic and create a test suite that reflects the latest behavior of the UI interactions with the experience APIs.
If the UI tests are automated, the recording process can be injected into your UI test framework using the Smart API Test Generator REST API.
See the following example repo that uses Java Selenium: Selenium Recorder API Example.
Troubleshooting Steps
Missing Requests
A good troubleshooting step is to review the raw traffic file from recording to see if there are any HTTP requests missing in the SOAtest test suite that are present in the traffic file. After the TST file is created from recording, your SOAtest workspace will contain the raw traffic file under /TestAssets/users/<USERNAME>/recorded_traffic.
See Test Creation Properties for more information about working with the tst_creation.properties file.
By default, Smart API Test Generator filters out HTTP traffic of content types other than application/json and application/x-www-form-urlencoded. You may need to adjust the tst_creation.properties file so that it properly filters the HTTP traffic from recording. For example, there may be a different content-type relevant to the flow of HTTP calls necessary for your test suite to function correctly; you can add it to the includeContentTypes property. Additionally, if your application server does not return a content-type with its response, it may also be filtered out. For cases like this, consider working with the includeURLPatterns and excludeURLPatterns properties.
This filtering is an important step in the configuration of Smart API Test Generator because some of the browser traffic is extraneous to testing the API layer of your application (for example, HTML, CSS, JS, PNG, and other such content) and the default settings may be filtering out too much.
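For reference, a minimal sketch of these filtering properties in tst_creation.properties is shown below. The added content type and URL pattern are assumptions you would replace with values relevant to your application; see Test Creation Properties for the exact value syntax.

  # Allow an additional content type that your APIs return
  includeContentTypes=application/json,application/x-www-form-urlencoded,application/xml
  # Keep traffic whose URL matches your API paths, even when no content-type is returned
  includeURLPatterns=.*/api/.*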
Data Bank Failure
This means that a data bank was configured with a particular XPath and it is unable to locate the element from the response received by the SOAtest client.
A good troubleshooting step is to ensure the disableDiffCreation property is set to false in your tst_creation.properties file. This will create Diff tools on each test step where the live response will be compared to an expected response based on the originally recorded HTTP traffic.
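A minimal sketch of this setting in tst_creation.properties:

  # Keep Diff tool creation enabled so live responses can be compared to recorded ones
  disableDiffCreation=false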
See Test Creation Properties for more information about working with the tst_creation.properties file.
Start troubleshooting with the first failing test in the test suite. Check that the response payload looks correct or valid. If there is a major difference in the actual versus expected response, such as receiving an error, then this particular test step may be misconfigured, or the issue might originate from an earlier test step. It is common to see a chaining effect of data throughout the test clients in a test suite created by Smart API Test Generator. If the same data element exists at multiple points in the sequence of API responses, Smart API Test Generator prefers re-extracting it at each response rather than using the original test step where the data first appeared.
Review the configuration of the failing test client, paying special attention to request parameters that are parameterized to data bank variables referencing preceding test steps. This can give you clues to trace back to an earlier test step that may be the origin of where the scenario is starting to fail, even if that preceding test step is passing.
If the response payload looks correct and is similar to the expected response, then analyze the XPath configuration of the data bank to see if the structure of the JSON has varied at all from the originally recorded response captured in the Diff tool (or by examining the raw traffic file). Is the expected data value present in the response, but in a different order or within an array? It may be necessary to improve the XPath of the data bank extraction by incorporating XPath conditions and functions so it more effectively locates the data element.
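As a hypothetical illustration (the element names below are invented), an extraction that relies on position can be made more resilient by matching on a stable sibling value instead:

  Brittle, position-based XPath:  /root/orders[2]/id
  Condition-based XPath:          /root/orders[status="PENDING"]/id

The condition-based version continues to locate the correct element even if the order of entries in the response changes between runs.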
Data Bank Extracting Wrong Data
This may happen when the data to be extracted is contained in an array. Smart API Test Generator may be generating an XPath that does not handle how the response changes with subsequent runs of the test scenario, and this can be exposed with data elements contained within an array. You will want to improve the XPath by incorporating XPath functions or parameterizing the XPath itself, so that it reliably selects the correct index of the array for data extraction.
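For example (again with invented element names), an XPath function or a parameterized predicate can select the correct array entry even when its position changes between runs:

  Select the last entry:          /root/items[last()]/id
  Select by a data-banked value:  /root/items[name="${itemName}"]/id

Here ${itemName} is assumed to be a data bank variable extracted in an earlier test step; the exact syntax for parameterizing an XPath depends on how the extraction tool is configured in your version of SOAtest.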
Parameterized Data That Should Be Hard Coded
There may be cases where Smart API Test Generator is overzealously parameterizing data that should remain hard coded. For example, if a response payload includes some data that coincidentally matches a following request parameter, but those data elements are unrelated to one another, it could lead to test suite failures when the data changes upon re-running the tests. To address this, you can use the requestPayloadParameterizationExcludeNames property in the tst_creation.properties file.
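A minimal sketch, assuming the payload fields id and accountNumber are the ones being incorrectly parameterized (see Test Creation Properties for the exact value syntax):

  # Keep these request payload fields hard coded instead of parameterizing them
  requestPayloadParameterizationExcludeNames=id,accountNumber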
See Test Creation Properties for more information about working with the tst_creation.properties file.
Hard-Coded Data That Should Be Parameterized
There may be cases where Smart API Test Generator hard coded some data that needs to be parameterized for a successful outcome with the test suite. For example, this could happen if the content-type of the request is not supported for parameterization, like multipart/form-data. In these cases, you will need to manually parameterize the hard-coded value with the proper dynamic data that has been extracted from an earlier test step.
There may be a case where important dynamic data is only available on an HTML page (it cannot be found from API traffic and only exists as hard-coded data in HTML). This data might be required for the request parameters in subsequent API test clients. For this situation, the tst_creation.properties file has settings that can be configured so that Smart API Test Generator will search for such data to be data banked, namely:
includeTextResponseContentTypes: Set this property to the content-type of response traffic that contains the relevant data (for example, text/html).
includeTextResponseValues: Set this property to a regular expression that uniquely identifies the data to be found in the text response.
SOAtest will create a Text Data Bank using regular expressions for left and right text boundaries to extract such dynamic data to be used in subsequent API test clients.
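Putting the two properties together, a sketch for capturing a token embedded in an HTML page might look like the following; the regular expression is illustrative (it assumes a 32-character hexadecimal token) and would need to be tailored to your application's pages:

  # Search text/html responses for data to data bank
  includeTextResponseContentTypes=text/html
  # Regular expression that uniquely identifies the value to extract
  includeTextResponseValues=[A-Fa-f0-9]{32}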
See Test Creation Properties for more information about working with the tst_creation.properties file.
Failing Assertions
If you have configured Smart API Test Generator to create Assertions, it will identify similar responses (for example, repeated requests to the same resource) and then analyze how the response data has changed to find candidates for assertions. You should review that these assertions are doing what you expect. Similar to the data bank and parameterization troubleshooting, it might generate an inadequate XPath for the data being asserted. In such cases, consider incorporating XPath conditions and functions. You may also try applying a different type of assertion (for example, a regular expression or range assertion) for data that is expected to change over multiple runs.
There's also the possibility that it may assert on data that is not significant for your tests. In this case, consider working with the assertorToolIgnore properties in the tst_creation.properties file. The best practice is to review these assertions and ensure they are detecting unexpected changes that could highlight a regression in your application.
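As a hypothetical sketch (the exact assertorToolIgnore* property names and value syntax are documented in the Test Creation Properties reference below), the intent is to exclude volatile or insignificant fields from the generated assertions:

  # Do not generate assertions for fields whose values change on every run
  assertorToolIgnoreNames=lastUpdated,requestId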
See Test Creation Properties for more information about working with the tst_creation.properties file.