The Test Results dashlet tracks UEM visits tagged as tests.
- The application under test must be instrumented with Agents.
- UEM must be configured to capture visits.
- Open the System Profile Preferences dialog box, and select the Test Automation item.
- Select the Enable Browser/UI-Driven Tests checkbox.
- Select the test application and click Configure to automatically configure all required UEM settings. The following settings are automatically configured but can be reverted/adapted manually:
Mark a test as failed
All tests are marked as passing by default. Do the following to manually control the test result.
- Define a metadata variable with the id `testRunSuccess`. If a metadata variable with the id `testRunSuccess` exists, tests are marked as failing by default. This metadata variable is not configured automatically when configuring UEM applications for tests.
- With `testRunSuccess` defined, a test is marked as passing if at any point during the visit an action arrives with a `testRunSuccess` expression value of `true`. Set the value to `true`, then navigate to another page to propagate the metadata to the server using an action.
- Setting `testRunSuccess` to any value other than `true` has no impact on the test result. You cannot reset the value once the test is marked as passing using this expression. For example, assigning a `false` value has no impact.
- Make sure an action follows the point at which the value is set, and use the `sessionStorage` prefix for the metadata capture: the value is likely not picked up by the first action in which it is set, so it must survive until the next action.
See UEM Web Settings - Metadata Capturing for more information.
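The pass/fail rule above can be sketched as follows. This is illustrative only: `evaluateTestResult` and the `visitActions` shape are stand-ins for the server-side evaluation, not an AppMon API.

```javascript
// Sketch of the testRunSuccess rule described above (illustrative only):
// with the metadata variable defined, a test starts as failing and is
// marked passing once any action carries testRunSuccess === "true";
// later values, including "false", cannot reset the result.
function evaluateTestResult(visitActions) {
  let passed = false;
  for (const action of visitActions) {
    const metadata = action.metadata || {};
    if (metadata.testRunSuccess === "true") {
      passed = true; // once passing, the result cannot be reset
    }
  }
  return passed;
}
```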
The typical sequence for a browser test is:
- Open the Browser
- Run the Test
- Close the Browser
Only the following changes are needed in your script to track performance metrics for your tests using AppMon.
- Tag the UEM visit with a test name and, optionally, a test run id.
- End the UEM visit before closing the browser (optional, but recommended for best performance).
Tag the UEM visit as a test
Make sure `sessionStorage` is set after the browser has navigated to the domain in which the test executes. Session storage is scoped according to the same-origin policy, within a browser window/tab.
```javascript
sessionStorage.DT_TESTNAME = "myTest";
// test run id is optional
sessionStorage.DT_TESTRUNID = "12345...";
```
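The tagging step fits into the open/run/close sequence as in the following sketch. `FakeBrowser`, the URL, and the test names are placeholders standing in for a real browser automation session (for example, WebDriver); nothing here is an AppMon API.

```javascript
// Illustrative only: FakeBrowser mimics a browser automation session.
class FakeBrowser {
  constructor() { this.sessionStorage = {}; this.closed = false; }
  navigate(url) { this.currentUrl = url; }          // "Open the Browser"
  executeScript(fn) { return fn(this.sessionStorage); }
  close() { this.closed = true; }                   // "Close the Browser"
}

function runTaggedTest(browser) {
  // Navigate to the application's domain first, so the sessionStorage
  // values land in the correct origin.
  browser.navigate("https://app.example.com/");     // placeholder URL
  browser.executeScript(ss => {
    ss.DT_TESTNAME = "myTest";
    ss.DT_TESTRUNID = "run-001";                    // test run id is optional
  });
  // ... "Run the Test" steps go here ...
  browser.close();
  return browser;
}
```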
End the visit before closing the browser (optional)
Test automation data is calculated after the visit ends. Because of this, end the visit before closing the browser to speed up test results; otherwise, the visit times out at the configured visit timeout.
Allow up to 10 seconds after invoking `endVisit()` before closing the window, so the call can propagate to the AppMon Server.
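That grace period can be sketched as below. The `browser` object and `endVisitScript` parameter are illustrative stand-ins, not AppMon or WebDriver APIs; `graceMs` defaults to the 10 seconds suggested above.

```javascript
// Illustrative only: wait after the end-visit call before closing,
// so it has time to propagate to the AppMon Server.
function waitMs(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

async function endVisitAndClose(browser, endVisitScript, graceMs = 10000) {
  browser.executeScript(endVisitScript); // e.g. a call into the UEM JavaScript agent
  await waitMs(graceMs);                 // allow propagation to the server
  browser.close();
}
```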
Measures based on the following metrics/metric groups are captured during test executions:
|Metric Group|Metric|Notes|
|---|---|---|
|Methods|Invocation|All custom method invocation measures that return a measure for a PurePath are included.|
|Web Requests|Number of Requests| |
|Web Requests|Bytes Received|Request size capture must be enabled in the sensor configuration. See Java Web Service sensor, Servlet sensor, or Web Server sensor for more information.|
|Web Requests|Bytes Sent|Request size capture must be enabled in the sensor configuration. See Java Web Service sensor, Servlet sensor, or Web Server sensor for more information.|
|W3C Timings|W3C Resource Count|See W3C Resource Timing Metrics for more information.|
|W3C Timings|W3C Resource Timing|See W3C Resource Timing Metrics for more information.|
|Visits|Failed Actions Count| |
All measures captured by the Test Automation feature must have agent and application splitting enabled.
To do this:
- Open the System Profile Preferences dialog box, and select the Measures item.
- Double-click the required measure to edit it.
- In the Details tab, expand the Measure splitting list and select the Create measure for each application and Create measure for each agent check boxes.
- Save your changes.