UI-driven browser test

Functional tests from software testing frameworks such as Selenium, QuickTest Professional, or SilkTest can easily be turned into performance tests. You can track architectural metrics such as the number of images, JavaScript files, third-party requests, and database and web service calls for each test execution.

The Test Automation Dashlet tracks UEM visits tagged as tests.

For AppMon 6.5 and later, Browser Tests use the UEM JavaScript Agent to capture the relevant metrics. In earlier versions, the now deprecated Browser Agent was used to capture the data. See AppMon 6.3 documentation for more information about Browser Agent Integration.

Requirements

Configuration

  1. Enable Browser/UI-Driven Tests in the Test Automation Settings.
  2. Select the test application and click Configure to automatically configure all required UEM settings. The following settings are automatically configured but can be reverted/adapted manually:

Mark a test as failed

All tests are marked as passing by default. Do the following to manually control the test result.

  1. Configure an additional JavaScript metadata variable to be captured with the id testRunSuccess. If a metadata variable with the id testRunSuccess exists, tests are marked as failing by default. This metadata variable is not configured automatically when you configure UEM applications for tests.

  2. With testRunSuccess defined, a test is marked as passing if, at any point during the visit, an action arrives with a testRunSuccess expression value of true. For example, a test may set the JavaScript variable testRunSuccess to true and then navigate to another page so that an action propagates the metadata to the server.

Setting testRunSuccess to any value other than true has no impact on the test result. You cannot reset the value once the test is marked as passing using this expression. For example, assigning a false value has no impact.

Make sure the action following testRunSuccess completes normally. You can use the Selenium construct wait.until(ExpectedConditions.elementToBeClickable(By.id("XXX"))) to do this. If the action does not complete, it may not be picked up by the JavaScript agent, and if this occurs, the test result is not forwarded to AppMon.

The sessionStorage prefix is used because the value is likely not picked up by the first action in which it is set, and it must survive until the next action.
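
The following Selenium sketch ties these pieces together. It assumes testRunSuccess is configured as a metadata variable captured with the sessionStorage prefix (for example sessionStorage.testRunSuccess), that driver and js are set up as in the Java examples later in this section, and that the element ids are placeholders for your application:

// Mark the test as passing; the value is stored in sessionStorage so it survives until the next action
// (assumes a metadata variable with id testRunSuccess captured via the sessionStorage prefix).
js.executeScript("sessionStorage.testRunSuccess = \"true\";");

// Trigger a follow-up action, for example a navigation, so the metadata is sent to the server.
driver.findElement(By.id("next-page-link")).click(); // placeholder element id

// Wait until the follow-up page is interactive, so the action completes and the result reaches AppMon.
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.elementToBeClickable(By.id("result-element"))); // placeholder element id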

See UEM Web Settings - Metadata Capturing for more information.

Integration

The typical sequence for a browser test is:

  • Open the Browser
  • Run the Test
  • Close the Browser

Only the following changes are needed in your script to track performance metrics for your tests using AppMon; a combined sketch follows the list.

  • Tag the UEM visit with a test name and (optional) with a test run id.
  • End the UEM visit before closing the browser (optional, but recommended for best performance).
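
A minimal Selenium skeleton showing where these calls fit, assuming a Chrome driver and placeholder values for the application URL; the test name matches the example below:

WebDriver driver = new ChromeDriver();
JavascriptExecutor js = (JavascriptExecutor) driver;

// Open the browser and navigate to the application under test first,
// so that sessionStorage is set within the correct origin.
driver.get("https://example.com/app"); // placeholder URL

// Tag the UEM visit as a test (the test run id is optional).
js.executeScript("sessionStorage.DT_TESTNAME = \"myTest\";");

// ... run the actual test steps ...

// End the UEM visit so test automation data is calculated right away.
js.executeScript("dynaTrace.endVisit();");

// Allow time for endVisit() to reach the AppMon Server before quitting (see below), then close the browser.
driver.quit();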

Tag the UEM visit as a test

Using the metadata capturing feature, the JavaScript agent picks up the test name and test run id. Ensure that sessionStorage is set after the browser has navigated to the domain in which the test executes: session storage is scoped per browser window/tab according to the same-origin policy.

sessionStorage.DT_TESTNAME = "myTest";
// test run id is optional
sessionStorage.DT_TESTRUNID = "12345...";

Java Example

JavascriptExecutor js = (JavascriptExecutor) driver;
js.executeScript("sessionStorage.DT_TESTNAME = \"myTest\";");
// test run id is optional
js.executeScript("sessionStorage.DT_TESTRUNID = \"12345...\";");

End the visit before closing the browser (optional)

Test automation data is calculated after the visit ends. Because of this, end the visit before closing the browser to speed up test results; otherwise, the visit times out at the configured visit timeout. After calling endVisit(), allow up to 10 seconds before closing the window so that the endVisit() call can propagate to the AppMon Server.

dynaTrace.endVisit();

Java Example

js.executeScript("dynaTrace.endVisit();");

Measures

Measures based on the following metrics and metric groups are captured during test execution:

Category         Metric Name             Notes
Remoting         Count
Database         Execution Count
Methods          Invocation              All custom Method Invocation Measures that return a measure for a PurePath will be included.
Exceptions       Count
Logging          Count
Web Requests     Number of Requests
Web Requests     Bytes Received
Web Requests     Bytes Sent
Web Services     Count
W3C Timings      W3C Resource Count      See W3C Resource Timing Metrics for more information.
W3C Timings      W3C Resource Timing     See W3C Resource Timing Metrics for more information.
Client Errors    Count
Visits           Action Count
Visits           Failed Actions Count

Samples

See https://github.com/Dynatrace/Dynatrace-Test-Automation-Samples/tree/master/selenium.