In the first and second posts of this series I discussed benchmarking and different approaches towards optimizing web applications. As already mentioned in the last post, as soon as you get used to working with Key Performance Indicators you will want to monitor them automatically.

ShowSlow is a great example of how you can track performance metrics over time. dynaTrace also sends performance data to the public ShowSlow instance on a daily basis. You can read more on how we test about 3000 URLs every day.

Beyond providing information to ShowSlow, the major motivation for implementing these automated tests was to find out how complex it is to build an automated performance test environment. Honestly, we struggled a bit in the beginning. The main challenge was to get the tests to run stably and unattended, as we experienced problems with tests that simply got stuck during execution. By implementing a watchdog that monitors execution and, if necessary, restarts the test, we got our tests stable.
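
Our watchdog is part of our internal test infrastructure, but the idea is simple enough to sketch. A minimal version in Python could look like the following; the timeout and retry values are hypothetical, and the command is whatever starts a single test run:

import subprocess
import sys

TEST_TIMEOUT = 600   # seconds before a run counts as stuck (tune to your suite)
MAX_RETRIES = 3      # how often a stuck run is restarted before giving up

def run_with_watchdog(cmd):
    """Run a test command, killing and restarting it if it hangs."""
    for attempt in range(1, MAX_RETRIES + 1):
        proc = subprocess.Popen(cmd)
        try:
            proc.wait(timeout=TEST_TIMEOUT)
            return proc.returncode
        except subprocess.TimeoutExpired:
            # The test got stuck: kill it and start over.
            print("Attempt %d timed out after %ds, restarting" % (attempt, TEST_TIMEOUT))
            proc.kill()
            proc.wait()
    return 1

if __name__ == "__main__":
    sys.exit(run_with_watchdog(sys.argv[1:]))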

The other challenge was to keep the effort of turning a regular web test into a performance test as low as possible, and to provide easy access to performance data. We already had all the detailed data in the dynaTrace session. What was missing, however, was even easier access to high-level KPIs like those shown on ShowSlow. Therefore we implemented additional functionality in dynaTrace Ajax Edition which automatically sends JSON data to a REST endpoint.

The beacon contains all the major metrics you might want to track for your web application, like timings, grades and content size information. Taken together, these metrics provide a pretty good overview of how your performance changes over time. Below you can see a sample dynaTrace beacon.

{
  "version": "2.0.0",
  "url": "www.mydomain.com",
  "rank": 95,
  "ranks": {
    "cache":  { "rank": 100, "comment": "" },
    "net":    { "rank": 98,  "comment": "" },
    "server": { "rank": 94,  "comment": "2 dynamic server-requests" },
    "js":     { "rank": 89,  "comment": "4 slow JS handlers" }
  },
  "timetoimpression": 1587,
  "timetoonload": 1645,
  "timetofullload": 2747,
  "reqnumber": 9,
  "xhrnumber": 0,
  "pagesize": 264562,
  "cachablesize": 0,
  "noncachablesize": 264562,
  "timeonnetwork": 400,
  "timeinjs": 30,
  "timeinrendering": 200
}
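
Nothing like the following ships with the beacon itself, but to illustrate how such a beacon could gate a Continuous Integration build, here is a small Python sketch that checks a stored beacon against KPI thresholds. The threshold values and the beacon.json file name are made up for the example:

import json

# Hypothetical KPI thresholds; choose values that fit your application.
THRESHOLDS = {"rank": 90, "timetoonload": 2000, "timetofullload": 3000}

def check_beacon(beacon):
    """Return a list of KPI violations for a single beacon."""
    violations = []
    if beacon["rank"] < THRESHOLDS["rank"]:
        violations.append("rank %d below %d" % (beacon["rank"], THRESHOLDS["rank"]))
    for metric in ("timetoonload", "timetofullload"):
        if beacon[metric] > THRESHOLDS[metric]:
            violations.append("%s %dms above %dms" % (metric, beacon[metric], THRESHOLDS[metric]))
    return violations

with open("beacon.json") as f:
    print(check_beacon(json.load(f)) or "all KPIs within thresholds")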

In order to send beacon data automatically you have to configure a couple of parameters. First you have to enable automatic beacon upload and specify an endpoint to send it to; both parameters have to be defined in the dtajax.ini file. A third option enables you to automatically open a portal page directly from a dynaTrace Ajax Edition session. After setting these parameters you are ready to use your own ShowSlow instance:

-Dcom.dynatrace.diagnostics.ajax.beacon.uploadurl=http://localhost:8080/beaconstorage/endpoint
-Dcom.dynatrace.diagnostics.ajax.beacon.portalurl=http://localhost:8080/beaconstorage/endpoint
-Dcom.dynatrace.diagnostics.ajax.beacon.autoupload=true
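
On the receiving side, any endpoint that accepts the posted JSON will do, for example your own ShowSlow instance. For local experiments, a stand-in could look like the following Python sketch. It assumes the beacon arrives as the body of an HTTP POST and simply appends each beacon to a file instead of storing it properly:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class BeaconHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the beacon payload from the request body.
        length = int(self.headers.get("Content-Length", 0))
        beacon = json.loads(self.rfile.read(length))
        # Append to a file; a real store would index by URL and
        # timestamp so that trends can be tracked over time.
        with open("beacons.jsonl", "a") as f:
            f.write(json.dumps(beacon) + "\n")
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # Listens on the host/port used in the uploadurl shown above.
    HTTPServer(("localhost", 8080), BeaconHandler).serve_forever()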

We already blogged about how to automatically start the dynaTrace agent from Watir or Selenium. In fact, it is nothing more than setting two environment variables:

set DT_IE_AGENT_ACTIVE=true
set DT_IE_SESSION_NAME=MySession
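
The same variables can also be set from within the test script, right before the browser is launched. Here is a sketch using Selenium's Python bindings; the driver setup is an assumption and the page URL is just an example:

import os
from selenium import webdriver

# Activate the dynaTrace agent for the IE instance launched below.
# Child processes inherit these environment variables.
os.environ["DT_IE_AGENT_ACTIVE"] = "true"
os.environ["DT_IE_SESSION_NAME"] = "MySession"

driver = webdriver.Ie()
try:
    driver.get("http://www.mydomain.com")
    # ... your existing functional test steps go here ...
finally:
    driver.quit()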

As you can see, it is really easy to convert your functional Selenium tests into performance tests. Most of these changes won't take more than a couple of minutes, and afterwards your functional tests are enriched with performance data. I am convinced we could not make it simpler …

… well, this is not 100 percent true. Over the last year we had a lot of conversations with Ajax Edition users. Many of them started using the Ajax Edition in their Continuous Integration environments and talked about the additional requirements they had. The wish list always contained the following items:

  • Support distributed tests without requiring the dynaTrace client to run them.
  • Allow tracking timers within a page and across multiple pages.
  • Provide an easier way to compare test runs down to code level, rather than manually comparing copy-and-pasted XML exports.
  • Automatically detect when tests or certain metrics get worse and send notifications.
  • Automatically detect test case names.
  • Provide a common repository for performance data and sessions.

All these requirements made a lot of sense to us. Together with a couple of others, like the integration of server-side data, they strongly influenced the move we made towards providing a solution with all these capabilities built in. This led to the development of a commercial product that enables more efficient, large-scale use of dynaTrace in Ajax environments.

Conclusion

Automated performance testing of web applications is not a big hassle when you already have functional tests in place. Using dynaTrace for automatic data capturing provides performance data out of the box, and repositories like ShowSlow enable you to store and monitor that data over time. The demand we saw while developing an automation solution shows that interest in automated performance testing is growing.