I am currently working on a blog post that explains how to do cross-browser testing and performance analysis using dynaTrace. Before I publish that How-To I want to share one thing I noticed when executing my tests in Internet Explorer 8 and Firefox 3.6: test execution times are very different – but – this is not because one browser is slower than the other. It is because Selenium uses different synchronization mechanisms in IE and FF, e.g. to wait for a page to finish loading.

My Environment

I ran a simple test on the latest 2.0 Beta build of Selenium. The test executes the following steps:

  • Opens the homepage of my test application
  • Enters username and password and hits the Submit Button
  • Then clicks through 3 different pages by clicking on standard links
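The steps above can be sketched with the Selenium 2 WebDriver API. The URL, element locators and link texts below are placeholders, not those of my actual test application:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginClickThrough {
    public static void main(String[] args) {
        // Swap in InternetExplorerDriver to run the same scenario on IE
        WebDriver driver = new FirefoxDriver();

        // Step 1: open the homepage of the test application (placeholder URL)
        driver.get("http://example.com/");

        // Step 2: enter username and password and hit the Submit button
        driver.findElement(By.name("username")).sendKeys("testuser");
        driver.findElement(By.name("password")).sendKeys("secret");
        driver.findElement(By.id("submit")).click();

        // Step 3: click through 3 different pages via standard links
        driver.findElement(By.linkText("Page 1")).click();
        driver.findElement(By.linkText("Page 2")).click();
        driver.findElement(By.linkText("Page 3")).click();

        driver.quit();
    }
}
```

Running the identical scenario against both browsers is what makes the timing comparison below possible in the first place.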

Execution Times over Time

I executed this test multiple times. In each iteration I first run the test on Internet Explorer 8 and then on Firefox 3.6, using the dynaTrace AJAX Premium Extensions to track performance of every execution. The Test Automation View allows me to compare different metrics over time and also across the two browsers. The following screenshot shows the execution time of the test scenario for the last 6 test iterations in both IE and FF. The green line represents IE, with an execution time that is about 1.5s slower than FF.

Execution Time of this test is about 1.5 seconds slower on Internet Explorer than Firefox - but - it is not because IE is slower

So – looking at the execution time alone would suggest that IE is simply much slower than Firefox. But – that is actually not the case.

Detailed Timeline Analysis

dynaTrace allows me to compare two tests. I can compare Network Downloads, Rendering Activity, JavaScript and DOM Executions or any Server-Side activity that these tests executed. The first view that I get to see is the Timeline Comparison. When we compare the test timeline of both tests we see this big gap in Internet Explorer where nothing really happens:

Comparing the test executed in IE with the one in FF side-by-side reveals a big time gap where Selenium just waits

What this really tells us is that looking at the execution times of tests is not a good idea when we want to compare test performance across browsers. Selenium seems to have different ways to wait for certain events to happen, and depending on the browser this can have a significant impact on execution time.
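A minimal toy model (with hypothetical numbers, not measured values) illustrates the pitfall: the wall-clock time of a test is browser work plus the driver's synchronization wait, so two browsers doing the same amount of real work can still report very different totals:

```java
public class WallClockPitfall {
    // Wall-clock test time = browser work + driver synchronization wait
    static long total(long workMs, long syncWaitMs) {
        return workMs + syncWaitMs;
    }

    public static void main(String[] args) {
        // Hypothetical numbers: both browsers do the same real work
        long workMs = 3000;              // JS + network + rendering
        long ieTotal = total(workMs, 1500); // IE driver waits longer
        long ffTotal = total(workMs, 0);    // FF driver returns immediately

        System.out.println("IE total: " + ieTotal + " ms, FF total: " + ffTotal + " ms");
        System.out.println("Gap caused by synchronization, not browser speed: "
                + (ieTotal - ffTotal) + " ms");
    }
}
```

In this model the entire 1.5s gap comes from waiting, even though neither browser executed its work any faster than the other.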

Does this mean we can’t do performance tests with Selenium?

No – we can do performance tests – but – we need to look at different metrics. The dynaTrace AJAX Edition calculates a number of KPIs such as Time Spent in JavaScript, Time Spent in Network Downloads, Number of Resources, Size of Resources, … – these are the metrics that are more interesting for us. With dynaTrace AJAX Premium I can also compare these results between browsers for my test. This paints a much better picture – in fact – it seems the JavaScript code runs faster in IE than in FF:
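The kind of comparison this boils down to can be sketched as follows. The KPI numbers here are purely hypothetical and only stand in for the metrics named above; they are not the values from my measurements:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KpiComparison {
    // Decide which browser wins on a single KPI (lower is better)
    static String faster(long ie, long ff) {
        return ie < ff ? "IE" : (ff < ie ? "FF" : "tie");
    }

    public static void main(String[] args) {
        // Hypothetical per-browser KPIs: {IE value, FF value}
        Map<String, long[]> kpis = new LinkedHashMap<String, long[]>();
        kpis.put("JavaScript time (ms)", new long[]{420, 610});
        kpis.put("Network time (ms)",    new long[]{1100, 1050});
        kpis.put("Resources (count)",    new long[]{34, 34});

        for (Map.Entry<String, long[]> e : kpis.entrySet()) {
            long ie = e.getValue()[0], ff = e.getValue()[1];
            System.out.printf("%-22s IE=%5d FF=%5d -> %s%n",
                    e.getKey(), ie, ff, faster(ie, ff));
        }
    }
}
```

Comparing KPI by KPI like this separates what the browser actually did from how long the test harness happened to wait.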

It is important to compare other metrics than just overall test execution times when you want to compare cross browser performance

Watch out for my How-To blog post. In the meantime you can familiarize yourself with the free dynaTrace AJAX Edition and the Premium Extensions.