The Super Bowl Effect on Website Performance

Whether you are a fan of American football or not, it was hard to avoid this huge sporting event on February 5th. Beyond the actual game, it is the Super Bowl commercials that, besides being very expensive to air, usually drive a lot of load on the websites of the companies that run them. The question is whether the millions of dollars spent really drive consumers to these websites and get them to do business with these companies.

As we won't get answers from the top brands about their actual conversion rates, we can instead look at the end-user experience and performance of their websites while the ads were aired. By analyzing data gathered through continuous synthetic monitoring, combined with deep-dive browser diagnostics, we can see whether a web application was actually able to handle the load without leaving too many users with a bad experience.

Use Case: The User Experience of One of the Top Brands on Super Bowl Day

To avoid any debate about whether the numbers we present are good or bad, or whether this company did a good or terrible job, we present the data without naming the company. The purpose of this exercise is to show how to perform this type of analysis.

Synthetic Monitoring of a Web Site

To monitor the performance of a site, we set up a Gomez Synthetic Monitor to be executed on a scheduled basis. The monitor not only measured the performance of the initial home page but also walked through several main use-case scenarios on that page, e.g. searching for a product. These tests were executed from multiple US locations, over both low- and high-bandwidth connections, and with a mixture of Internet Explorer and Firefox browsers. Additionally, we captured individual performance sessions using Dynatrace Browser Diagnostics.

The following screenshot shows the response time of the site's home page, measured from our Last Mile platform with a Firefox browser, in the days before the Super Bowl as well as during it. The average page load time was around 9 seconds, with the exception of the Super Bowl timeframe, when it jumped up to 30 seconds:

Synthetic Monitoring shows peak during the Super Bowl of up to 30s to load the Homepage

Analyzing Page Load Time from Monitoring Data

There were two factors that drove this spike:

  1. Higher load on their application due to the commercial aired during the Super Bowl
  2. Additional content on their page that doubled the page size

Factor 1: Higher Load

The higher load is of course intended, and a success for the marketing campaign. What you want to make sure is that you can handle the additional expected load: use CDNs (Content Delivery Networks) to deliver static content to your end users, and provide additional resources on your web and application servers to absorb the extra traffic. To be prepared, it is highly recommended to do up-front large-scale load testing. For more information, read my recent blog post on To Load Test or Not to Load Test: That is not the Question.

Factor 2: Additional Content

The average size of the home page jumped from 776 KB to 1,497 KB, almost double the original size. The reason is the additional images and content displayed on the home page when you accessed it during the Super Bowl. The Gomez Monitor as well as the Dynatrace Browser Diagnostics sessions provide detailed resource analysis that immediately highlights the additional images, style sheets and JavaScript files. The following shows some of the additional Super Bowl related content, including size and download time:

Additional content downloaded during the Super Bowl resulting in up to 3 times higher page load times

The additional content may well have been necessary for the marketing campaign. The question is whether it could have been optimized so that it did not require downloading 40 additional resources with a total size of more than 700 KB.
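This kind of page-weight accounting is easy to script once you have a resource list exported from a waterfall chart. A small sketch, where the entries are made-up placeholders rather than the actual resources from the monitored site:

```javascript
// Hypothetical campaign resources as a waterfall export might list them.
// URLs and sizes are illustrative, NOT taken from the actual site.
const campaignResources = [
  { url: '/campaign/hero-background.jpg', bytes: 310000 },
  { url: '/campaign/promo-carousel.js',   bytes: 95000 },
  { url: '/campaign/promo.css',           bytes: 42000 },
  // ...plus the remaining campaign images, scripts and style sheets
];

// Total added page weight in kilobytes.
function addedWeightKb(resources) {
  const bytes = resources.reduce((sum, r) => sum + r.bytes, 0);
  return Math.round(bytes / 1024);
}

console.log('added weight:', addedWeightKb(campaignResources), 'KB');
```

Comparing that number against a weight budget for the page is a quick way to catch a campaign that, as in this case, roughly doubles the bytes every visitor has to download.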

Deep Dive Browser Diagnostics

Looking at the Dynatrace Browser Diagnostics Data we can observe several interesting aspects.

Observation 1: Load Impact of additional resources

We can see the actual impact of the additional resources downloaded during the Super Bowl. Two of these resources took a very long time and therefore had a major impact on page load time. The background image was delivered by the company's own web server rather than a CDN, which means more traffic on their web servers and suboptimal performance for end users far away from their data centers. Another interesting aspect was an additional Facebook Like button that took 600ms to download and execute.

Dynatrace Timeline View showing the impact of additional resources on the page load time
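One common mitigation for slow third-party widgets such as the Like button is to inject their script asynchronously, so a slow download does not block HTML parsing and rendering. A sketch of the pattern, where the function name and URL are placeholders of my own and not the actual widget code:

```javascript
// Inject a third-party widget script asynchronously so a slow response
// (like the 600ms Like button observed above) does not block rendering.
function loadWidgetAsync(src) {
  const script = document.createElement('script');
  script.async = true; // download without blocking the HTML parser
  script.src = src;
  const first = document.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(script, first);
  return script;
}

// In the page you would call, e.g.:
// loadWidgetAsync('https://widgets.example.com/like-button.js');
```

The widget then renders whenever its script arrives, while the rest of the page stays interactive.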

Observation 2: Page Transition Impact of CDN User Tracking Logic

Some tracking tools send their data in the onbeforeunload event handler. Modern browsers, however, don't allow onbeforeunload handlers to take too long or to reliably send data. The workaround is to insert an "artificial" JavaScript loop that spins for 750ms, ensuring the browser sends the AJAX request with the tracking data before it navigates to the next page. We can see this behavior on this page as well:

JavaScript tracking code from a CDN provider adding a 750ms loop to ensure tracking data gets sent before navigating to the next page
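The pattern described above can be sketched in a few lines. This is an illustrative reconstruction of the workaround, not the vendor's actual code:

```javascript
// Reconstruction of the tracking workaround: fire the tracking request,
// then busy-wait ~750ms so the browser does not navigate away before the
// request has left the machine. (Illustrative, not the vendor's code.)
function sendTrackingData() {
  // In the real page this would issue an asynchronous XHR with the
  // tracking payload.
}

function onBeforeUnloadHandler() {
  sendTrackingData();
  const start = Date.now();
  while (Date.now() - start < 750) {
    // artificial spin loop: hold the unload for 750ms
  }
}
```

The cost of this workaround is that every page transition on the site pays a fixed 750ms penalty, which is exactly the impact we see in the diagnostics session.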


When you are responsible for a website that is going to see high user volume from a marketing campaign, you want to:

  1. Make sure the additional content for the marketing campaign is optimized, e.g. serve it from CDNs and follow the Best Practices for Web Performance Optimization
  2. Make sure you test your application with the campaign specific features under realistic load
  3. Analyze the impact of third-party tracking tools and other widgets you put on your page

Andreas Grabner has 20+ years of experience as a software developer, tester and architect and is an advocate for high-performing cloud-scale applications. He is a regular contributor to the DevOps community and a frequent speaker at technology conferences, and he regularly publishes articles. You can follow him on Twitter: @grabnerandi