Benchmarking

MyKeynote’s Benchmarking dashboard is home to Digital Performance Intelligence (DPI), Dynatrace’s performance analytics product for digital business owners, enabling in-depth, multi-dimensional analysis of digital (mobile, desktop, or tablet web) performance from the end user’s perspective.

DPI’s Competitive and Third-Party Competitive Benchmarking dashboard allows you to see how your site compares with your top competitors, with a complete breakdown of performance data into various first- and third-party components, plus benchmarking against Keynote industry-wide indices.

The information on this page is also the subject of a brief technical talk on DPI.

Accessing benchmarking

When you first access **Analytics > Benchmarking**, you are presented with a list of measurements from which you can select up to four to compare, including your own. All scripts, including competitor scripts, are defined by the DPI customer and then provisioned by Dynatrace for performance data collection.

Use the text filter to enter a search string for a measurement and select it from the list presented. Or you can expand the Dynatrace Keynote Indices or My Scripts sections to see the listed measurements and select them. When done selecting measurements, click **OK** to view the dashboard or **Reset** to start over.

Main features

When you land on the default Benchmarking dashboard, you can instantly see the graphical and numerical representation of your average performance in comparison to that of your competitors. The default time period is the last 24 hours.

The main features represented by the various Benchmarking dashboard widgets are:

  • Overall average performance comparison between you and your competitors for the time period chosen (Measurement Performance graph)
  • Performance over time (UX Time Trend graph)
  • Availability over time (Availability Trend graph)
  • The ability to compare competitors by regions of the world (select a region in the World Map widget to compare overall performance, performance and availability trending, component breakdown and performance)  
  • Comparison of third-party services by category (Third Party Component Performance graph)
  • Comparison of first-party content categories and performance sub-components (Page Component Performance graph)
  • The ability to see performance details for each first- or third-party category:
    • If you select a third-party service category, you can see:
      • Performance over time for just that category (Individual Component UX Time graph)
      • Average performance for the individual vendors in that category (Individual Component Average Performance graph)
    • If you select a first-party component, you can see:
      • Performance over time for just that component (Individual Component UX Time graph)
      • Average performance for that component in primary and third-party content (Individual Component Average Performance graph)
  • Performance over time for an individual third-party vendor or page component (Individual Component Performance Trend widget)

You can add up to four additional (five total) dashboards to facilitate comparison, e.g., for different sets of measurements or different time periods for the same measurements.

Additionally, there are filters to change the selected measurements and time period, among other options. Filters are discussed below.

Using dashboard graphs and data

The sections that follow detail how to use the Benchmarking dashboard graph widgets individually and together to compare your performance to your competitors’ effectively.

Overall performance

At the top of your dashboard, you will see figures for average performance times of the selected measurements over the time period chosen (the default is the last day). The Measurement Performance bar chart at top left breaks that information down by device type. By default, the performance average is based on data from all monitoring agents.

Hover over any of the bars in the bar chart to see the associated performance (user experience) time.

For any given time period and/or geographical region, you can see performance and availability trends—performance and availability values for individual data points over time.

The images below show performance times in seconds and availability computations (where success = 100 and failure = 0) for the hours 11 a.m. – 4 p.m. over a single day.

Some data points might be aggregated, showing availability values other than 0 or 100.
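
For example, if four samples fall into one plotted point and one of them failed, averaging yields 75. Here is a minimal sketch in Python, assuming an aggregated point is the mean of its per-sample scores (the exact aggregation method is not documented here):

```python
# Minimal sketch: availability of an aggregated data point as the mean of
# per-sample scores (success = 100, failure = 0). The averaging assumption
# is illustrative, not MyKeynote's documented computation.
samples = [100, 100, 0, 100]  # three successes and one failure in one bucket

availability = sum(samples) / len(samples)
print(availability)  # 75.0 -- between 0 and 100, as described above
```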


Click any measurement name in the legend to remove it from the graph; the removed measurement is grayed out in the legend. Hover over any data point to see the individual timestamp and UX time.

By geographical region

By default, monitoring agents in all regions of the globe are used to compute performance data. When you click a region in the World Map widget, the performance values are recalculated based on agents in that region. The colors on the World Map widget represent the fastest performing measurement (of all selected) for a given region. When you hover over a region, the performance times of the fastest measurement and the other measurements are displayed. Note that not all measurements might be monitored in the region you choose.
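
Conceptually, the recalculation filters the collected data points down to agents in the selected region and re-averages them; the map color then reflects the fastest of the selected measurements. A minimal sketch, with hypothetical field names (measurement, region, ux_time) standing in for data that MyKeynote does not expose directly:

```python
# Hypothetical data points; the field names are illustrative assumptions,
# not MyKeynote's actual data model.
data_points = [
    {"measurement": "MySite",     "region": "Europe", "ux_time": 3.2},
    {"measurement": "MySite",     "region": "Asia",   "ux_time": 4.8},
    {"measurement": "Competitor", "region": "Europe", "ux_time": 2.9},
]

def regional_average(points, measurement, region):
    """Average UX time using only agents in the chosen region."""
    times = [p["ux_time"] for p in points
             if p["measurement"] == measurement and p["region"] == region]
    # None signals that the measurement is not monitored in this region.
    return sum(times) / len(times) if times else None

# The region's map color represents the fastest of the selected measurements.
averages = {m: regional_average(data_points, m, "Europe")
            for m in ("MySite", "Competitor")}
fastest = min((m for m in averages if averages[m] is not None),
              key=averages.get)
print(fastest, averages)  # Competitor {'MySite': 3.2, 'Competitor': 2.9}
```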

Click Back to World View to return to the world map and reset values.

For the United States alone, you can drill down further into specific states to view comparative performance there. Click Back to Country View to return to the country map.

Performance by component

The Benchmarking dashboard offers competitive insights at the third-party and page component levels, allowing you to see how third-party vendor performance and your own page architecture affect performance.

Third-party performance

Dynatrace maintains a database of third-party services and shows their average performance by major category in the Third Party Component Performance graph. Not only can you see the categories of services used by your competitors (not all your competitors use all categories of third-party services), you can also see comparative performance for each category. Hover over any bar to see the average performance (UX) time for the time period and region chosen.

When you click a third-party category, the others are grayed out, and performance over time for all vendors in that category is shown in the Individual Component UX Time graph. The image below shows average performance and performance over time for social media. The trend chart clearly shows consistently slower performance for the Daily Mail.

Viewing individual vendors in the social media category can help explain whether a few vendors or many accounted for the differences in performance. Individual vendors (including consolidators and aggregators) are displayed in the Individual Component Average Performance graph. You can see how a particular vendor is performing compared to your competitors. The image below shows social media vendors and a marked difference in Facebook average performance across the measurements.

Select a vendor and check their performance over time in the Individual Component Performance Trend graph. If a vendor has higher performance times on your website than on your competitors’, is performance consistently slow, or do spikes account for the higher average performance time on your website? Could this vendor have contributed to the high performance time of the category as a whole?

The image below shows the performance trend for Facebook, with a consistently slower performance for The Mirror. 

Page components

Page content categories and performance sub-components are shown in the Page Component Performance graph. The average performance is listed for the following categories:

  • CSS time—the time taken to download CSS files
  • JS time—the time taken to download JavaScript files
  • Text time—the time taken to download text files
  • Image time—the time taken to download image files
  • Request time—network time for all HTTP requests (for text, image, JavaScript, CSS, and other element types) added together. Note that this is not the delta between the start of the first element download and the end of the last; it is the sum of each element’s network time (see the sketch after this list).
  • Redirect time—that part of request time spent in redirection (as opposed to successful download)
  • Element time—request time minus redirect time
  • Domain time—the time spent downloading elements from each domain; it totals the same as request time, but note that the domain count will not match the request count (which is usually higher)
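
To make these definitions concrete, here is a minimal sketch in Python; the element fields (network_time, redirect_time) are assumed names for illustration, not MyKeynote fields:

```python
# Illustrative element timings for one measured page (assumed field names).
elements = [
    {"type": "text",  "network_time": 0.40, "redirect_time": 0.10},
    {"type": "image", "network_time": 0.25, "redirect_time": 0.00},
    {"type": "js",    "network_time": 0.35, "redirect_time": 0.05},
]

# Request time is the SUM of each element's network time -- not the wall-clock
# span from the first element's start to the last element's end. Because
# elements download in parallel, this sum can exceed the page load time.
request_time = sum(e["network_time"] for e in elements)    # 0.40 + 0.25 + 0.35
redirect_time = sum(e["redirect_time"] for e in elements)  # 0.10 + 0.00 + 0.05
element_time = request_time - redirect_time                # request minus redirect
print(request_time, redirect_time, element_time)
```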

When you click a page component, the others are grayed out, and its performance over time is shown in the Individual Component UX Time graph. The image below shows average performance and performance over time for text elements. The average text performance time for The Mirror is well above its competitors’.

When you view a breakdown of text components by first- and third-party content in the Individual Component Average Performance graph, you can see clearly that third-party text content is significantly slower than primary domain text content for every measurement except The Sun.

When you view third-party text performance over time in the Individual Component Performance Trend graph, you can see that third-party text is consistently and significantly slower for The Mirror than for its competitors, possibly explaining the spike in average text performance time for The Mirror.

Other ways to view component performance

UX time is not the only way to view component performance. A set of filters at the top of the dashboard allows you to view third-party or page component count, error count, or element size.

If you choose to view component Count for the same measurements discussed above, you will see that The Mirror’s total text element count, third-party text count, and primary domain text count are all higher than its competitors’.

Similarly, an examination of third-party element count shows that The Mirror downloads fewer social media elements but more Facebook resources than its competitors. Facebook performs slower for The Mirror than for its competitors (see Third-Party Performance above).

Comparing to Keynote indices

Checking the Show Indices box at the top right adds data for the top 10 performing Keynote indices to all third-party and page component graphs. Black lines representing data for all websites monitored by Keynote indices are added to the third-party and page category graphs.

To compare to Keynote indices, you must:

  • Display at least a week’s worth of data.
  • Opt to view Overall performance; do not choose Time to First Paint or Time to Interactive.
  • View entire measurement data; do not view page-level data by page name or page number.

At a glance you can see how various components perform for you and your competitors versus the top 10 performing Keynote indices. When you click a specific third-party or page category, Keynote index data is also shown for the category UX trend graph and the category breakdown graph. In the image below, social media performs fairly slowly for Keynote indices in comparison to the measurements chosen. Performance over time shows very high periodic spikes in Keynote indices’ performance, possibly accounting for their slow average performance. Among vendors, Keynote indices are comparable to other measurements except in the case of Twitter and AddThis, which are faster for the news outlets chosen, an observation borne out by the UX trend charts for Facebook and Twitter. The spikes in Facebook performance in Keynote indices probably account for the spikes in the overall performance trend.

Graph tools

Dashboards

You can add up to four additional (five total) dashboards to facilitate various comparisons, e.g., different sets of measurements or different time periods for the same measurements. 

When you first navigate to Analytics > Benchmarking, you land on Dashboard 1. Use the “+” sign at top right to add more dashboards.

Double-click a dashboard name to edit it.

Filters

Controls for filtering data for some or all dashboard graphs are displayed at the top. For more screen space, you can hide or show the filter lists. Make your filter selection(s) and click Submit to see the resulting graphs.

  • Measurement selector – Type into the selection box to view a list of measurements that match your string.


  • Page name selector – If your transactions contain several pages, you can view page-level data for the measurements chosen. Note that only measurements containing the selected page name will show up for comparison in graphs below. Be aware when comparing by page that even if you are monitoring the same business transaction across several competitor sites, they might have different page structures.


  • Page number selector – When monitoring multi-page transactions, you can compare page-level data by page number. While this filter is useful when measurement pages are not named, be aware again that competitor measurements that track the same business transaction might not contain the same number of pages; page 3 of one script might correspond to page 4 of another. As a best practice, we recommend naming pages and filtering by page name so you can be sure of comparing like data.


  • Browser event selector – Use this filter to view performance at significant browser events:
    • Before Time to First Paint – when page elements are first displayed
    • Before Time to Interactive Page – when the page becomes fully interactive for the user

Use these controls to see how sites are constructed and when different types of elements are loaded. Even though your competitor might have slower overall performance than you, their time to first paint might be faster, creating the impression of better performance. Perhaps all you need to do to improve your perceived performance relative to your competitors’ is to rearrange how some page components are loaded.

  • Location selector – Choose a country to view data for. You can achieve the same effect by clicking a location in the World Map widget.


  • Component performance criteria – Use this selector to view third-party or page component performance by time, count, error count, or element size (see Other Ways to View Component Performance above). This filter applies to component graphs only:
    • Third Party Component Performance graph
    • Page Component Performance graph
    • Individual Component UX Time graph
    • Individual Component Average Performance graph
    • Individual Component Performance Trend graph

  • Time range selector – Select from preset time ranges or define your own. Applies to all graphs.


  • Bucket selector – Select the unit of time by which data should be aggregated. 
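
For instance, with a one-hour bucket, all data points whose timestamps fall within the same hour are aggregated into a single plotted value. A minimal sketch of that grouping, with invented sample values:

```python
from collections import defaultdict
from datetime import datetime

# (timestamp, UX time) samples; the values are invented for illustration.
points = [
    (datetime(2023, 5, 1, 11, 5),  3.1),
    (datetime(2023, 5, 1, 11, 40), 3.5),
    (datetime(2023, 5, 1, 12, 10), 2.8),
]

# Group samples into one-hour buckets by truncating each timestamp to the hour.
buckets = defaultdict(list)
for timestamp, ux_time in points:
    buckets[timestamp.replace(minute=0, second=0, microsecond=0)].append(ux_time)

# Each bucket is plotted as a single aggregated (here, averaged) value.
for hour, values in sorted(buckets.items()):
    print(hour, sum(values) / len(values))  # 11:00 -> 3.3, 12:00 -> 2.8
```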


Each graph has controls at the top right that allow you to print or download the image in various formats.