Report metrics

Alerts sent

The number of Backbone alerts generated within the given time period. Critical alerts are those with a Severe status.

Availability (step)

Also called Step Success Rate. The percentage of steps conducted during a specific period of time that completed successfully. Calculated as:

(Number of Successful Steps) / (Number of Successful Steps + Number of Failed Steps)

Availability (test)

Also called Test Success Rate. The percentage of tests conducted during a specific period of time that completed successfully. Calculated as:

(Number of Successful Tests)/(Number of Successful Tests + Number of Failed Tests)
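
Both availability formulas follow the same pattern, so a single sketch covers them. This is an illustrative helper (the function name and the choice to return 0% for an empty period are assumptions, not part of the report's definition); note the ratio is multiplied by 100 to express it as a percentage:

```python
def availability(successes: int, failures: int) -> float:
    """Success rate as a percentage, for either steps or tests:
    100 * successes / (successes + failures)."""
    total = successes + failures
    if total == 0:
        return 0.0  # assumption: report 0% when nothing ran in the period
    return 100.0 * successes / total

# e.g. 95 successful tests and 5 failed tests give 95.0% availability
```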

Average daily allowance (usage)

The average number of units/day one would need to consume to stay within purchased measurements. The value is calculated as plan measurements divided by the number of days in all of the months for the term.

Connect time

A network component, the time (in seconds) that it takes to connect to a web server across a network. This provides an excellent measure of the network round-trip delay due to network traffic.

Connections

The number of TCP connections opened between the agent and the IP addresses returned by the DNS lookup for the hosts. Typically, each host can have multiple connections.

Content time

A network component, the time (in seconds) required to receive the content of a page or page component, starting with the receipt of the first content and ending with the last packet received.

Daily available to meet plan (usage)

The average number of units an account can use daily to finish the plan term without overage. This metric is calculated once daily by dividing the measurements remaining at the start of the current day by the number of days remaining in the month (including the current day). This metric is not reported for past months.
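
The two daily-allowance calculations above can be sketched as follows. Function and parameter names are illustrative, not an actual API; the only logic taken from the definitions is the two divisions:

```python
import calendar

def average_daily_allowance(plan_measurements, term_months):
    """Plan measurements divided by the number of days in all of the
    months for the term. term_months is a list of (year, month) pairs."""
    total_days = sum(calendar.monthrange(y, m)[1] for y, m in term_months)
    return plan_measurements / total_days

def daily_available_to_meet_plan(measurements_remaining, days_left_in_month):
    """Measurements remaining at the start of the current day divided by
    the days remaining in the month, including the current day."""
    return measurements_remaining / days_left_in_month

# e.g. a 31,000-measurement plan covering only January (31 days) allows
# an average of 1,000 measurements per day
```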

DNS time

A network component, the time (in seconds) it takes to translate the host name into the IP address, a lookup often performed by a third party. DNS response times that are consistently longer than two seconds typically indicate that one of the DNS servers is not responding.

Domain

Also called the host. A unique name that identifies a website (for example, mywebsite.com).

Failed objects

An object download fails when the agent is unable to download the specified object for one of the following reasons:

  • The agent connected to the server that the object allegedly resides on but was unable to find the object

  • The agent identified the server but could not connect to it

  • The agent could not find the server because the DNS lookup failed

A test may be reported as successful even though one or more objects failed.

End of month usage projection

The End of Month Usage Projection is calculated by using the previous day’s usage and projecting that level of usage for the current day and the rest of the month. Usage projections are not reported for past months.
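
The projection described above amounts to repeating the previous day's usage for today and every remaining day of the month. A minimal sketch, with illustrative parameter names:

```python
def end_of_month_projection(month_to_date_usage, previous_day_usage,
                            days_remaining_including_today):
    """Project the previous day's usage level forward over the current
    day and the rest of the month, added to usage already consumed."""
    return month_to_date_usage + previous_day_usage * days_remaining_including_today

# e.g. 10,000 units used so far, 500 used yesterday, 10 days left
# (including today) projects to 15,000 units by month end
```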

Failed steps

The total number of steps during the time period that did not complete successfully. A step is considered "failed" when any of these conditions exist:

  • No objects were downloaded with an HTTP response status code of 200 (successful)

  • Content match failure

  • Byte limit failure

  • User script failure

  • Timeout exceeded
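
The conditions above combine as a simple any-of predicate. This is a sketch only; the parameter names are illustrative, not an actual agent API:

```python
def step_failed(successful_200_objects, content_match_ok,
                within_byte_limit, script_ok, timed_out):
    """A step is failed when any one of the listed conditions holds:
    no 200-status objects, a content match failure, a byte limit
    failure, a user script failure, or an exceeded timeout."""
    return (successful_200_objects == 0
            or not content_match_ok
            or not within_byte_limit
            or not script_ok
            or timed_out)
```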

Failed tests

The total number of tests during the time period that did not complete successfully. A test is considered “failed” when a step within the test fails.

First (1st) byte time

A network component, the time it takes to receive the first byte of the page HTML, graphic object, or other web component after the TCP connection is completed. Overloaded web servers often have a long first byte time.

Host

The URL associated with the web server that hosts content accessed by the tested web page. Host metrics can combine data from multiple IP addresses for the host.

HTTP 200-298 (successful object)

The number of HTTP response status codes from 200 through 298 returned during the selected time frame. These codes indicate that the action requested by the client was received, understood, accepted, and processed successfully.

HTTP 299

The HTTP response status returned by the Browser Agent when the browser aborts an object request. This does not indicate that the object download failed. Instead, it indicates a successful object request was discarded by the browser because of the page behavior at the time the request was aborted.

HTTP 2XX

The number of HTTP response status codes from 200 through 298 (excluding status code 299) returned during the selected time frame. These codes indicate that the action requested by the client was received, understood, accepted, and processed successfully. The number presented is an arithmetic average per test across all test executions.

HTTP 300-399

The number of HTTP response status codes from 300 through 399 returned during the selected time frame. These codes indicate that the client must take additional action to complete the request.

HTTP 3XX

This metric is sometimes called 300 Objects. The number of HTTP response status codes from 300 through 399 returned during the selected time frame. This is the (arithmetic) average number per test across all test executions. A status code of 3xx indicates that the client must take additional action to complete the request.

HTTP 400+

The total number of response status codes 400 or higher returned during the selected time frame. These include client, server, network, internal, and timeout errors.

KB

Total number of kilobytes downloaded from the initial request until the last connection closes. The metric in this report is the (arithmetic) average per test across all test executions. Calculated as:

(Bytes Downloaded) / 1024
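
Since the reported KB metric is the per-test average, the calculation is a byte-to-kilobyte conversion followed by a mean. An illustrative sketch (the function name is an assumption):

```python
def average_kb_per_test(bytes_per_test):
    """Apply (Bytes Downloaded) / 1024 to each test execution, then
    take the arithmetic average across all executions."""
    return sum(b / 1024 for b in bytes_per_test) / len(bytes_per_test)

# e.g. two executions downloading 102,400 and 204,800 bytes average 150 KB
```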

Location

Where a particular test was run. Depending on the test type, this can be called a site, node, or peer population.

Measurements remaining (usage)

Total number of measurements remaining in the current plan term. This metric is not reported when reporting on past months.

Month to date allowance (usage)

The number of measurements that would be consumed if usage were equal to the Average Daily Allowance on each day.

Month to date usage

The number of measurements consumed during the current month. Measurement units are recorded hourly, so the Month to Date Usage will vary depending upon when the report is run. If usage is higher than the allowance, the usage will be reported in orange text. If usage is higher than the total plan measurements, the usage will be reported in red text and the overage will be reported.
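
The color thresholds described above reduce to two comparisons. A hypothetical helper, not an actual report API; the "black" default for normal usage is an assumption:

```python
def usage_text_color(usage, month_to_date_allowance, plan_measurements):
    """Pick the text color for reported usage: red above total plan
    measurements (overage is also reported), orange above the
    month-to-date allowance, default otherwise."""
    if usage > plan_measurements:
        return "red"
    if usage > month_to_date_allowance:
        return "orange"
    return "black"  # assumption: normal usage renders in the default color
```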

Monthly measurement trend (usage)

The Monthly Measurement Trend tables report usage for past months. If the report is run before the plan end date, this table will include usage, average allowance, overage units (if any), and overage cost for each month in the plan, up to 12 months. If the report is run outside of the plan term start and end dates, a maximum of 12 months will be reported, but there will be no allowance or overage metrics.

Network component metrics

Metrics that provide visibility into how the network time is spent in order to provide more information about the nature of time needed to download a web page. Times for each component are calculated in the following manner: every individual test execution sums the total time for a particular component within that test execution, inclusive of all objects and connections. An average of those sums is then charted over the time breakdown requested. Network Components include DNS Time, Connect Time, SSL Time, First Byte Time (1st Byte Time), and Content Time.
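
The sum-then-average rule above can be sketched directly. Names are illustrative; the only logic taken from the definition is summing one component's times within each test execution and then averaging those sums:

```python
def network_component_average(per_test_times):
    """per_test_times holds one inner list per test execution, containing
    a single component's times (e.g. Connect Time) for every object and
    connection in that execution. Sum within each test, then average
    the per-test sums across all executions."""
    per_test_sums = [sum(times) for times in per_test_times]
    return sum(per_test_sums) / len(per_test_sums)
```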

Objects

An object is a single downloaded file such as HTML page source, a GIF image, a Java application, or a response status code header.

In reports, the Total Objects metric comprises Successful Objects (response codes 200-298), Failed Objects (response codes 400 to 20099), and objects that were partially downloaded but the test initiated the next navigation before the download was complete (response code 299). This metric is the (arithmetic) average of the count in successful test executions during the time period. Object counts from failed steps are not included in the averages.

In interactive charts, the Total Objects metric comprises Successful Objects (response codes 200-298), 300 Objects, Failed Objects (response codes 400 to 20099), and objects that were partially downloaded but the test initiated the next navigation before the download was complete (response code 299). This metric is the (arithmetic) average of the count in successful test executions during the time period.

Occurrence

Percent of total successful steps that accessed content from this host.

Overage cost/unit (usage)

Cost of each additional overage unit. This may be calculated with higher decimal precision than is displayed on the report.

Overage units (usage)

The number of measurement units consumed in excess of the allowed measurements for the term length.

Page composition metrics

A set of metrics that indicate the complexity of a given web page. These include Hosts, Connections, Objects, and KB downloaded.

Plan end (usage)

Contract end date as recorded in the Usage system. Often this is reported as of the end of a month.

Plan measurements (usage)

Total number of measurements that can be consumed over the specified term length. Plan measurements comprise base measurements plus promotional measurements.

Plan start (usage)

Contract start date as recorded in the Usage system. Often this is reported as of the start of a month.

Report time frame (usage report)

All usage time references are reported in Coordinated Universal Time (UTC).

Response time

For full-object tests, and for Firefox and Chrome no-object tests: the time, as measured in seconds, from when a user clicks on a link to when the content is completely downloaded. This includes the time to collect all objects on all steps of the test, including graphics, frames, third-party content from offsite servers, and redirection.

For Internet Explorer no-object tests: the time, as measured in seconds, from when a user clicks on the link to when the root object is downloaded.

Response time average

The arithmetic mean for all successful tests or steps in the selected time period.

Response time distribution

The percentage of test executions that have response times within the given response time frames.

Response time (domain/host)

The time, as measured in seconds, to download all of the objects from this location.

Response time maximum

The longest response time for all successful tests in the selected time period.

Response time median

The middle value for response time for all successful tests/steps in the selected time period. If there is an even number of test executions, it is the average of the two middle numbers.
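
The median rule above, including the even-count case, can be written as a short helper (illustrative name, not an actual report API):

```python
def response_time_median(times):
    """Middle value of the sorted response times; for an even number of
    executions, the average of the two middle values."""
    s = sorted(times)
    n = len(s)
    mid = n // 2
    if n % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

# e.g. [3, 1, 2] has median 2; [4, 1, 3, 2] has median 2.5
```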

Response time minimum

The shortest response time for all successful tests in the selected time period.

Response time (percentile)

The response time below which X% of the response time measurements can be found.
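
One common way to compute such a percentile is the nearest-rank method, sketched below. The report's exact interpolation method is not specified here, so treat this as an assumption:

```python
import math

def response_time_percentile(times, pct):
    """Nearest-rank percentile: the smallest response time t such that
    at least pct% of the measurements are <= t."""
    s = sorted(times)
    rank = max(math.ceil(pct / 100 * len(s)), 1)
    return s[rank - 1]

# e.g. for measurements 1..10, the 90th percentile is 9
```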

Service level thresholds

Values entered by the user or calculated automatically by the report, used to categorize performance metrics as Good, Warning, or Severe.

SSL time

A network component, the time (in seconds) it takes the client to send a connection request to the server, the server to send the signed certificate, and the client to complete the handshake with the server. When the machines that provide SSL termination at your website are overloaded, SSL times will increase.

Successful steps

Also called Valid Steps. The number of steps that did not fail during the selected time period.

Successful tests

Also called Valid Tests. The number of tests that did not fail in the selected time period.

Term length (usage)

Number of months over which purchased measurements can be used.

Term to date allowance (usage)

The number of measurements that would be consumed if usage were equal to the Average Daily Allowance on each day of the term to date.

Term to date usage

The number of measurements consumed during the current term. Measurement units are recorded hourly, so usage for the current month will vary depending upon when the report is run. If usage is higher than the allowance, the usage will be reported in orange text. If usage is higher than plan measurements, the usage will be reported in red text and the overage will be reported.

Test status

Whether the test is currently active or inactive.

Test type

The kind of testing location. Options are Backbone nodes, Mobile nodes, and Last Mile or Private Last Mile peers.

Throughput

Rate of content delivery from a given host. Calculated as:

(kilobytes delivered) / (response time)
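
A one-line sketch of the formula above (the function name and unit comment are illustrative):

```python
def throughput_kbps(kilobytes_delivered, response_time_seconds):
    """Rate of content delivery from a host, in kilobytes per second:
    (kilobytes delivered) / (response time)."""
    return kilobytes_delivered / response_time_seconds

# e.g. 500 KB delivered in 2 seconds is a throughput of 250 KB/s
```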

Total steps

The number of steps executed in the selected time period.

Total tests

The number of tests executed in the selected time period. Total Tests is equal to Valid Tests plus Failed Tests.

Type

The kind of synthetic test being run: Backbone, Mobile, Last Mile (LM), or Private Last Mile (PLM).

Valid steps

The number of steps that completed successfully during the selected time period.

Valid tests

The number of tests that completed successfully in the selected time period.

XF usage

The XF measurements used by the account for the reported item during the reporting period. For example, XF usage in the Test usage report is the XF measurements used by each test in the account that was executed during the 24-hour period before the report was generated.