The Operational Summary gives you insight into your web application’s performance: average availability, average response time, number of pages in the measurement, and number of failed tests out of the total test runs in a given time period.
Each test is identified by its product and browser:
- Product
  - Transaction Perspective
  - Application Perspective
  - Mobile Web Perspective
- Browser
  - Internet Explorer
  - Mobile (MWP only)
You can sort the table by clicking a column head.
Performance status at a glance
Color-coded status icons let you quickly identify the tests that are having performance issues and their severity:
Default thresholds for determining status are:

Availability:
- Good: ≥ 98%
- Warning: ≥ 95% and < 98%
- Severe: < 95%

Response time:
- Good: ≤ 4000 ms
- Warning: > 4000 ms and ≤ 7000 ms
- Severe: > 7000 ms
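The default thresholds above can be sketched as two small classification functions. This is a minimal illustration; the function names are invented here, while the cut-off values are the documented defaults.

```python
def availability_status(pct: float) -> str:
    """Classify average availability (percent) using the default thresholds."""
    if pct >= 98:
        return "Good"
    if pct >= 95:
        return "Warning"
    return "Severe"


def response_time_status(ms: float) -> str:
    """Classify average response time (milliseconds) using the default thresholds."""
    if ms <= 4000:
        return "Good"
    if ms <= 7000:
        return "Warning"
    return "Severe"
```

Note that the boundary values fall on the "better" side: an availability of exactly 98% is Good, and a response time of exactly 4000 ms is Good.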
Drill down from the Operational Summary view to analyze a test that shows performance issues.
Expand a row to display a chart of the test’s response time and availability for the Operational Summary’s time range.
Hover over a data point to see its timestamp and performance values.
Click Response time or Availability in the chart legend to remove that metric from the chart.
Click View in charts to view the same data in a Performance Trend Graph for the same time period.
Clicking the Errors number for a test in the Operational Summary opens the Error Analysis view, filtered to show errors for that test only.
Error analysis gives you a granular, interactive view of error types and counts, and error distribution over agent locations.
Refining the data display
The default view lists all the tests for your account, and displays performance data for the selected time range.
Select the time range
Click the icon and select a new custom or preset time range.
Preset ranges are relative to the current time and vary from the last 1 hour to the last year, with additional options for the current calendar day (Today) or previous calendar day (Yesterday).
You can specify a custom relative time range: the number of minutes, hours, or days to look back, either from the current time or from a point a specified number of minutes, hours, or days before the current time.
For example, suppose you configure the relative time range as follows:
- Time range – 6 Hrs
- Ending – Custom
- Shift to past – 30 Min
At 12:40, the dashboard displays data from 6:10 through 12:10.
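The arithmetic behind this example can be sketched in a few lines. The function name and the calendar date are illustrative; only the times of day come from the example above.

```python
from datetime import datetime, timedelta


def relative_window(now: datetime, lookback: timedelta,
                    shift_to_past: timedelta = timedelta(0)):
    """Return (start, end) for a relative time range.

    The end of the window is the current time shifted into the past;
    the start is the lookback period before that end.
    """
    end = now - shift_to_past
    return end - lookback, end


# Reproduces the example: at 12:40, a 6-hour range shifted 30 minutes
# into the past covers 6:10 through 12:10.
now = datetime(2024, 1, 1, 12, 40)
start, end = relative_window(now, timedelta(hours=6), timedelta(minutes=30))
```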
You can also specify absolute start and end dates and times in five-minute intervals.
The custom time you select is your local time. However, all times in the portal are displayed in the time zone configured for your account. For example, if your machine is in the Pacific Time zone but the account time zone is Eastern Time (three hours ahead), when you select 14:00 and 16:00 as the custom range, the displayed time range is 17:00 through 19:00.
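The conversion in the example above can be illustrated with Python's standard `zoneinfo` module. The zone names and the date are assumptions chosen to match the Pacific/Eastern example.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Your machine's local zone and the account's configured zone (assumed
# here to be Pacific and Eastern, as in the example above).
local_zone = ZoneInfo("America/Los_Angeles")
account_zone = ZoneInfo("America/New_York")

# Selecting 14:00 local time on your machine...
selected_start = datetime(2024, 6, 3, 14, 0, tzinfo=local_zone)

# ...is displayed in the account time zone, three hours ahead: 17:00.
displayed_start = selected_start.astimezone(account_zone)
```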
When you change the time range of the Operational Summary, it is applied to the Error Analysis view as well. The time range selected when you log out is applied the next time you log in.
Filter the test list
Click the filter icon at the top left of the view to display the filter pane.
You can filter by any combination of criteria:
- Test name—Type a string in the Filter tests field at the top of the pane to list only the tests that have that string in their names.
- Status—Click one or more severity levels to display only tests with those statuses. For example, you can display only Severe tests, or all Warning and Severe tests. See Performance status at a glance above for an explanation of status thresholds.
- Product—Click to display TxP, MWP, ApP, or any combination of these service types.
- Browser—Click one or more browser types. (The Mobile browser type is for MWP tests only.)
To remove a filter, clear the Filter tests field, or click a selected filter to deselect it.
Sharing your Operational summary view
Click the share icon at the top-right corner to display a URL that you can share with other logged-in users. This URL includes parameters for the time range you've selected.