A while ago I wrote about measuring user experience, or customer experience, by applying the User Experience Index as the metric to monitor in today’s digital performance world. To obtain the right insight we must consider the following key ingredients:
- Response Times
- User behavior
- User environment
Defining response time thresholds seems straightforward at first. People have different expectations of response times: many initially feel that two seconds for a page load is good, and that anything beyond eight seconds creates frustration for the user. But if we examine how new web applications are built, and how things have changed with the digital transformation, this rule of thumb no longer holds. Let’s take a look at the following two scenarios.
Scenario A: The “Generate Report” Button
Which response time will satisfy someone who clicks on a button that generates a report containing a list of 10,000 customers with their monthly revenue from the past two years? It’s unlikely this person is expecting a response time of two seconds or ten seconds. Perhaps they are satisfied waiting as long as 30 seconds?
Scenario B: The Search Suggestions
When typing a word or term into a search field, after the first few letters people stop and wait for the search suggestion/auto-complete box to appear. What do they consider to be an acceptable wait time? How long do people wait before they continue typing? Two seconds? In this instance, people expect a response time of 100 ms or less, and become very frustrated if the suggestions only appear after two seconds or more.
In general, we can define four performance categories:
- Instant (0.1-0.2 sec)
- Immediate (0.5-1 sec)
- User Flow (2-5 sec)
- Attention Span (5-10 sec)
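The four categories above can be sketched as a simple classifier. This is only an illustration: the function name is hypothetical, and since the listed ranges leave gaps (e.g. between 0.2 and 0.5 sec), the sketch assumes each category extends up to the start of the next one.

```python
def classify_response_time(seconds: float) -> str:
    """Map a measured response time (in seconds) to the perceived-
    performance categories listed above. Boundary values between the
    listed ranges are an assumption, not part of the original list."""
    if seconds <= 0.2:
        return "Instant"
    if seconds <= 1.0:
        return "Immediate"
    if seconds <= 5.0:
        return "User Flow"
    if seconds <= 10.0:
        return "Attention Span"
    return "Beyond Attention Span"

print(classify_response_time(0.15))  # Instant
print(classify_response_time(3.0))   # User Flow
print(classify_response_time(12.0))  # Beyond Attention Span
```

With a classifier like this, the "Generate Report" and "Search Suggestion" scenarios above clearly land in different categories, which is the core of the argument: one global threshold cannot rate both fairly.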
Now let’s focus on how this impacts our performance monitoring. For those of you who wish to take a deeper dive into perceived performance, I recommend this article series by Denys Mishunov.
User Experience Index upgraded for better Customer Experience Management!
Knowing this, do we run into the issue of needing to define a different threshold for each user action/page load/navigation? In environments with thousands of different user interactions this is not practical. Within Dynatrace User Experience Management we address this by using User Action Groups as well as Key Action Monitoring.
What are User Action Groups?
User action groups are basically “buckets” for similar user actions/web pages. In this case “similar” refers to the response time a user expects. This allows us to say, for the “Generate Report” case, that we expect a response time of 30 seconds and consider this to be a good experience. For the “Search Suggestion” case, we can define a threshold as low as 100 ms, and for the product detail pages we maintain the two-second threshold. These defined thresholds create the basis for the calculation of the user experience of each visitor, and allow greater accuracy in determining whether someone has a satisfying or frustrating digital experience.
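A minimal sketch of what per-group thresholds make possible, using the three example groups from the text. The group names and threshold values come from the scenarios above; the satisfied/tolerating/frustrated split with a 4x multiplier is an Apdex-style assumption for illustration, not Dynatrace’s actual rating logic.

```python
# Illustrative "satisfied" thresholds per user action group,
# in seconds, taken from the examples in the text.
GROUP_THRESHOLDS = {
    "Generate Report": 30.0,
    "Search Suggestion": 0.1,
    "Product Detail": 2.0,
}

def rate_experience(group: str, response_time: float) -> str:
    """Rate a single user action against its group's threshold.
    The 4x 'tolerating' band is an Apdex-style assumption."""
    threshold = GROUP_THRESHOLDS[group]
    if response_time <= threshold:
        return "satisfied"
    if response_time <= 4 * threshold:
        return "tolerating"
    return "frustrated"

print(rate_experience("Generate Report", 25.0))   # satisfied
print(rate_experience("Search Suggestion", 0.5))  # frustrated
print(rate_experience("Product Detail", 3.0))     # tolerating
```

The same 25-second response that satisfies a report generation would be rated "frustrated" for a search suggestion, which is exactly why one global threshold fails.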
How else can we leverage User Action Groups?
We use Dynatrace UEM to monitor the WordPress instance that hosts this blog. I came up with a best practice: define a User Action Group per publishing year of blog posts. This segmentation allows me to easily determine how much traffic is driven by current blog posts versus blog posts from previous years.
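The per-year grouping could be derived directly from the permalink, since WordPress date-based permalinks embed the publishing year. A small sketch, assuming a `/YYYY/MM/slug/` permalink structure (the URL format and function name are assumptions, not how Dynatrace groups actions internally):

```python
import re

def action_group_for_blog_url(url: str) -> str:
    """Derive a user action group from a WordPress-style permalink
    that embeds the publishing year, e.g. /2016/05/some-post/.
    URLs without a year fall into a catch-all group."""
    match = re.search(r"/(20\d{2})/", url)
    if match:
        return "Blog posts " + match.group(1)
    return "Other pages"

print(action_group_for_blog_url("https://example.com/2016/05/uem-post/"))  # Blog posts 2016
print(action_group_for_blog_url("https://example.com/about/"))             # Other pages
```

Bucketing traffic with a rule like this is what makes the current-year vs. previous-years comparison a one-line report instead of a manual exercise.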
An eCommerce platform could separate its monitoring by product category, observe whether people have different experiences depending on the category, or determine which product categories are requested most often. Overall, it offers new options for focused observation and reporting.