Content quality, mobile readiness and site performance are the three key pillars of search engine ranking. I want to examine the performance impact here, because it is easy to measure and Google actively promotes it; Google is even experimenting with a "Red Slow Label" to warn users about poorly performing sites. But have you, the search engine optimization (SEO) folks, or anyone else in your organization ever looked at what the performance experience of the Google, Baidu, Bing and Yandex bots looks like? In the past we all assumed that our own experience is what the bots experience, but can't we do better in these days of Real User Monitoring? Let me show you how I looked at the problem for our own blog website.

Google Search Results: testing the red slow label on the search result list

Which bots are crawling and what is their experience?

Our blog is instrumented with Dynatrace User Experience Management (UEM), which gives us full Real User Monitoring visibility. Interestingly, it captures data not only from real visitors but also from search engine robots and synthetic monitoring robots. The screenshot below shows that 8% of the traffic comes from search bots and more than 50% from synthetic robots, but let's leave the synthetic traffic aside for the moment (I will cover it in an upcoming blog). For now, let's focus on the bot experience!

How much traffic is Google traffic vs Real Users traffic?
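How does a monitoring tool tell these groups apart? Dynatrace UEM does the classification for us, but conceptually most of it starts with the user agent string. Below is a minimal, illustrative sketch in Python; the bot signatures and the classify_visitor helper are my own simplified assumptions, not Dynatrace's actual detection logic (real products also use JavaScript execution, IP verification and other signals).

```python
import re

# Illustrative signatures only -- real RUM products combine many more signals.
# Order matters: check the more specific Googlebot-Mobile before Googlebot.
SEARCH_BOT_PATTERNS = {
    "Googlebot-Mobile": re.compile(r"Googlebot-Mobile", re.I),
    "Googlebot": re.compile(r"Googlebot", re.I),
    "Bingbot": re.compile(r"bingbot", re.I),
    "Baiduspider": re.compile(r"Baiduspider", re.I),
    "YandexBot": re.compile(r"YandexBot", re.I),
}

# A few well-known synthetic monitoring agents (again, just examples).
SYNTHETIC_PATTERNS = [re.compile(p, re.I) for p in ("Pingdom", "UptimeRobot", "GTmetrix")]


def classify_visitor(user_agent: str) -> str:
    """Bucket a request into 'search bot: <name>', 'synthetic' or 'real user'."""
    for name, pattern in SEARCH_BOT_PATTERNS.items():
        if pattern.search(user_agent):
            return f"search bot: {name}"
    if any(p.search(user_agent) for p in SYNTHETIC_PATTERNS):
        return "synthetic"
    return "real user"


print(classify_visitor(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # -> search bot: Googlebot
```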

Focusing on the search robot segment, we learn that the Googlebot is the most active bot on our websites, although the Bing, Baidu, Yandex and other bots are present as well.

Which Search Engines are crawling and what is their experience?

Looking at the Google bot traffic, two things caught my attention. First, most of the traffic comes from North America and, second, several different Google bots are crawling, including Googlebot-Mobile. The mobile bot is probably checking the mobile friendliness of your website, which raises the question: what is the difference between them? I will answer this in my follow-up blog.

Different Google bots from different locations are crawling your website: are they behaving differently?
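A short aside before moving on: anyone can put "Googlebot" into a user agent string, so Google recommends verifying a claimed crawler with a reverse and forward DNS check, where the reverse lookup of the IP must end in googlebot.com or google.com and the forward lookup of that hostname must resolve back to the same IP. Here is a minimal sketch of that check; the example IP address is illustrative only.

```python
import socket


def is_genuine_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot the way Google documents it:
    reverse DNS ends in googlebot.com/google.com and the forward
    lookup of that hostname resolves back to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)              # reverse lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]      # forward confirmation
    except (socket.herror, socket.gaierror):
        return False


# Example call; the address below is purely illustrative.
print(is_genuine_googlebot("66.249.66.1"))
```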

Which pages are crawled and what performance does the bot experience for each of them?

Now that we know the bots are on our site, it is interesting to see which pages the bot is crawling, how often each page was visited, and the response time for each blog article/page. The table below gives me this information and also indicates whether there were errors on those pages.

Pages crawled by the Google Bot with key metrics: Count, Failure Rate and Response Time
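If you do not have a RUM product building this table for you, you can derive the same metrics from your web server access logs. The sketch below assumes the bot requests have already been parsed into simple dicts with a URL, an HTTP status and a response time in milliseconds; those field names are my own assumptions, not a Dynatrace export format.

```python
from collections import defaultdict
from statistics import mean


def summarize_bot_pages(requests):
    """Aggregate bot requests into per-page metrics: crawl count,
    failure rate (HTTP status >= 400) and average/max response time.

    `requests` is an iterable of dicts such as
    {"url": "/blog/...", "status": 200, "response_time_ms": 320}.
    """
    pages = defaultdict(list)
    for r in requests:
        pages[r["url"]].append(r)

    summary = []
    for url, hits in pages.items():
        times = [h["response_time_ms"] for h in hits]
        failures = sum(1 for h in hits if h["status"] >= 400)
        summary.append({
            "url": url,
            "count": len(hits),
            "failure_rate": failures / len(hits),
            "avg_response_ms": mean(times),
            "max_response_ms": max(times),
        })
    return summary
```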

By reordering the table by max response time, I immediately get the list of pages that require further investigation from a performance perspective.

Sorted by Response Time or Failure Rate gives you those that you should investigate first
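Continuing the sketch above, reordering is just a sort on whichever metric you want to triage by, much like reordering the table in the UI (the sample data below is made up):

```python
# A tiny sample in the shape summarize_bot_pages() expects (values are made up).
bot_requests = [
    {"url": "/blog/post-a", "status": 200, "response_time_ms": 320},
    {"url": "/blog/post-a", "status": 500, "response_time_ms": 2100},
    {"url": "/blog/post-b", "status": 200, "response_time_ms": 180},
]

summary = summarize_bot_pages(bot_requests)

# Worst offenders first, by maximum response time or by failure rate.
slowest_pages = sorted(summary, key=lambda p: p["max_response_ms"], reverse=True)
most_failures = sorted(summary, key=lambda p: p["failure_rate"], reverse=True)

for page in slowest_pages[:3]:
    print(f"{page['url']}: max {page['max_response_ms']} ms, "
          f"{page['failure_rate']:.0%} failures over {page['count']} crawls")
```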

Conclusion:

Robots in general, and search engine bots in particular, are looking at our websites day and night. If you are trying to improve your SEO ranking, as we are, you should definitely focus on improving your website performance. With today's Real User Monitoring solutions we can measure and validate the performance bots experience during a crawl, and identify which pages require immediate attention because they are slower than the rest.