Managing search bot experience for improved SEO

We all know that appearing at the top of search engine results has an important business impact. But fewer of us know which techniques are actually used for Search Engine Optimization (SEO). Google's algorithm is a well-kept secret that, if made public, would immediately be gamed.

There is a plethora of documentation on the importance of keywords and link management. Much less is written about the third pillar: performance as a measure of user experience.

Let's have a look at Google's moves to foster user experience, focusing on the performance dimension:

  • In early 2010, Google announced that website speed would have an impact on search ranking
  • In early 2015, Google was spotted testing performance as a ranking signal


Do you also see a pattern?

A question I often get during those discussions is, “why would Google increase the weight of performance in search rankings?” My short answer: as with mobile friendliness, to give users a better experience by steering them toward higher-performing websites!

We’ve all seen the graphs where conversion rates drop as users have to wait longer, right?


In essence, Google's bots don’t behave exactly like real users, but they try to capture and assess what a good experience will look like for the rest of us.

The next question then is, “can I measure Google bots’ experience on my websites?” Most analytics tools filter out bot activity, since bots aren’t real users transacting. Dynatrace captures and keeps it all.


You can see which bots hit your systems, when, and which pages they crawl. Armed with those analytics, you can focus your SEO optimization efforts instead of the carpet-bombing approach too often taken.
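Even without a dedicated tool, you can get a first approximation of bot activity from your web server's access logs. The sketch below is a minimal, hypothetical example: it assumes combined-log-format lines and matches a few well-known crawler User-Agent substrings, counting hits per bot and per page. The sample log lines and the `BOT_SIGNATURES` list are illustrative assumptions, not an exhaustive inventory of crawlers.

```python
import re
from collections import Counter

# Hypothetical sample of access-log lines (combined log format).
# In practice you would read these from your real web server log file.
LOG_LINES = [
    '66.249.66.1 - - [10/Mar/2015:10:00:00 +0000] "GET /products HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '40.77.167.1 - - [10/Mar/2015:10:01:00 +0000] "GET /pricing HTTP/1.1" '
    '200 1024 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '203.0.113.5 - - [10/Mar/2015:10:02:00 +0000] "GET /products HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (Windows NT 10.0) Chrome/40.0"',
]

# Substrings that identify some common search-engine crawlers in the
# User-Agent header. Extend this list for your own traffic.
BOT_SIGNATURES = ("Googlebot", "bingbot", "Slurp", "DuckDuckBot", "Baiduspider")

# Extract the request path and the quoted User-Agent from a combined-log line.
LOG_PATTERN = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d+ \d+ "[^"]*" "(?P<agent>[^"]*)"'
)

def bot_hits(lines):
    """Return a Counter of (bot, path) pairs for requests made by known bots."""
    hits = Counter()
    for line in lines:
        match = LOG_PATTERN.search(line)
        if not match:
            continue  # skip lines that don't look like combined log format
        agent = match.group("agent")
        for bot in BOT_SIGNATURES:
            if bot in agent:
                hits[(bot, match.group("path"))] += 1
                break
    return hits

print(bot_hits(LOG_LINES))
# → Counter({('Googlebot', '/products'): 1, ('bingbot', '/pricing'): 1})
```

Note that User-Agent strings can be spoofed, so a production analysis should also verify crawler IP ranges (for example via reverse DNS), which is part of what dedicated tools do for you.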