Response Time in Google Search Console

Average Response Time in GSC

What is the TTLB, the optimal time to deliver pages from our site to Googlebot? The crawl statistics graph has been present in Google Search Console (GSC) for years, but Google has never officially declared an optimal value or an ideal range.

Let’s start by defining the Average Response Time value that we find in Google Search Console:

Average page response time for a crawl request to retrieve the page content. Does not include retrieving page resources (scripts, images, and other linked or embedded content) or page rendering time.

Now that we know what it is, let’s look at its SEO implications.

Matt Cutts wrote in April 2010 that page loading speed is taken into account in ranking, but exactly how has never been explained. We only have a few Hangouts with vague answers:

We do say we have a small factor in there for pages that are really slow to load where we take that into account. John Mueller, GOOGLE

Google might crawl your site slower if you have a slow site. And that’s bad – especially if you are adding new content or making changes to it.
We’re seeing an extremely high response-time for requests made to your site (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we’ll crawl from your site. John Mueller, GOOGLE

It should be said that it is still not clear how much the average web server response time directly impacts rankings in the search results. There are websites on extremely slow servers that continue to rank well thanks to their authority, age, and the quality of the information they provide.

There are also websites with very fast servers that struggle to rank due to a lack of authority and quality content. As we know, there are many variables involved, and attributing results to a single SEO factor is almost impossible.

Either way, whether it’s a direct or indirect ranking factor, the main goal should always be to deliver pages as quickly as possible to Googlebot, and of course to your users.

The TTLB (Time To Last Byte) is a relative value: for one site a time of 1200 ms reported in GSC could be fine, while the same value would be disastrous for another website.

So what is a good value? And how do you measure it?

You will have noticed that different tools give different results depending on the location of their servers and the bandwidth they allocate. Some tools are more useful than others: have you ever seen a high PageSpeed score on a page over 20 MB? It happens.

As a reference tool for monitoring the average response time, I recommend Google Search Console: it’s free and the data comes straight from the source.

If you haven’t already done so, I also suggest running a check with http://www.webpagetest.org/ using a 1.5 Mbit/s bandwidth. Given that in Italy the average is 2-4 Mbit/s, it makes no sense to go higher. So before running a test, look up the average connection speed of your users, and then run the test with those settings. Testing at 20 Mbit/s is unrealistic unless you do business in Sweden.

How to adjust the bandwidth to use for the test

Also run the check from all the test servers available in your country, and possibly from abroad too: it is possible that the time reported by the tool is not representative of your users.
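If you want a quick measurement of your own, here is a minimal sketch using only the Python standard library: it times how long fetching a page’s HTML takes, which is roughly what GSC’s average response time describes (content retrieval, no assets, no rendering). The URL is a placeholder; run it from machines in different locations.

```python
import time
import urllib.request


def average_response_time(url: str, samples: int = 5) -> float:
    """Average seconds to request a URL and read the full HTML body."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()  # read the full content, like a crawl request
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)


# Placeholder URL: point this at your own site.
print(f"average: {average_response_time('https://www.example.com/') * 1000:.0f} ms")
```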

How to reduce web server response times

The first two aspects to consider are:

  • Server response speed (time to first byte). The time it takes for the server to say: “Hi, I have received your request! Here is the information.” On shared hosting this time will be around 800-1000 ms on average, while on VPS and dedicated hosting much lower times can be reached.
  • The number of requests on the page. If the page uses 20 images and 20 JavaScript files, it will generate 40 requests. Reducing that number reduces the time to render (see the sketch after this list).
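As a rough illustration of the second point, here is a minimal sketch, standard library only, that counts the image, script, and stylesheet requests a page’s HTML will trigger; the URL is a placeholder.

```python
from html.parser import HTMLParser
import urllib.request


class ResourceCounter(HTMLParser):
    """Counts tags that will trigger extra HTTP requests."""

    def __init__(self):
        super().__init__()
        self.counts = {"img": 0, "script": 0, "stylesheet": 0}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and attrs.get("src"):
            self.counts["img"] += 1
        elif tag == "script" and attrs.get("src"):
            self.counts["script"] += 1
        elif tag == "link" and (attrs.get("rel") or "").lower() == "stylesheet":
            self.counts["stylesheet"] += 1


# Placeholder URL: point this at your own page.
html = urllib.request.urlopen("https://www.example.com/").read().decode("utf-8", "replace")
counter = ResourceCounter()
counter.feed(html)
print(counter.counts, "total:", sum(counter.counts.values()))
```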

To reduce the web server’s response times we must work on lightening its workload. Let’s look at some solutions.

CDN

Personally, I am convinced that the best way to optimize a site is to use a cookieless CDN with Cache-Control expiration times tuned for static files, such as images and JavaScript. On the most efficient CDN networks the response times stay within 70 ms and do not suffer under multiple requests.

Using a CDN like Cloudflare, even with shared hosting, is advisable for another reason too: many people don’t know that a CDN can cost just cents per GB transferred, and paying as you go is great.
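A quick way to check whether the CDN is doing its job is to look at the response headers of a static asset. A minimal sketch, standard library only; Cache-Control is a standard header, while cf-cache-status is Cloudflare-specific (HIT means the file was served from the CDN cache), and the asset URL is a placeholder.

```python
import urllib.request

# Placeholder: any image or script served through your CDN.
ASSET_URL = "https://www.example.com/wp-content/uploads/logo.png"

request = urllib.request.Request(ASSET_URL, method="HEAD")
with urllib.request.urlopen(request, timeout=10) as response:
    # A long max-age here means browsers and the CDN can keep the file.
    print("Cache-Control:", response.headers.get("Cache-Control"))
    # Cloudflare-specific: HIT = served from the CDN edge cache.
    print("cf-cache-status:", response.headers.get("cf-cache-status"))
```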

Server-side cache

A cache layer also helps offload work from the web server and achieve lower response times. A server-side cache system such as FastCGI (in use on this server) is more efficient than an application-level one such as a WordPress plugin, since FastCGI caching is handled by the web server itself, before PHP is even invoked.

Caching does not work only at the level of HTML pages: there are server-side tools that cache compiled PHP scripts, like OPcache, or store MySQL responses, like Memcached. Do you use them? Have you tested them?
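To make the Memcached idea concrete, here is a minimal sketch assuming a local Memcached instance and the pymemcache package; fetch_popular_posts() is a hypothetical stand-in for a slow MySQL query.

```python
import json
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))  # assumes Memcached on the default port


def fetch_popular_posts() -> list:
    # Hypothetical placeholder for an expensive MySQL query.
    return [{"id": 1, "title": "Response Time in GSC"}]


def get_popular_posts() -> list:
    cached = cache.get("popular_posts")
    if cached is not None:
        return json.loads(cached)  # cache hit: MySQL is never touched
    posts = fetch_popular_posts()  # cache miss: run the query once
    cache.set("popular_posts", json.dumps(posts), expire=300)  # 5-minute TTL
    return posts
```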

Fast database

I can’t overlook the importance of a good database system: the quicker the database responds to queries, the faster WordPress and PHP can generate pages. Spend some time optimizing your MySQL server; the results you can get compared to a standard installation will blow your mind.
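A good starting point is checking whether your slowest queries use an index at all. A minimal sketch assuming the mysql-connector-python package and placeholder credentials; the query is just an example against the standard wp_posts table.

```python
import mysql.connector

# Placeholder credentials: adjust for your own installation.
conn = mysql.connector.connect(
    host="localhost", user="wp_user", password="secret", database="wordpress"
)
cursor = conn.cursor()
cursor.execute(
    "EXPLAIN SELECT ID, post_title FROM wp_posts "
    "WHERE post_status = 'publish' ORDER BY post_date DESC LIMIT 10"
)
for row in cursor.fetchall():
    # Check the 'key' and 'rows' columns: a NULL key means a full table scan.
    print(row)
cursor.close()
conn.close()
```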

A good server response time (time to first byte) and fewer requests are all you should focus on. Speed test tools show a lot of parameters, but these two are the most important. The best tools are those that let you emulate multiple requests, from different servers, at different speeds.

There is a lot of debate about how much longer response times can impact Google rankings, but the relationship between speed and ranking cannot be stated with certainty. Many of the best-known sites operate between 1 and 2 seconds, but what does that mean? There are sites with times of 1 second at 100 connected users that become 5 seconds at 20,000 connected users. There are also sites that keep times of 1 second even with 50,000 users connected at the same time. Response times are relative; they must also be tested under web server load.

Verify individual pages with Google Search Console

There are several tools that Google makes available to check the speed of our pages and improve them accordingly, such as PageSpeed Insights and Lighthouse.

Google Search Console also provides a report dedicated to crawl statistics, as we saw at the beginning of the article.

Crawl Statistics – Google Search Console

Open the report and filter for requests to HTML pages only, then check the trend of the average response time.

In my SEO audits I use a scale to evaluate the average time:

  • > 1000 ms: very bad
  • < 800 ms: insufficient
  • < 600 ms: sufficient
  • < 400 ms: good
  • < 200 ms: very good
  • < 100 ms: excellent

Do not take these values as absolute; there are many variables to consider, such as the number of pages on the website and its authority.
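For quick checks during an audit, the scale can be encoded as a small helper. The thresholds are my rule of thumb above, not an official Google metric, and everything from 800 ms up is lumped together as very bad:

```python
def rate_response_time(avg_ms: float) -> str:
    """Map an average response time (ms) onto the rating scale above."""
    if avg_ms < 100:
        return "excellent"
    if avg_ms < 200:
        return "very good"
    if avg_ms < 400:
        return "good"
    if avg_ms < 600:
        return "sufficient"
    if avg_ms < 800:
        return "insufficient"
    return "very bad"


print(rate_response_time(450))  # -> "sufficient"
```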

Let’s focus on the first point: the number of pages. Let’s say your site’s daily crawl budget is 60 seconds and you have published 60 pages. With a 1000 ms response time, Googlebot could fully crawl your site every day, which would be great.

It would be different for a site with 600 pages: with the same crawl budget it would need response times under 100 ms to be crawled completely every day.
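The same arithmetic as a tiny sketch; the 60-second daily crawl budget is the illustrative assumption above, not a published Google figure:

```python
def pages_crawlable_per_day(budget_s: float, avg_response_ms: float) -> int:
    """How many pages fit in a fixed crawl budget at a given response time."""
    return int(budget_s / (avg_response_ms / 1000))


print(pages_crawlable_per_day(60, 1000))  # 60: a 60-page site is fully covered
print(pages_crawlable_per_day(60, 100))   # 600: what a 600-page site needs
```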

When the average response time drops in the crawl statistics graph, what happens?

  • Does Googlebot crawl more pages? Then receiving pages faster clearly suits it, and you may want to consider further work to reduce the response time.
  • Does the average time go down without Googlebot taking the opportunity to crawl more pages? Then the current pace is enough for it, and you would not benefit from reducing the average response time.

Use the GSC report, learn to read it, and you will be able to tell whether your server’s response time is sufficient for your site or not.

Was this guide useful? Leave a comment! What are your site’s response times and what technical setup do you use?
