Post by cocolipid on Feb 28, 2024 5:40:39 GMT
Here's what you need to know about the Crawl Stats page in Google Search Console.

What we see in Google Search Console are statistics and measurable data. Why do they matter? If we cannot measure the performance of our website in a quantifiable way, it becomes difficult to set benchmarks that indicate the success or failure of our business. If you are new to entrepreneurship or running a small business, all this data might confuse you, but some metrics are very important and can help you grow your business. Our SEO agency in Milan believes Google Search Console crawl statistics are especially important now that the tool has been updated.

What are Google Search Console crawl statistics? Put simply, they show the activity of Googlebot on your website over the last 90 days.
This report is fundamental for evaluating the kind of information Google collects from your site, such as CSS, JavaScript, Flash, PDF or image files. Remember that Google Analytics cannot offer this same information, so it is important to analyze Search Console data as well. You can see the statistics Googlebot has collected in Search Console: in the side menu, under the Crawl section, you will find the "Crawl Stats" report.

Generally speaking, there is no "ideal" crawl rate: you should aim for consistency. Of course, there are cases where peaks appear (for example, when a lot of content is added to the site).

What happens when the statistics spike? In many cases, as we said previously, a spike in Google crawling may simply reflect a sudden change in the amount of content on the site. However, an increase in Googlebot activity is not always a positive thing, as it can affect the speed and responsiveness of the site itself. If you just added a lot of new content, you don't have to worry.
That said, there are cases where the peak will not drop as quickly as you would like. When this happens, pay attention to these points:

- Make sure it is actually Googlebot crawling your site and not another client spoofing its user agent;
- If you need to stop the crawl quickly, you can return HTTP status 503 for all requests;
- If you see spikes often, you can set a preferred maximum crawl rate so your website is never overloaded. This is ideal when you do not have a dedicated server;
- If invalid URLs are still being requested on your site, make sure they return a 404 or 410 status code, and set up a 301 redirect for pages that now live elsewhere on your site.

Have your crawl stats plummeted? Many situations can cause crawl statistics to collapse, which makes it even more important to know the processes that allow Googlebot to find your site again. Sometimes Googlebot cannot read CSS or JavaScript files because the robots.txt file has been modified.
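On the first point above: Google's documented way to confirm a visitor really is Googlebot is a reverse DNS lookup on the request IP, followed by a forward lookup to confirm the result. Here is a minimal Python sketch of that check; the function names are illustrative, not part of any official API.

```python
import socket

def is_google_hostname(hostname: str) -> bool:
    """Genuine Googlebot reverse-DNS names end in googlebot.com or google.com."""
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip: str) -> bool:
    """Reverse DNS on the IP, then forward DNS on the resulting hostname:
    both lookups must agree before trusting a 'Googlebot' user agent."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)      # reverse lookup
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        # forward lookup must resolve back to the original IP
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False
```

The forward confirmation matters because anyone can fake a reverse DNS record for their own IP range; only the round trip proves the IP belongs to Google.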
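On the robots.txt issue mentioned earlier: a common mistake is a blanket Disallow that also blocks the CSS and JavaScript files Googlebot needs to render your pages. A sketch of one way to avoid this (the paths are examples, not a recommendation for every site):

```text
User-agent: *
Disallow: /admin/          # keep private areas out of the crawl
Allow: /assets/css/        # but let crawlers fetch the CSS...
Allow: /assets/js/         # ...and JavaScript needed to render pages
```

After editing robots.txt, it is worth re-checking in Search Console that the affected resources can be fetched again.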