So Googlebot will slow crawling once it begins to see a site returning 5xx server errors.
For many sites, server errors last only a short period of time, such as when a host goes down or a CMS update is pushed, and there are no long term issues. This happens to many websites from time to time.
For other sites, server errors can be a chronic issue and a sign something is wrong with the site. Sometimes a site serves 5xx server errors to Googlebot alone, due to a misconfigured site firewall, for example. Or a site blocks traffic from certain countries, including ones Googlebot crawls from, due to errors implementing geotargeting.
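If you suspect that kind of user-agent-based blocking, one quick check is to request a page twice, once with a normal browser user agent and once with Googlebot's, and compare the status codes. Here is a minimal sketch in Python, assuming a hypothetical example.com URL; note this only catches user-agent rules, so IP-based blocks such as geotargeting errors won't show up this way:

```python
import urllib.request
import urllib.error

# Hypothetical URL; replace with a page from the affected site.
URL = "https://example.com/"

# A Googlebot user-agent string (the exact string can change over time).
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, user_agent):
    """Request the URL with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses raise HTTPError; the code is still useful

# Compare what a regular browser sees versus what a Googlebot user agent sees.
print("Browser UA:  ", fetch_status(URL, "Mozilla/5.0"))
print("Googlebot UA:", fetch_status(URL, GOOGLEBOT_UA))
```

If the browser request returns 200 while the Googlebot request returns a 5xx, a firewall or bot-blocking rule keyed on the user agent is the likely culprit.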
Regardless, if Search Console shows server errors, you should investigate them to make sure it was only a temporary problem and not a bigger issue. If Search Console continues to show errors, you should run some fetch and renders, as that will show exactly what Googlebot is seeing, and it can help you investigate the errors if there doesn't seem to be a problem when you visit the site yourself.
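Beyond fetch and renders, your server's own access logs show exactly which status codes Googlebot received. A minimal sketch in Python, assuming a combined-format Nginx/Apache access log at a hypothetical path (keep in mind the Googlebot user agent can be spoofed, so verify suspicious hits with a reverse DNS lookup before drawing conclusions):

```python
import re
from collections import Counter

# Hypothetical log path; adjust for your server.
LOG_PATH = "/var/log/nginx/access.log"

# Combined log format:
# IP - - [time] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

status_counts = Counter()   # all status codes served to Googlebot
error_paths = Counter()     # URLs that returned 5xx to Googlebot

with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE_RE.match(line)
        if not m or "Googlebot" not in m.group("ua"):
            continue
        status = m.group("status")
        status_counts[status] += 1
        if status.startswith("5"):
            error_paths[m.group("path")] += 1

print("Status codes served to Googlebot:", dict(status_counts))
print("Top URLs returning 5xx:", error_paths.most_common(10))
```

If the 5xx responses cluster on particular URLs or time windows, that points to a specific page, deploy, or firewall rule rather than a site-wide outage.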
Unless the server errors are a long term issue, Google should continue to visit the site to see if they have been resolved, and then return to crawling as usual. The crawl rate should go back to what the site usually sees once Googlebot no longer encounters the errors.