John Mueller from Google answered a question about the report, specifically why there are changes for a site that hasn’t made any changes, yet the report shows these wild fluctuations.
The fluctuations aren’t necessarily a sign that a site has undergone many changes and needs more crawling. The report is based on the actual pages crawled, and if some pages are bigger, or have more resources that require Googlebot to process, the report can show large swings in the time spent downloading a page.
Of course, the cause can also be specific to the site, such as the site’s server being slower to respond to Googlebot’s crawl requests for a page.
Jennifer Slegg