For example, all of the following could make someone appear on a “winners and losers” list.
- Accidental noindex or disallow in robots.txt (see the example just after this list)
- Switching from HTTP to HTTPS
- Rewriting URLs
- Manual action applied
- Manual action lifted
- Disavow file being processed
- New site design being reindexed
- Domain name change being reindexed
- Sudden influx of links (good or bad)
- Site being hacked
- And more than a few other ways webmasters have of shooting themselves in the foot
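To illustrate just the first item on that list, here is a hypothetical robots.txt mistake of the kind that gets pushed live far more often than anyone would like to admit. A single stray disallow rule is enough to block crawling of an entire site, and the rankings crater for reasons that have nothing to do with any algorithm update.

```
# Hypothetical example: a staging robots.txt accidentally pushed to production.
# The blanket "Disallow: /" blocks crawling of the entire site.
User-agent: *
Disallow: /

# What was probably intended: only keep crawlers out of a few private paths.
# User-agent: *
# Disallow: /staging/
# Disallow: /admin/
```

A tool tracking that site's keywords would see a dramatic drop and happily file it under "losers", even though Google changed nothing.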
Likewise, the weather algo reports can be pretty inaccurate. After all, they never picked up on the first slow-rolling Panda update; it wasn't until Google later confirmed it that SEOs were aware it had started. And Google has also confirmed it is unlikely SEOs will be able to tell when a new Panda rolls out.
There is also the fact that these weather reports are pretty secretive about what they track. Of all of them, Mozcast is probably the most open about it, and even that is fairly limited. The data can be quite skewed, especially since they share so little information about the keywords tracked, the markets they cover or even the geolocations tracked. Because of these unknown factors, the data isn't as trustworthy as SEOs would probably like.
Also, John Mueller has said that people can be seeing thirty or more experiments in the search results at any time, and that will also skew things. What if one of the weather report IPs happened to land in a major test that significantly shook things up, but was among the 1% that received it over a very short window? That alone could majorly skew the weather report tracking. To overcome the experiments issue, the weather report trackers would need to deploy many (many) IPs at the same time from multiple geolocations.
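As a rough back-of-the-envelope illustration (a toy simulation, not how any of the actual weather report tools work), the sketch below assumes each tracking IP has a 1% chance of being bucketed into an experiment that pushes a result several positions away from where everyone else sees it. A single IP gets a skewed reading about 1% of the time, while the median across many IPs from different vantage points almost never does.

```python
import random

def observed_rank(true_rank, in_experiment):
    """Rank one tracking IP sees; being in an experiment shifts it sharply."""
    return true_rank + (random.randint(5, 15) if in_experiment else 0)

def tracked_rank(true_rank, num_ips, experiment_rate=0.01):
    """Median rank across num_ips independent tracking IPs/geolocations."""
    samples = sorted(
        observed_rank(true_rank, random.random() < experiment_rate)
        for _ in range(num_ips)
    )
    return samples[len(samples) // 2]

random.seed(42)
TRUE_RANK = 3
TRIALS = 10_000

for num_ips in (1, 5, 50):
    skewed = sum(tracked_rank(TRUE_RANK, num_ips) != TRUE_RANK for _ in range(TRIALS))
    print(f"{num_ips:>3} IPs: reading skewed by experiments in {skewed / TRIALS:.2%} of checks")
```

The exact numbers are invented; the point is just that a single vantage point is fragile, which is why the trackers would need a lot of IPs in a lot of places before experiments stop looking like algorithm updates.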
And when you compare the various weather reporting tools out there, they often show very different results. It usually takes an enormous spike for all of them to register it, and even then there is quite a difference between them: one might show a huge spike while another shows a baby spike, and sometimes one shows a spike that none of the others do. Again, this all depends on the data being tracked and what each tool happens to see at the time it checks its keywords.
Ironically, as I was about to post this, Barry Schwartz asked Gary Illyes on Twitter whether he felt the weather reports were picking up on the wrong data when reporting on volatility in the search results.
Illyes doesn’t go into more detail about why this is, but it shows it is an issue.
What does this all mean? As I said here, while weather reports and “winners and losers” lists can be interesting to look at, SEOs should definitely take the data with a grain of salt, and especially should not make any changes to their SEO strategies based on who the winners and losers are. High temperatures in the weather reports mean you probably want to check your own sites, and maybe the specific search results and market areas you follow. But it is quite possible that what the tools are seeing and what you are seeing in your own market areas are very, very different.
With so many factors coming into play that can majorly skew the data, look instead at your own sites that have changed, or at those who are posting concrete examples of URLs, keywords and Analytics info (not just pretty pictures of a Google Analytics traffic chart showing ups and downs). This will give you much more realistic and trustworthy data than a list of winners or losers that could have had any number of things happen in the rankings that have absolutely ZERO to do with any changes Google has made to their core algorithm.
Jennifer Slegg