The latest so-called “Fred” update has brought one thing to the forefront – SEOs can’t decide what Fred is and what specific tactic it is targeting. Is it links? Is it ads? Is it ad heaviness? Is it private blog networks, aka PBNs?
Many people are identifying symptoms of Fred, but I feel Fred is much broader in scope than simply “ad heavy” or “links”, although those are clearly thrown into the mix. All signs point to Fred being a next generation quality algo that identifies various aspects of a page or site that make it low quality, and then demotes accordingly. In other words, it targets sites created to benefit the site owner and serve Google SEO purposes, but not so much the end user who lands on one of those pages from the Google search results.
Fred Symptoms
Barry Schwartz was definitely on the right track, noticing some of the lower quality characteristics these sites shared, but those characteristics don’t explain all of the sites impacted by Fred. For example, many people have connected the update with links and the use of either paid links or a private blog network (PBN). Others, though, were certain links were not the root of the Fred traffic losses, and that the cause was more on the quality side, such as bombarding visitors with ads.
Pretty much every algo piece we know about has been mentioned by at least one SEO in conjunction with Fred, including Panda, Penguin, the “above the fold” algo, Pirate, etc. But again, Fred doesn’t seem to fit nicely into any of those, at least not when you look beyond a single site. When looking at a greater collection of Fred-impacted URLs, it seemingly makes no sense, at least on the surface.
Fred Through a Broader Lens
I think SEOs need to look at Fred through a broader lens, because all of these sites have one thing in common: no matter how well the site owner tries to disguise the site as something other than a vehicle for affiliate/ad revenue or links, they were primarily designed with Google in mind, not the users who might end up on them. In other words, those sites benefit Google and the site owner, but they don’t benefit the average Google searcher who sees them in the search results.
Now, Google targeting low quality is nothing new; low quality content is why they created Google Panda. But the Fred update doesn’t seem to target content alone. We are also seeing sites impacted because their primary purpose was to pass link juice to “money sites”: again, low quality sites that exist for Google’s benefit only, since they were really only used for linking out.
Fred also seems to be targeting the type of sites that would be rated Low or Lowest according to the Google Quality Rater Guidelines.
Fred & Ads
For those impacted by Fred, sure, the solution could be to remove ads, but that would likely only fix the subset of sites matching the low quality markers Google targeted; after all, many low quality spammy sites tend to load up on ads. But many higher quality sites that are also ad heavy rank just fine, so simply being ad heavy isn’t the root cause of the drop.
If you have a lot of ads and were hit by Fred, look at the site beyond the fact you have ads, and consider things like placement. For example, ads that disrupt the flow of the content (i.e., an ad block every one or two paragraphs) can be seen as a low quality marker, since the goal is clearly for visitors to click those ads rather than consume the content, a point the Google Quality Rater Guidelines make clear.
If your ads are not just obtrusive (such as all ads above the fold) but disruptive (such as ads every one or two paragraphs of content), that is definitely a sign of low quality according to the guidelines.
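If you want a rough, repeatable way to check this across many pages, a small script can compare how often ad blocks appear relative to paragraphs of content. This is only a minimal sketch: the `div.ad-block` selector and the one-ad-per-two-paragraphs threshold are assumptions for illustration, so swap in the markup your ad network actually renders.

```python
# Rough heuristic: flag pages where ad blocks appear more often than
# about once every two paragraphs. The "div.ad-block" selector is a
# placeholder; substitute the selectors your ad network actually uses.
import requests
from bs4 import BeautifulSoup

def ad_density(url, ad_selector="div.ad-block"):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    paragraphs = soup.find_all("p")
    ads = soup.select(ad_selector)
    if not paragraphs:
        return None  # nothing to measure against
    return len(ads) / len(paragraphs)

ratio = ad_density("https://example.com/some-article/")
if ratio is not None and ratio >= 0.5:  # one ad block per two paragraphs
    print(f"Possible disruptive ad placement: {ratio:.2f} ad blocks per paragraph")
```

Run something like this across a crawl of your URLs and manually review anything it flags; the point is to surface disruptive placement patterns, not to produce a definitive score.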
Fred & Affiliates
Likewise with affiliate ads. There are many high quality affiliate sites out there, and Google has said on numerous occasions that affiliate sites can rank well. But there are probably significantly more affiliate sites that are low quality, with duplicated, thin content that provides no value to the searcher who lands there.
So if you do have affiliate content and were hit by Fred, look at the characteristics that made Google see the entire page, beyond the fact it has an affiliate link, as low quality.
Fred & Links
Which brings us to links, another often-cited reason for Fred hits, as many (but not all) of the impacted sites were used for linking purposes, whether abused for incoming links or for outgoing ones. Quality plays a role here too. Were those links for the benefit of the user, or for the benefit of Google and SEO? Was the site strictly used as a link vehicle pointing to another, more important money site? If the site and/or links existed for Google/SEO purposes, that is a problem.
Take a hard look at the links on and to the site and look for low quality patterns Google can identify, particularly patterns that match PBNs, both for the sites powering the money site and for the money site itself.
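A crude first-pass check for the “link vehicle” pattern is to measure how link-heavy a page is relative to its body text, since pages that exist mainly to link out tend to carry far more external links per word than pages written for readers. A minimal sketch follows, with a purely illustrative threshold and a hypothetical URL; tune both to your own site before trusting the output.

```python
# Crude "link vehicle" check: count external outbound links against
# the number of words of body text. A high ratio suggests a page that
# exists to pass links rather than to serve readers. The URL and the
# threshold below are purely illustrative.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def outbound_link_stats(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(url).netloc
    external = [
        a["href"] for a in soup.find_all("a", href=True)
        if urlparse(a["href"]).netloc not in ("", host)
    ]
    words = len(soup.get_text(" ", strip=True).split())
    return len(external), words

links, words = outbound_link_stats("https://example.com/some-page/")
if words and links / words > 0.05:  # roughly one external link per 20 words
    print(f"Link-heavy page: {links} external links across {words} words")
```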
Fred & Content
Which brings us to content. In a lot of the examples, the quality isn’t the best. Some of it has clearly been put through a spinner (some spun better than others), because the text reads a bit off. On the affiliate-style sites that were hit, it is often merely cookie cutter content taken straight from a datafeed, with no attempt to make it unique or give it any “added value”.
And speaking of added value, many of these sites lack something else Google looks for in the Quality Rater Guidelines: supplemental content, the things that aren’t part of the main content but still bring value to the page. Many of the examples clearly lack what would make a page more valuable than a similar competitor’s page.
Fred & SEO
Looking at Fred from an SEO perspective, it isn’t clear whether overoptimization is included, although I did see overoptimization as one of the targets of an earlier update this year. If that update was a precursor to what we are seeing today, then overoptimization could play a role here as well.
And many of these low quality sites do match recognizable SEO patterns. With PBNs, for example, you often see a very similar style of anchor text optimization, since PBN operators are often very careful not to have too many links with the same keyword, something that can actually look unnatural when compared with how anchor text is distributed on brand sites with far more links.
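One way to spot-check this on your own site is to look at the anchor text distribution in a backlink export. Here is a minimal sketch that counts the most common anchors; the `anchor` column name and the `backlinks.csv` filename are placeholders, so rename them to match whatever your backlink tool actually exports.

```python
# Count how often each anchor text appears in a backlink export.
# Assumes a CSV with an "anchor" column; rename it to match whatever
# your backlink tool actually exports.
import csv
from collections import Counter

def anchor_distribution(path, top=10):
    with open(path, newline="", encoding="utf-8") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]
    counts = Counter(anchors)
    total = len(anchors)
    for anchor, n in counts.most_common(top):
        print(f"{anchor!r}: {n} links ({n / total:.1%})")

anchor_distribution("backlinks.csv")
```

A suspiciously even spread of exact-match keyword anchors can be just as telling as too many identical ones; compare the distribution against what a brand site with naturally earned links looks like.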
Low Quality Characteristics
Google seems to be getting smarter at identifying low quality characteristics algorithmically. These are all things that previously often had to be caught manually by Google, through manual actions for thin content, pure spam and unnatural links.
This could also make it harder for impacted sites to know what was wrong and how to fix it. With manual actions, the problem is often fairly clear cut, with a path back once the issues are fixed. Algorithmic suppressions are just that, algorithmic: if Fred caused a site to tank in the search results, there is a much less clear picture of how to return. And a spammy site is still a spammy site if its sole purpose is to serve Google alone and not users.
Fred & Insights from Quality Rater Guidelines
The Google Quality Rater Guidelines are also invaluable for those trying to recover. Read up on the Low and Lowest quality sections, as well as Needs Met, and determine which of the characteristics Google describes match your own site. Be certain you aren’t looking at your site through rose-colored glasses… even high quality sites have room for improvement.
Final Thoughts
Fred seems to be all about site quality in its many forms, and shouldn’t be pigeonholed as just “link related” or just “ad-heavy related”. That breadth is why SEOs haven’t had a very clear idea of what Fred is. When you look at it more closely, however, it is clear that this is a next generation quality algo from Google, targeting sites that hold no benefit for the searchers landing on them.
The overlap with manual actions is also interesting, since Fred does seem to catch algorithmically some of the issues that were previously handled through manual actions.
If you were hit by Fred, or are concerned about a similar algo or a Fred refresh (if such a thing is possible), critically analyze who the site benefits most: Google and the site owner, or the user who ends up on the page through a Google search? If it is the former, you likely need to raise the quality until the site is an example of “a great content site” or “a great affiliate site”.
And don’t forget, Google can adjust or turn up the dial on any of these algos. So if you weren’t hit this time but you know your site has some of the low quality characteristics Fred seems to target, taking the time to improve quality now will help Fred-proof (not to mention Panda-proof and Penguin-proof) your sites going forward.
But the future of Google and search is clear… create content and sites for users, not for Google.
Jennifer Slegg
Concerned Reader says
Much of this article is the same paragraph copied and pasted 2 or even 3 times in a row. Is that for Google or for the user? Great read otherwise!
Jennifer Slegg says
Oops, fixed. Copy paste error from copying sections from a Word document I originally wrote it in 🙂
Jim Stewart says
Jen, have you seen a site’s rankings return based on fixing any of the above? We’ve had one site’s rankings return so far by fixing index bloat (tag pages, params) and dupe titles. Content was already awesome but they got hit hard.
Jennifer Slegg says
There have been people saying they’ve been able to recover, though of course it is hard to know whether it was their changes or Google dialing something back/up on their end. But any improvement to quality such as that is always smart to do.
Jason says
Interesting read, Jennifer.
Can you explain then, in your opinion, how a site like BestProducts.com manages to still dominate the Google SERPs?
Clearly a site 1) built solely for affiliate revenue, 2) full of affiliate links, and 3) full of crappy thin content.
But Google loves them.
Thoughts?
Jennifer Slegg says
I’m not familiar with that site, but affiliate sites aren’t automatically considered low quality just because they are affiliate sites, something Google has said repeatedly. It is just that there are many low quality affiliate sites out there. I checked a random article there, and it seemed to be unique and not low quality. But you can file a spam report with Google for any site you think is ranking but shouldn’t be.
Wes Dunn says
Excellent article… I read a few articles by Barry Schwartz and agree in part with what he was saying; my site appears to be recovering slightly after changing my advert placement. Personally, whatever Fred is can’t be bad as far as I’m concerned, as it clearly targets low quality or things that may be annoying to the user experience. Personally I think Google has it totally wrong by placing so much emphasis on links; most links are probably not “earned” by writing high quality articles as we are led to believe. Having lived in Spain for 20 years, I naturally have a blog all about the country and islands where I have either visited or lived. I don’t have a ton of high quality links like those that have probably never visited the country, but I still like to think I’m an authority on my given subject. 🙂
Richard says
I found this article very helpful, as I think our WordPress site has been hit by Fred. Jim’s comment appears very pertinent too. Google suddenly started indexing our tags, categories and author archives, as well as portfolio items, all of which effectively had zero content. I’ve now no-indexed some of them and will add content to others. Is there anything else I should look at?
Jennifer Slegg says
That sounds like a good start to improve quality and noindex low quality pages until you can improve them. On the content side of things, there is a lot of information/help/ideas here: http://www.thesempost.com/understanding-google-panda-definitive-algo-guide-for-seos/
Nathan says
Thanks for the article. I purchased a site a couple of months back and haven’t been able to work on it. I noticed it went from around 500 uniques a day to nearly nothing. I’ve only just started out with websites, so it took me a little while to work out what had happened. My conclusion was a Google update, and the low quality explanation definitely makes sense; I think the content writer used a lot of spun content. Lots of work to do to get it up and running again.
Jennifer Slegg says
Yes, it is definitely a case of buyer beware when it comes to purchasing sites, especially if the content is of dubious quality.
Gal Baras says
Thank you for the mature and considered summary. It’s rather rare and refreshing in the world of “let’s beat Google with techniques”.
Since many people use “thin” pages and sites for link building, it stands to reason that when a website goes down the rankings, this may be related to the now lower value of its inbound links, rather than its on-page value. After all, even great content needs to be promoted, sometimes under pressure 🙂
From the very beginning, Google has indicated its intention to provide the best possible user experience and suggested aiming for that, rather than SEO. Predictions are that 2017 is the year of artificial intelligence, and since Google is pretty good at that, its ranking system is likely to become smarter and faster than human SEO consultants.
So really, the best advice is to think about the people we want to attract to our website and how to give them a good experience on our own site, as well as on the sites linking to it, and everything will be fine, or as fine as it can be in such a huge and competitive world.