While many SEOs place great emphasis on having technically flawless sites, the reality is that many website owners run sites with plenty of technical problems. And many believe that a technically perfect site will outrank a competitor’s site that has errors. In the latest Google Webmaster Hangout, John Mueller said that, in general, technical problems on a site are not a ranking problem.
The question: Would there be a small ranking benefit if we compared the same site once with a lot of 404s and soft 404s present and other technical problems such as wrong hreflangs and no canonical tags, in comparison to the same site in “perfect” technical condition?
Here is his response.
Again, there are two aspects here: on the one hand, the crawling and indexing part, and on the other hand, the ranking part.
When we look at the ranking part and we essentially find all of these problems, then in general that’s not going to be a problem. Where you might see some effect is with the hreflang markup, because with the hreflang markup we can show the right pages in the search results. It’s not that those pages would rank better, but you’d have the right pages ranking in the search results.
With regards to 404s and soft 404s, those are all technical issues that any site can have, and that’s not something we would count against a website.
This also confirms what Gary Illyes said on Twitter earlier this week: 404s do not cause a Google penalty.
On the other hand, for crawling and indexing, if you have all of these problems with your website, if you have a complicated URL structure that’s really hard to crawl, if it is really hard for us to figure out which version of these URLs we should be indexing, if there’s no canonical, all of that kind of adds up and makes it harder for us to crawl and index these pages optimally.
So what might happen is we get stuck and crawl a lot of cruft and then not notice there’s some great new content that we’re missing out on. So that’s something that could be happening there.
It’s not that we would count technical issues against a site when it comes to ranking, but rather that these technical issues can cause problems that result in things not being processed optimally.
Bottom line: fix your technical issues, because there are definite benefits in helping Google understand the correct content to rank. But technical issues by themselves won’t cause additional ranking problems.
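For anyone who wants to spot-check the kinds of issues the question refers to, here is a minimal sketch in Python (the requests library and the example URL are my own assumptions, not anything Google recommends) that fetches a page, reports its canonical and hreflang tags, and flags a crude soft-404 signal:

```python
import requests  # third-party HTTP library, assumed installed
from html.parser import HTMLParser


class HeadLinkChecker(HTMLParser):
    """Collects rel=canonical and hreflang link tags from a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.hreflangs = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if rel == "canonical":
            self.canonical = attrs.get("href")
        elif rel == "alternate" and attrs.get("hreflang"):
            self.hreflangs.append((attrs["hreflang"], attrs.get("href")))


def check_page(url):
    resp = requests.get(url, timeout=10)
    checker = HeadLinkChecker()
    checker.feed(resp.text)

    print(f"{url} -> HTTP {resp.status_code}")
    print(f"  canonical: {checker.canonical or 'missing'}")
    print(f"  hreflang entries: {len(checker.hreflangs)}")

    # A crude soft-404 signal: error wording served with a 200 status.
    if resp.status_code == 200 and "not found" in resp.text.lower():
        print("  possible soft 404: 'not found' wording with a 200 status")


if __name__ == "__main__":
    check_page("https://example.com/some-page/")  # hypothetical URL
```

This is only a rough first pass on a single page; a dedicated crawler will catch far more, but it shows how cheap it is to confirm the basics before worrying about ranking effects.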
Added: Obviously, there are a ton of other issues outside of this question that can cause major ranking problems… SEO experts are always discovering new ways that webmasters have managed to hurt their rankings, along with some of the usual issues seen, whether it is accidentally blocking Googlebot from crawling a site or not bothering to patch known exploits in popular WordPress plugins, which causes the hacked warning to show up in the search results.
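On the accidental-blocking point, a quick sanity check is possible with Python’s built-in robots.txt parser; the URLs below are placeholders for whatever site you actually want to test:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site; swap in the robots.txt you actually want to test.
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

if not parser.can_fetch("Googlebot", "https://example.com/"):
    print("Warning: robots.txt is blocking Googlebot from the homepage")
else:
    print("Googlebot is allowed to crawl the homepage")
```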
Jennifer Slegg
Alan Bleiweiss says
Once again John Mueller completely fails to communicate factual information clearly enough for the vast majority of site managers and SEOs who need to hear it. I don’t honestly know why he does this – either he is intentionally reckless (I doubt this very much) or he just wants to be nice and answer questions people ask him, even though he has NO CLUE.
Technical issues DIRECTLY impact rankings. I’ve been doing SEO for 16 years, and audits since 2007. I’ve specialized in audits for three years.
If enough technical issues exist, there are countless ways rankings will plummet.
In just ONE example, if, as John references, there are crawl problems, Google will become massively confused. Except the impact will NOT be limited to his claimed “we might not find new shiny pages” issue.
Where does he think massive exponential duplicate content comes from? Obviously he knows it’s from horrific site architecture, among other things.
And I absolutely guarantee you that if a site has exponential duplicate content from critically flawed architecture, that’s going to cripple most sites at least to some degree in rankings.
There are so many other ways technical issues will inevitably harm rankings it makes my head spin reading this post.
Jennifer Slegg says
That’s why I had included the exact question asked, so people had the context behind it. But I did add that there are many other ways webmasters can cause technical issues that do affect rankings as well.
Alan Bleiweiss says
Here’s another:
Have enough page processing speed flaws? You bet that’s going to directly weaken rankings.
Even Matt Cutts himself said, at his last SMX Advanced appearance, that as a general rule, if pages take longer than 20 seconds to load, rankings will suffer.
That’s an ENTIRELY technical issue.
Joe Preston says
This is helpful and provides some clarity. What I would expect is that soft 404s would cause a wasteful use of your crawl “budget” with Googlebot. Illyes’s statements kind of muddied the waters for me, because for me “crawling and indexing” vs. “ranking” aren’t as important to keep in mind as separate processes as they are to a Google engineer. If the content isn’t junk, optimal indexation is a key component of optimizing organic traffic (for a sufficiently large page set).