As part of Google’s redesign of Google Search Console, they have released a brand new version of their Webmaster Guidelines.
Here is a line-by-line breakdown of every change, addition, and removal from the previous version. So let’s get started!
Contents
- 1 Introduction
- 2 How Google Finds Pages
- 3 Sitemaps & Human-Readable Sitemap
- 4 Maximum Number of Links Per Page
- 5 Robots.txt
- 6 Googlebot Reads Text in Images
- 7 <Title> and ALT Attributes
- 8 Clear Hierarchy
- 9 Images, Video & Structured Data
- 10 Broken Links & Valid HTML
- 11 Dynamic Pages Change
- 12 Session IDs & Parameters
- 13 Content Management Systems
- 14 Blocked Resources
- 15 Content Visible by Default
- 16 NoFollow Advertisements
- 17 Page Loading Times
- 18 Design for Multiple Devices & Browsers
- 19 HTTPS vs HTTP
- 20 Visual Impairments
- 21 Quality Guidelines
- 22 Overall Thoughts
Introduction
The first change is the omission of a single phrase within their opening paragraph. Previously, it said “Even if you choose not to implement any of these suggestions,” which has now been removed.
Here is the new version:
Following the General Guidelines below will help Google find, index, and rank your site.
We strongly encourage you to pay very close attention to the Quality Guidelines below, which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise affected by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google’s partner sites.
And the older version:
Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the “Quality Guidelines,” which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google’s partner sites.
How Google Finds Pages
Ensure that all pages on the site can be reached by a link from another findable page. The referring link should include either text or, for images, an alt attribute, that is relevant to the target page.
Previously, Google stated that the site should have a “clear hierarchy and text links. Every page should be reachable from at least one static text link.” The change is that when you link via an image, you should give that image an alt attribute relevant to the page being linked to.
Also, the reference to static text links has been removed. Since these guidelines were originally written, many new ways to link to sites have emerged, and we have seen cases where Google finds URLs from non-linked plain text.
Does this mean that Google is taking that alt text into account when ranking? There has been evidence that Google has been doing this for quite some time, but this is further confirmation of its importance.
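For example, a crawlable image link with alt text relevant to the target page might look like this sketch (the URLs and wording are hypothetical):

<a href="/running-shoes/">
  <img src="/images/shoes-nav.png" alt="Browse our running shoes">
</a>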
Sitemaps & Human-Readable Sitemap
This is interesting. Google is recommending not just a sitemap for Google, but also a regular version for humans visiting the site.
Here is the new version:
Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page).
And the old version:
Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.
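A human-readable site map page can be as simple as a list of plain links, something like this sketch (the URLs are placeholders):

<h1>Site Map</h1>
<ul>
  <li><a href="/about/">About Us</a></li>
  <li><a href="/products/">Products</a></li>
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>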
Maximum Number of Links Per Page
Another change is that Google now quantifies the maximum number of links per page, where before it only said to “keep the links on a given page to a reasonable number.”
Limit the number of links on a page to a reasonable number (a few thousand at most).
So we now know what Google considers to be reasonable.
Robots.txt
While most of this remains the same, Google now also recommends using robots.txt to manage crawl budget.
Use the robots.txt file on your web server to manage your crawling budget by preventing crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date. Learn how to manage crawling with the robots.txt file. Test the coverage and syntax of your robots.txt file using the robots.txt testing tool.
Interestingly, they removed the part about ensuring you aren’t accidentally blocking Googlebot.
They also added the reference to ensuring you aren’t allowing “infinite spaces” pages such as search results to be crawled.
Previously, they did have a different section about using robots.txt to prevent useless pages from being crawled:
Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.
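As an illustration, a minimal robots.txt along these lines would keep crawlers out of internal search result pages while pointing them at the sitemap (the /search/ path is a hypothetical example; use whatever path your own search results live under):

User-agent: *
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml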
Googlebot Reads Text in Images
Previously, the guidelines warned against including text in images and expecting it to be read, recommending the alt attribute instead.
Here is what it used to read:
Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images. If you must use images for textual content, consider using the ALT attribute to include a few words of descriptive text.
And now:
Try to use text instead of images to display important names, content, or links. If you must use images for textual content, use the alt attribute to include a few words of descriptive text.
This is a pretty big change for Google: the statement that the crawler doesn’t recognize text contained in images is gone.
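Even so, descriptive alt text remains the safe bet when an image carries textual content. A minimal sketch (the file name and wording are hypothetical):

<img src="summer-sale-banner.png" alt="Summer sale: 20% off all running shoes">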
<Title> and ALT Attributes
Another change is minor, but telling. Google is now including the word “specific” in this section.
Ensure that your <title> elements and alt attributes are descriptive, specific, and accurate.
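To illustrate what “specific” buys you, compare these hypothetical examples:

<!-- Vague -->
<title>Home</title>
<img src="wallet.jpg" alt="image">

<!-- Descriptive, specific, and accurate -->
<title>Handmade Leather Wallets – Example Leather Co.</title>
<img src="wallet.jpg" alt="Brown bifold leather wallet, open view">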
Clear Hierarchy
Google has slightly changed this (and added it to the link reference as well).
Design your site to have a clear conceptual page hierarchy.
Previously it said “Make a site with a clear hierarchy and text links.” This puts more of the responsibility on the designer and the design phase of a website. How often are SEOs stuck trying to optimize a beautiful but poorly structured site?
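A clear conceptual hierarchy usually shows up in both the URLs and the navigation. A hypothetical breadcrumb trail makes the idea concrete:

<nav>
  <a href="/">Home</a> ›
  <a href="/shoes/">Shoes</a> ›
  <a href="/shoes/running/">Running Shoes</a>
</nav>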
Images, Video & Structured Data
Instead of merely asking webmasters to read the guidelines, they now ask them to follow those guidelines.
Follow our recommended best practices for images, video, and structured data.
Not a major change, but it shows they are paying closer attention to whether these are implemented correctly, especially at a time when everyone wants to see their structured data in the search results.
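For structured data, JSON-LD embedded in the page is the format Google generally recommends. A minimal sketch for a hypothetical article (all values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2016-01-28"
}
</script>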
Broken Links & Valid HTML
Previously, Google said “Check for broken links and correct HTML.” They have now changed the wording slightly. Here is the new version:
Ensure that all links go to live web pages. Use valid HTML.
And now they specifically talk about valid HTML, not just “correct.” This should make everyone who promotes valid HTML happy!
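A quick illustration of the kind of error the W3C validator (validator.w3.org) will flag, improperly nested tags, shown alongside the valid form:

<!-- Invalid: the <b> and <li> tags cross -->
<ul><li><b>First item</li></b></ul>

<!-- Valid -->
<ul><li><b>First item</b></li></ul>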
Dynamic Pages Change
Previously, Google had instructions about dynamic pages, with the caveat that not all crawlers could follow them. We do know Googlebot is much better at crawling dynamic URLs now, which could be why the section was removed.
Here is the section that was removed:
If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
Session IDs & Parameters
Google did incorporate a bit about URL parameter issues into this section:
Allow search bots to crawl your site without session IDs or URL parameters that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
Previously, the first line read “Allow search bots to crawl your sites without session IDs or arguments that track their path through the site.”
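One common complement to this, not something the guideline itself prescribes, is a canonical link element on parameterized URLs pointing at the clean version (the paths here are hypothetical):

<!-- Served on /shoes/?sessionid=abc123 and any other tracked variant -->
<link rel="canonical" href="https://www.example.com/shoes/">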
Content Management Systems
Google has also added a line about using a CMS on a website, and ensuring that those URLs are crawlable.
Here is what Google added:
When using a content management system (for example, Wix or WordPress), make sure that it creates pages and links that search engines can crawl.
Previously, it didn’t name any specific CMS and was clearly written before free options such as WordPress became widely available.
If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
Blocked Resources
A slight change was made here, primarily to the language.
Here is the new version:
To help Google fully understand your site’s contents, allow all site assets that would significantly affect page rendering to be crawled: for example, CSS and JavaScript files that affect the understanding of the pages. The Google indexing system renders a web page as the user would see it, including images, CSS, and JavaScript files.
So it now singles out the files that would significantly affect how the page renders.
And the old version:
To help Google fully understand your site’s contents, allow all of your site’s assets, such as CSS and JavaScript files, to be crawled. The Google indexing system renders webpages using the HTML of a page as well as its assets such as images, CSS, and Javascript files.
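In robots.txt terms, this mostly means making sure your asset paths are not disallowed. A hypothetical sanity check (the directory names are examples):

User-agent: *
# Avoid rules like "Disallow: /assets/" that block rendering resources
Allow: /assets/css/
Allow: /assets/js/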
Content Visible by Default
Google is now stating that a site’s important content should be visible by default (which is ironic since many of Google’s help pages don’t do this).
Make your site’s important content visible by default. Google is able to crawl HTML content hidden inside navigational elements such as tabs or expanding sections, however we consider this content less accessible to users, and believe that you should make your most important information visible in the default page view.
This shouldn’t be a surprise to SEOs, but it is definitely a growing trend among designers to hide content behind additional clicks.
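As a sketch of the pattern in question (the class names are hypothetical), this is the difference between content hidden behind a tab and content in the default view:

<!-- Hidden until the visitor clicks the tab -->
<div class="tab-panel" style="display: none;">
  Key product specifications go here.
</div>

<!-- Visible in the default page view, which is what Google prefers for important content -->
<section class="product-specs">
  Key product specifications go here.
</section>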
NoFollow Advertisements
Yet another change, this one where Google is specifically stating that advertisement links should be nofollowed. They have also removed the statement about AdSense and Doubleclick, likely because it was confusing for site owners who thought they needed to block those.
Here is the new version:
Make a reasonable effort to ensure that advertisement links on your pages do not affect search engine rankings. For example, use robots.txt or rel=”nofollow” to prevent advertisement links from being followed by a crawler.
And here is what it said previously.
Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
So it is much more direct, especially for those webmasters who might not realize how advertising links can affect rankings.
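In markup, that simply means adding rel="nofollow" to the paid link (the URL is a placeholder):

<a href="https://advertiser.example.com/offer" rel="nofollow">Sponsored: Example Advertiser</a>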
Page Loading Times
Google made some major changes to the wording here, stressing the use of their page speed tools to optimize speed.
The new version:
Optimize your page loading times. Fast sites make users happy and improve the overall quality of the web (especially for those users with slow Internet connections). Google recommends that you use tools like PageSpeed Insights and Webpagetest.org to test the performance of your page.
And the old version:
Monitor your site’s performance and optimize load times. Google’s goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.
So it has changed from Google merely hoping webmasters would improve page speed to a recommendation.
They also pared down the tool recommendations. From the old version:
Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let’s Make The Web Faster.
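A typical quick win those tools flag, shown here as a sketch, is deferring non-critical JavaScript so it doesn’t block page rendering (the file name is hypothetical):

<script src="/js/analytics.js" defer></script>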
Design for Multiple Devices & Browsers
Google is now recommending that webmasters design for multiple devices, including desktops, tablets and smartphones.
Design your site for all device types and sizes, including desktops, tablets, and smartphones. Use the mobile friendly testing tool to test how well your pages work on mobile devices, and get feedback on what needs to be fixed.
They also have changed the wording of “Test your site to make sure that it appears correctly in different browsers” to “Ensure that your site appears correctly in different browsers”.
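The usual starting point for designing across device sizes is a responsive viewport tag plus media queries, sketched here with a hypothetical class name:

<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Collapse the sidebar on narrow screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>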
HTTPS vs HTTP
Yes, it is not surprising that this has been added. From the new guidelines:
If possible, secure your site’s connections with HTTPS. Encrypting interactions between the user and your website is a good practice for communication on the web.
All last year Google was pushing webmasters to make their sites secure, so this addition makes a lot of sense given their goal of securing the web.
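Alongside the certificate itself, a typical HTTPS migration also makes sure internal references and canonical tags point at the https:// versions, for example (the URL is a placeholder):

<link rel="canonical" href="https://www.example.com/page/">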
Visual Impairments
Another new addition is Google asking webmasters to consider visitors with visual impairments.
Ensure that your pages are useful for readers with visual impairments, for example, by testing usability with a screen-reader.
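Screen readers rely heavily on alt text and properly labeled form controls, so here is a quick sketch of both (the names are hypothetical):

<img src="traffic-chart.png" alt="Organic traffic grew 40% between January and June">

<label for="email">Email address</label>
<input type="email" id="email" name="email">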
Quality Guidelines
Surprisingly, this is where Google has changed almost nothing. The lone change is that they bolded the word avoid in “Avoid the following techniques:”
Overall Thoughts
While there are some interesting changes and additions here, the vast majority are common sense for most active SEOs. But it definitely goes into detail for those who are new to SEO or are just the webmaster of a single site trying to get it ranking better.
The addition of secure sites is one change webmasters should be paying attention to, as are the new page speed recommendations. The writing is on the wall about where Google’s emphasis lies.
H/T SERoundtable
Jennifer Slegg
Eric says
Good read, thank you for sharing!
Ken says
Typo..
“This surprisingly is where Google HAS NOT changes anything”
Very nice information. I had to chuckle at their video suggestions.
https://support.google.com/webmasters/answer/156442
“Return a 404 (Not found) HTTP status code for any landing page that contains a removed or expired video. In addition to the 404 response code, you can still return the HTML of the page to make this transparent to most users.”
They must mean a page that has a stand alone video, and yet they recommend…
“a standalone landing page for each video”
Never thought of that, but why? I prefer video as an addition to content on other pages.
Jennifer Slegg says
Probably for video search.
Niranjan Vibhandik says
Thanks Jennifer, for a detailed breakdown of the changes that were actually made. Much appreciated!!!