Gary Illyes responded to an interesting question on Twitter asking whether Google puts more focus on keywords with higher search volume than on those with lower search volume.
@Jonny_J_ No.
— Gary Illyes (@methode) July 30, 2015
In some respects, I could see why some would think this is a possibility. After all, some keywords are definitely more “important” than others, and many high-volume commercial keywords tend to get spammed far more. On the other hand, weighting by search volume would not scale very well for practical reasons, especially when you consider all the different languages Google serves search results in.
That said, there has never been any evidence in the SEO community that Google treats some keywords differently from others simply because of their search volume.
The only thing we have seen is Google going after specific spammy search verticals, as with the payday loan algo and the Pirate algo. But again, that targets a vertical, not higher- versus lower-volume keywords.
When Illyes was pressed a bit further, after someone said he was suspicious of such a short answer, Illyes responded in slightly more detail, in typical Google fashion.
@Jonny_J_ no we don't?
— Gary Illyes (@methode) July 30, 2015
Jennifer Slegg
MamboMan says
Gary seems to be a bit of a trickster.
I don’t believe he answered truthfully. At least not in the context Jonathan intended.
Scott Van Achte says
I had never thought about this as a ranking factor before, and while I am confident that they do not use search volume, I can see a good argument to include it.
Less-searched terms (not including all the long-tail stuff) are often very niche or geographically specific. “SmallTown Hotels,” for instance, would have far fewer searches than “New York Hotels.” For a hotel in New York to rank in the top 10 would require bucketloads of inbound links, etc., but for a hotel in the middle of nowhere, links would likely be irrelevant, especially if there are only a couple of hotels.
In cases like these it would make sense to have less of a focus on links and more of a focus on relevance to the location. Perhaps bumping sites like Expedia in favor of the actual hotel. Or let’s take an example where a town has two hotels: a major chain with 1,000 locations, and a mom-and-pop hotel. Maybe the mom-and-pop hotel is nicer, has more history, and its website is highly focused on the geographic area. In today’s world, in most cases, the chain will likely outrank the mom-and-pop hotel. If you take links out of the equation, the mom-and-pop hotel would have a pretty solid chance.
So while I don’t see search frequency playing a role in rankings any time soon (if ever), I can see reasons why it may actually be a smart move, or at least favorable, for the small businesses going after those less-searched terms.
(I wrote that before my coffee – hopefully it makes sense 😀 )
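To make the re-weighting idea from Scott’s comment concrete, here is a minimal toy sketch in Python. Everything in it is hypothetical: the signal names, the weights, and the volume threshold are invented for illustration, and none of it reflects how Google actually ranks pages.

```python
# Purely hypothetical toy model of the re-weighting idea discussed above.
# Nothing here reflects Google's actual signals or weights.

def blended_score(link_score: float, relevance_score: float,
                  monthly_search_volume: int,
                  volume_threshold: int = 1_000) -> float:
    """Blend a link signal and a relevance signal for one page.

    For queries below `volume_threshold` monthly searches, the link
    signal's weight shrinks and the relevance signal's weight grows,
    mimicking the "take links out of the equation" idea for niche,
    geographically specific queries.
    """
    # Link weight ramps from 0.2 (very niche query) up to 0.8 (high volume).
    ramp = min(monthly_search_volume / volume_threshold, 1.0)
    link_weight = 0.2 + 0.6 * ramp
    relevance_weight = 1.0 - link_weight
    return link_weight * link_score + relevance_weight * relevance_score


if __name__ == "__main__":
    # "New York Hotels" (high volume): the link signal dominates.
    print(blended_score(link_score=0.9, relevance_score=0.5,
                        monthly_search_volume=50_000))  # ~0.82
    # "SmallTown Hotels" (low volume): relevance dominates, so the
    # highly focused mom-and-pop site can outscore the big chain.
    print(blended_score(link_score=0.3, relevance_score=0.9,
                        monthly_search_volume=200))     # ~0.71
```

In this sketch, the low-volume query lets the link-poor but highly relevant site win, which is exactly the outcome the comment describes, though again, there is no evidence Google does anything of the sort.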