Bartosz Góralewicz recently posted a case study where two different sites seem to have been impacted by negative SEO. But the big difference with these negative SEO attacks is that they were carried out without the use of any backlinks.
Something was seriously wrong with two of my clients’ sites. They weren’t suffering from Panda or Penguin issues. Their sites were healthier than most of their competitors. Still, I spent 2 months fixing every issue I could think of and still no rankings had changed. Something else, something new was suppressing their search visibility.
I explored everything I could. Their backlinks were clean, their citations were accurate and their on-page elements were also properly optimized.
Normally when a site has issues ranking in Google, one of the first things SEOs look at is the quality of its backlinks. If a competitor has its target on you, the possibility of that competitor pointing loads of crappy backlinks at your site is real. But now imagine an attack that doesn’t leave the nice and obvious footprint of loads of low quality links. Góralewicz, along with other SEOs, feels that this backlink-free negative SEO attack could be a way to hurt competitors’ websites without leaving the traditional backlink footprint of the most common negative SEO attacks.
Góralewicz highlights two different scenarios. And unfortunately for SEOs, they can be very hard to detect, especially if you don’t realize the significant impact they can have, as they did with his clients.
The first is where a competitor’s bot was putting a severe load on the server. This resulted in bots and visitors alike being hit with extreme load times. Google has publicly said that site speed is a ranking factor, so you can see how the excessive server load created by a negative SEO bot could cause ranking issues. He also points out that the bot was running in the middle of the night, so the poor server performance went unnoticed.
The second negative SEO instance was brought about by another type of clickbot. This clickbot was apparently designed to click on the results of all the competitors except his client’s site. This meant Google felt there must be something wrong with that particular site, because no one was ever clicking on its search result, and pushed down its rankings. This is also referred to as a bounce attack, as sometimes the bots will click through to the competitor’s site, then immediately click back and go to another search result, signaling to Google that the user didn’t find the page useful for their search query.
Góralewicz replicated this by testing on his own site for the keyword “Penguin 3.0,” and shared screenshots showing how the use of the clickbot did hurt his rankings for that keyword.
He also reiterates how important it is to track rankings for your important keywords. Because site traffic fluctuates so much, it’s not always the best gauge when trying to pinpoint this type of attack, especially since a clickbot in the search results won’t be seen in your own logs. But ranking fluctuations, especially for competitive keywords that cannot be attributed to a known algorithmic reason, could possibly be this new type of negative SEO attack.
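Since traffic is too noisy to rely on, a simple automated check against stored daily rank positions can surface the kind of sudden, unexplained drop described above. A minimal sketch in Python, assuming you export daily positions per keyword from whatever rank-tracking tool you use (the function name, data shape, and threshold are all illustrative, not from the case study):

```python
def flag_rank_drops(history, threshold=5):
    """Flag keywords whose latest position fell more than `threshold`
    places below the trailing average. `history` maps each keyword to
    a list of daily rank positions (1 = top result; higher is worse)."""
    flagged = {}
    for keyword, ranks in history.items():
        if len(ranks) < 2:
            continue  # need a baseline to compare against
        baseline = sum(ranks[:-1]) / len(ranks[:-1])
        drop = ranks[-1] - baseline  # positive = moved down the SERP
        if drop > threshold:
            flagged[keyword] = round(drop, 1)
    return flagged

# Example: one keyword slides from ~3-4 to 14 with no known algorithm update
history = {
    "penguin 3.0": [3, 4, 4, 3, 14],
    "negative seo": [8, 7, 8, 8, 7],
}
print(flag_rank_drops(history))  # only the sudden drop is flagged
```

A drop flagged this way is not proof of an attack, but for competitive keywords with no known algorithmic explanation, it tells you where to start digging.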
What can be done to help prevent or reduce the damage done by these types of attacks? Definitely make sure your site is as lean and bloat-free as possible. Keep track of server load times, and be sure to check during all hours, not just traditional “business hours,” since the attack he highlighted only hit in the middle of the night. Make sure the site is linked to Google Webmaster Tools, which has plenty of data to alert you if Googlebot is having issues crawling, or to notify you of any penalty given. And of course, practice great SEO.
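The round-the-clock server monitoring advised above can be automated with nothing more than the standard library. A minimal sketch, assuming you schedule the sampler (e.g. via cron) to hit your own site every few minutes and keep the timings; the URL, threshold, and function names are placeholders, not part of Góralewicz’s write-up:

```python
import time
import urllib.request

def check_response_time(url, timeout=10):
    """Time a single GET of `url` in seconds; return None on error/timeout."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            pass
    except OSError:  # covers URLError and socket timeouts
        return None
    return time.monotonic() - start

def find_slow_checks(samples, slow_threshold=2.0):
    """Given (timestamp, seconds) samples, return those that were slow
    or failed outright (seconds is None)."""
    return [(ts, s) for ts, s in samples if s is None or s > slow_threshold]
```

Running the sampler around the clock instead of only during business hours is the point: a bot hammering the server at 3 a.m. shows up as a cluster of slow or failed checks you would otherwise never see.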
Be sure to read the full analysis, including screenshots and data, here.
Jennifer Slegg
Mariusz Kołacz says
There are plenty of negative SEO tactics, and those that focus on on-site factors are even more dangerous than unnatural links (off-site). So a webmaster must not only monitor all activity but also eliminate all the CMS “traps” (page optimization) which can be used by the competition to invoke a penalty on his website.
IrishWonder says
Only the most amateur attempts at negative SEO are actually link-based… Surely those are the easiest to diagnose, and hence all the talk about negative SEO has been somehow concentrated on that type of attack. However, there is so much more to negative SEO than just links – this is not new, tanking a site while not doing anything about links has always been possible. Gosh, now every newb will be shopping for a clickbot just like they’ve decided they all “know” how to do neg SEO after a slew of articles about link spam negative attacks *rolls eyes*
Jennifer Slegg says
Yep, for highly competitive markets, all kinds of nefarious negative SEO has been going on for a long time, but usually it isn’t discussed openly. It’s mostly the wannabe SEOs who have heard “pointing bad links at competitors will get them penalized” is the way to win.