On the 3rd of February, I was having a lunch meeting with a client to discuss an SEO audit implementation when we picked up a spike in traffic spanning two days in early February. While I’m not going to go into whether the change was negative or positive, we couldn’t make heads or tails of it.

After checking MozCast and Algoroo, I came to the conclusion that there was definitely some algorithmic activity on the 5th of February, 2015 and on the 17th of January, 2015, which Algoroo labels as Epoch 8.

With no further information forthcoming and refreshes for Panda and Penguin due at any time, I put on my analytical hat and checked my client’s site for the key elements that Google’s algorithms are known to target.

The process I went through is something most SEOs will default to, so I thought I’d break it down into steps you can carry out yourself.

1. IDENTIFY KEYWORD DROPS

After you’ve checked your traffic, the next thing to review is rankings: isolate pages that have dropped at least 5 positions (any page that has fallen out of the Top 50 entirely should be on your hotlist!). Gather at least 5 such pages and note the indexed landing page and any ranking history you have for each keyword.

Competitor tracking really comes in handy here as you can use your competitors as a benchmark to see if you’re the only one who’s been affected, or if it’s something others have experienced as well. Some good keyword trackers include Authority Labs, Ginzametrics and SearchMetrics.
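
If you prefer to script this step, here’s a minimal Python sketch of the comparison. It assumes two hypothetical CSV exports from your rank tracker (rankings_before.csv and rankings_after.csv) with keyword, position and landing_page columns, so adjust the names to whatever your tracker actually produces.

```python
# Hedged sketch: compare two rank-tracker CSV exports and flag keywords
# that lost 5+ positions. File names and column names are assumptions.
import csv

def load_rankings(path):
    """Read a rank-tracker CSV export into {keyword: (position, landing_page)}."""
    rankings = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rankings[row["keyword"]] = (int(row["position"]), row["landing_page"])
    return rankings

before = load_rankings("rankings_before.csv")  # export from before the fluctuation
after = load_rankings("rankings_after.csv")    # export from after the fluctuation

for keyword, (old_pos, page) in sorted(before.items()):
    new_pos, _ = after.get(keyword, (101, page))  # missing keyword = treat as out of the top 100
    drop = new_pos - old_pos
    if drop >= 5:
        hot = old_pos <= 50 < new_pos              # fell out of the Top 50 entirely
        label = "HOTLIST" if hot else "check  "
        print(f"{label} '{keyword}' fell {old_pos} -> {new_pos}  ({page})")
```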

2. REVIEW WEBMASTER TOOLS

Webmaster Tools is Google’s primary method for letting webmasters know about issues with their site. Check the following elements to identify whether there’s a problem with yours:

  1. Manual Penalty – Google’s Web Spam team often conducts manual reviews of websites, and if yours doesn’t meet their Quality Guidelines, you could be hit with a penalty. To check this, go to Search Traffic > Manual Actions in Webmaster Tools.
  2. HTML Improvements – If Google notices any HTML errors or areas for improvement, it will show them at Search Appearance > HTML Improvements.
  3. Crawl Errors – If Google can’t see your website or it’s being obstructed, you’ll find the issues listed in Crawl > Crawl Errors.
  4. Robots.txt Tester – This is how you can see if there are any issues with Google accessing your site. Simply paste your URL into the field and hit TEST to see if Google can reach your page (a quick way to script this check yourself is sketched after this list). This is found at Crawl > robots.txt Tester.
  5. Sitemaps – It’s always a good idea to check your sitemap to see that everything is in there and there aren’t any errors. This is found at Crawl > Sitemaps.
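
As a side note on point 4, you don’t have to wait for Webmaster Tools to spot an accessibility problem. The short Python sketch below uses the standard library’s robots.txt parser to check, for a couple of hypothetical URLs, whether Googlebot is allowed to fetch them; it’s only an approximation of Google’s own tester, which remains the authoritative check.

```python
# Rough local approximation of the robots.txt Tester: does the site's
# robots.txt allow Googlebot to fetch a given URL? Site and paths below
# are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"          # your domain
PAGES = ["/", "/services/seo-audit/"]     # pages that dropped in rankings

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in PAGES:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {path}")
```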

Webmaster Tools Analysis:

Despite having just completed an SEO Audit, I checked through the above points again, but found little to suggest anything in Webmaster Tools was causing the fluctuations in rankings. Webmaster Tools ruled out!

3. CHECKING FOR PANDA (THE LOW QUALITY CONTENT KILLER)

Panda has highlighted how critical quality content is to search results. The last confirmed release, Panda 4.1, was the 27th iteration of the update and rolled out on September 25, 2014. Its predecessor, Panda 4.0, was released four months prior, in May 2014. So, four months on from Panda 4.1, you could expect Panda 4.2 (or even 5.0 if it’s significant enough) to hit the algorithm at any time. But did it happen?

Panda HATES duplicate and thin content, as well as low-quality sites with an excessive ad-to-content ratio. To detect a Panda trail, I’ll use Siteliner to run a duplicate content test on the pages with keyword drops, then load each page to make sure it has enough body content.

If you see high amounts of internal duplication, or a word count of less than 150 words on a page that’s had significant ranking drops, Panda may have turned it into bamboo and started chomping away on it.
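
The duplicate side of that check is easiest in Siteliner, but the thin-content side is simple to script. Here’s a rough Python sketch (it needs the third-party requests and beautifulsoup4 packages, and the URLs are hypothetical placeholders) that counts the body words on each affected page and flags anything under the 150-word mark mentioned above.

```python
# Hedged sketch of a thin-content check: fetch each affected URL and count
# the words left after stripping scripts, styles and navigation chrome.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/page-with-drop-1/",  # hypothetical affected pages
    "https://www.example.com/page-with-drop-2/",
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()  # drop boilerplate elements before counting
    word_count = len(soup.get_text(separator=" ").split())
    status = "THIN" if word_count < 150 else "ok"
    print(f"{status:>4}  {word_count:>5} words  {url}")
```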

Panda Update Analysis:

After checking all affected pages for duplicate and thin content issues, I couldn’t see anything that hinted at the presence of a Panda penalty. So Panda is ruled out!


4. CHECKING FOR PENGUIN (THE SPAMMY BACKLINK PUNISHER)

Penguin was Google’s answer to link-based manipulation of its algorithm. The latest confirmed update was Penguin 3.0, the 6th version of the algorithm, which was described as a ‘refresh’ and launched on October 17, 2014. As many speculated, and John Mueller confirmed, this update will make Penguin act more like Panda, with more regular updates (instead of the 2 major updates a year we received in 2012 and 2013). Google has also said it won’t be announcing future roll-outs, which prompts high levels of suspicion whenever we see fluctuations in search results.

The main clues to look for when detecting a Penguin slap are over-optimised anchor text profiles and unnatural link building. To detect this, I’ll use Ahrefs to check recent link activity; a sudden burst of new, low-quality referring domains or an anchor text profile stuffed with exact-match keywords signals red flags.

The problem with Penguin is that you can be hit by a Negative SEO campaign and never know it’s coming. It has become so easy to hit someone with an SEO attack that, unless you’re actively checking your link profile, there’s little you can do to prevent it. Backlink Monitor is a cool tool that will send you email alerts when you’ve acquired new backlinks and will even flag high-risk domains. If you’ve noticed poor quality links, it’s time to do a backlink cleanup.
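
If you want a quick read on the anchor-text side before diving into Ahrefs’ own reports, a small script can tally the distribution from any backlink export. The sketch below assumes a CSV with an “Anchor” column (rename it to match your export) and flags any non-brand anchor that accounts for more than 10% of the profile, which is the classic over-optimisation pattern.

```python
# Back-of-the-envelope anchor-text profile from a backlink export CSV.
# The file name, "Anchor" column and brand terms are assumptions.
import csv
from collections import Counter

BRAND_ANCHORS = {"example brand", "example.com", ""}  # swap in your own brand terms

anchors = Counter()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["Anchor"].strip().lower()] += 1

total = sum(anchors.values())
print(f"{total} backlinks, {len(anchors)} distinct anchors\n")
for anchor, count in anchors.most_common(15):
    share = 100 * count / total
    flag = "  <-- possible over-optimisation" if share > 10 and anchor not in BRAND_ANCHORS else ""
    print(f"{share:5.1f}%  {count:>5}  {anchor!r}{flag}")
```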

Penguin Update Analysis:

The next Penguin refresh is something everyone in the SEO community is expecting. In this case, however, we’d already completed a HUGE backlink cleanup and disavow at the end of November, so the jump I’d expect from a Penguin refresh would be a significant one.

Again, I checked all landing pages with big drops for malicious link building, but didn’t come up with anything suggesting a Penguin refresh could’ve been the cause here. Penguin is ruled out!


5. NEGATIVE SEO ATTACK CHECK

As previously mentioned, Negative SEO can be executed very easily these days, but we already covered the detection of the traditional spam link attacks in the Penguin checks.

As link attacks had been ruled out, I started thinking about an article I read by Bartosz Góralewicz that discusses Linkless Negative SEO, including such hack-tactics as server response overloading and clickbot attacks.

To summarise, Bartosz saw a case where a client was experiencing heavy server congestion and suspected that several dynamic IP addresses were crawling the site to slow down response times and cause issues. He also suspected clickbots were sending Google artificially negative engagement signals by clicking on all search results except his client’s page.
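
Spotting that kind of crawl flooding doesn’t need specialist tooling: a quick pass over your access logs will show whether a handful of IPs is responsible for an outsized share of requests. The Python sketch below assumes a combined-format Nginx/Apache log at a hypothetical path, so point it at your own server’s log.

```python
# Rough sketch: tally requests per client IP from a combined-format access
# log to surface aggressive crawlers. Log path and format are assumptions.
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        ip = line.split(" ", 1)[0]  # first field of the combined log format
        hits[ip] += 1

total = sum(hits.values())
print(f"{total} requests from {len(hits)} IPs")
for ip, count in hits.most_common(10):
    # Any single IP with an outsized share deserves a reverse-DNS check --
    # genuine Googlebot verifies; a flood bot won't.
    print(f"{count:>8}  {100 * count / total:5.1f}%  {ip}")
```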

He decided to program a clickbot to click on all results except his own, to test whether this type of attack was possible. In his test, the clickbot did cause his positions to drop below the fold of the first page! Definitely a possibility!


Negative SEO Analysis:

Seeing no suggestive server congestion, and with keyword positions only affected by a few places (not completely off the grid, as you’d expect from a clickbot), I had to rule this out as well.


6. GOOGLE IS BROKEN?!?!

Just when I was on the verge of giving up, throwing my hands in the air and declaring the Google algorithm broken, I decided to do some research to see if other SEOs had made any discoveries about the nature of the algorithm changes.

A recent article on Search Engine Land cited the fluctuations I’d seen in MozCast and Algoroo and reached out to Google for comment. Below is the response:


We asked Google about this and the company told us that, “as usual,” it is “continuing to make tweaks” to the search results but at this time, it has no “specific update” to announce. Google also confirmed that whatever the community and the tools spotted, it was unrelated to the Google Panda update or the Google Penguin update.



CONCLUSION

So all this just turned out to be tweaks to the general Google algorithm, nothing Penguin or Panda related, and, while some results remain slightly altered, traffic has returned to roughly the level we’d been experiencing before.

So I guess that gives you some insight into the detective work SEOs carry out when we see spikes or drops in rankings, search traffic or conversions.

Don’t have time, skills or tools to do this type of analysis yourself? Talk to Search Factory about our SEO services today. We’ll monitor your website and take positive steps to ensure you don’t get hit by algorithmic penalties.

BONUS MATERIAL!

After going through all that reading, you deserve a treat! Check out our Penguin and Panda video below! Remember to share it if you liked it!