What Caused Your Site Traffic to Drop? An SEO Forensics Manual

One of the most challenging parts of managing a website is identifying the root cause of a sudden, large change in traffic from a particular source. To pinpoint that cause, you'll need a scientific, forensic approach.

One thing to keep in mind when doing forensics on a site is to evaluate the evidence as dispassionately as a machine would, without bias toward any technique that may have been working for years.

I wanted to share a recent case study that should provide value to the day-to-day activities of a search marketing practitioner. In this case, the site mentioned had a massive drop in natural search traffic. Here’s the overview, from our Searchlight SEO platform:

[Screenshot: Searchlight keyword pipeline overview showing the drop in natural search traffic]

Possible Reasons the Site Traffic Decreased:

  1. A Manual Penalty – a member of the search engine's webspam team manually demotes the rankings of specific parts of your site (individual pages, specific tactics, or the entire site).
  2. An Algorithmic Penalty – your site has been impacted by changes to the search engines' algorithms (Panda, Penguin, Hummingbird).
  3. A Filtered Impact – a sudden change in marketing strategy or page-level infrastructure disrupts rankings (for example, a ranking page suddenly serves a 404 Page Not Found error).

Penalties typically take more time to recover from. In the case of a manual penalty, Google notifies you in Webmaster Tools (Search Traffic -> Manual Actions). To get your rankings back, you will need to identify the root cause and file a reconsideration request, although some manual penalties also expire on their own.

Algorithmic penalties often require a massive shift in strategy, and recovery can be difficult if there is a legacy of specific tactics that were abused.

A filter is often the easiest form of ranking dilution to recover from. Filters are also often mistakenly blamed on algorithm updates or penalties when we don't know where to look.

After you determine the pattern of ranking loss:

The first thing to do with any forensics project is to start with your analytics. What you are looking for is evidence of whether the impact is specific to a single referrer or is felt across sources.

If the impact is felt by only one entity (Google, for example), then we are likely looking at something along the lines of an algorithmic or manual penalty. Having the proper platform and reporting in place before site traffic changes will make a world of difference when building a plan of attack.
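
As a rough illustration, here is a minimal sketch of that referrer check in Python. It assumes a hypothetical sessions.csv export (columns: date, source, sessions) and a hypothetical change date; any analytics export with equivalent columns would work the same way.

```python
import pandas as pd

df = pd.read_csv("sessions.csv", parse_dates=["date"])
change_date = pd.Timestamp("2013-11-15")  # hypothetical date the drop began

# Average daily sessions per referrer, before and after the change date
before = df[df["date"] < change_date].groupby("source")["sessions"].mean()
after = df[df["date"] >= change_date].groupby("source")["sessions"].mean()

report = pd.DataFrame({"before": before, "after": after})
report["pct_change"] = (report["after"] - report["before"]) / report["before"] * 100

# A large negative pct_change isolated to one source (e.g. google / organic)
# points toward a penalty or algorithm update; a drop across all sources
# points toward an on-site problem.
print(report.sort_values("pct_change"))
```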

If your forensics shows an impact across multiple search engine referrers, then it is time to start digging into your own site. The first set of data you will want to look at is a delta comparison of your rankings before the change against how those same rankings are performing now.

Many times a marketer will look only at the keywords themselves, and that is not enough. Review the keywords, their rankings, and the associated landing pages together. One of the most commonly overlooked data points is the landing page that has lost its rankings.
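
As a sketch of what that delta comparison might look like in Python (the rankings_before.csv and rankings_after.csv exports, and their column names, are hypothetical stand-ins for any rank tracker's export):

```python
import pandas as pd

before = pd.read_csv("rankings_before.csv")  # columns: keyword, rank, landing_page
after = pd.read_csv("rankings_after.csv")

delta = before.merge(after, on="keyword", how="left", suffixes=("_before", "_after"))
delta["rank_change"] = delta["rank_after"] - delta["rank_before"]
delta["page_changed"] = delta["landing_page_before"] != delta["landing_page_after"]

# Keywords that dropped (or vanished entirely) and now resolve to a different
# landing page suggest a page-level problem rather than a sitewide penalty.
suspects = delta[(delta["rank_change"] > 0) | delta["rank_after"].isna()]
print(suspects[["keyword", "rank_before", "rank_after",
                "landing_page_before", "landing_page_after", "page_changed"]])
```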

[Screenshot: Google search rank report in Searchlight]

An SEO Forensics Case Study

Recently, a client reached out to us, fearing that their site had been impacted by a Panda update. They had lost a substantial number of rankings in the high-traffic positions 1-3 in Google. Searchlight showed the same data: many of their ranking keywords had dropped to page two or lower, and some of their existing rankings were off the map completely. This is where having the proper platform in place prior to the shift was crucial for identification and recovery.

If your platform reports only on your rankings, you are missing valuable information.

Doing a delta analysis in Searchlight on a few of the keyword drops revealed a common issue: the previous top-ranking URL had disappeared from the SERP, while the new top-ranking URL had not moved much. Other examples showed that Google was struggling to determine which page on the client's domain was most relevant, and as a result, the top-ranking URL shifted between many pages.

What appeared to be a slight loss in rankings was actually a complete drop in rankings for a specific landing page.

We then exported the entire list of lost rankings from Searchlight and quickly identified two specific directory structures that correlated with the drops. A quick audit of these pages revealed that every one of them was now 301 redirecting to the home page.

[Screenshot: rankings recovery shown in Conductor Searchlight]

If a 301 redirect is put in place between two pages that are not contextually relevant, then pre-existing rankings are almost certain to be lost. This is why putting a blanket redirect in place is never a good idea. We often see this on e-commerce platforms when items are discontinued, or in redesigns where some pages are no longer live.
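
A minimal sketch of that kind of audit in Python, assuming a hypothetical list of previously ranking URLs and a hypothetical home page, and using the requests library to follow each redirect chain:

```python
import requests

HOME = "https://www.example.com/"  # hypothetical home page
old_urls = [
    "https://www.example.com/widgets/blue-widget/",
    "https://www.example.com/widgets/red-widget/",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # Status code of the first hop (301, 302, etc.), or of the page itself
    first_status = resp.history[0].status_code if resp.history else resp.status_code
    if resp.history and resp.url.rstrip("/") == HOME.rstrip("/"):
        print(f"{url} -> {first_status} blanket redirect to home page (rankings at risk)")
    elif resp.status_code == 404:
        print(f"{url} -> 404 not found")
    else:
        print(f"{url} -> {first_status}, final URL {resp.url}")
```

Any URL whose chain ends at the home page is a candidate for being reinstated or redirected to a more relevant page instead.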

The fix here was simple, yet it required some work digging into the old URL structure to identify which pages were no longer live. These pages were either reinstated or redirected to a more relevant page in the new design. As a result of this work, the site's rankings started to climb back up.

Keep in mind that this process does not happen overnight; it all depends on how quickly the redirected pages are re-crawled. Typically, you should start seeing measurable updates within a couple of weeks. On a positive note, the extra time and care put into fixing these issues resulted in better rankings across the board, just in time for the holiday season!

About Brian McDowell

Brian McDowell is the Director of Search Intelligence at Conductor. He spends his days reverse engineering Google algorithms and is a frequent speaker at Pubcon, SMX, SearchExchange and Inbound Marketing Summit.

  • http://www.presseblog.at/informationcommunication/ franz_enzenhofer

    there are many things missing in this post. also the most important one: step 1) make sure you didn’t screw up. is your site still reachable? are all of your pages still reachable? really? also from outside your office (try VPNs, try “fetch as google”). check google webmaster tools errors (crawl errors, HTML errors (duplicate titles)). check your server error logs: any suspicious HTTP 404, 410, 500, 301, 302? how fast is your site currently? is it much slower than before? google your own title, is there a subdomain in the wild that should not exist, check your canonicals, sitemap.xml, did the dev team deploy a robots.txt that was meant for staging?
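
    a minimal python sketch of a few of these checks (the site URL is hypothetical, and this is only a starting point, not a complete audit):

    ```python
    import time
    import urllib.robotparser

    import requests

    site = "https://www.example.com/"  # hypothetical

    # basic reachability + speed check
    start = time.time()
    resp = requests.get(site, timeout=10)
    print(f"{site} -> HTTP {resp.status_code} in {time.time() - start:.2f}s")

    # did a staging robots.txt slip into production?
    rp = urllib.robotparser.RobotFileParser(site + "robots.txt")
    rp.read()
    if not rp.can_fetch("Googlebot", site):
        print("robots.txt blocks Googlebot, check whether a staging file was deployed")
    ```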

    in 98% of all cases, it’s neither a manual penalty, a demotion, an algo change, nor a filter. it’s you! in 1% it’s not you, but somebody else with access to your webserver. in the other 1% it’s google.

    • http://www.conductor.com Brian McDowell

      franz_enzenhofer I could not agree with you more. Forensics is an activity where there are never any “givens”; each case is addressed individually based on the signs. There are so many tactics and strategies that can impact your rankings that companies are moving toward a deeper awareness of how their code is released. Back-end and front-end architects are becoming more aware of how they impact inbound traffic. Your points are valid additional areas one should check when going through a forensic progression. Thanks for sharing!

  • billslawski

    Good example of why a site might lose traffic in a way that might not have been anticipated.

    Another couple of reasons for drops in traffic: (1) competitors might have made a number of positive changes to their sites that increased their rankings or made their snippets more persuasive, and (2) searcher behavior might have changed for a number of possible reasons, such as a change in the terms people were looking for (a television blitz might have influenced them to search for the same or similar products by a different name, for instance), or the existence of new, engaging, and attractive alternatives (did TiVo take a hit in search traffic after Chromecast was released, for instance?).

    • David Zimmerman

      We need to consider Occam’s Razor here: simple explanations are more likely. Like Bill mentioned, there could be other things behind traffic drops. I think we SEOs jump too quickly to think of a penalty. Far be it from anyone to out-SEO us, too.

      The first thing I check when I see this is to make sure analytics is working correctly. Then I check to make sure paid search was tracking accurately. Recently I encountered a new client with a drop around the end of June. “Penguin!” I thought. Turns out they had re-launched their website without redirects.

      I saw one client’s traffic drop consistently over months. My first thought: Panda, due to dupe content. Turns out that Google had been filling up their favorite SERPs with more and more paid ads, so many that even the #1 organic listing was below the fold.

      That being said, if we can rule out the simple things, then it’s time to consider penalties. That’s where your article is most helpful, Brian.

    • Brian McDowell

      billslawski I am a competitor stalker at heart. Site owners should never underestimate the power and strategies of their competitors, and we should all be monitoring the space on a regular basis, looking at market share by category. Your second point brings up a great topic: not only do users’ search queries evolve over time, but the result is a dynamic trend in what Google feels is appropriate to serve. Your writeup about how Google may rewrite your search terms is a great example of that. You can read it here: http://www.seobythesea.com/2013/12/rewrite-search-terms/

  • DanielPageASEO

    Hey Brian- excellent article! We included it in our Monthly Resource Roundup http://www.aseohosting.com/blog/2014/01/seo-content-marketing-and-social-media-the-best-of-december-2013/
    Cheers and keep up the awesome work!

    • cstebbins

      Thanks, guys!

  • Spook SEO

    Competitive analysis can also provide a lot of information that can help you better diagnose the exact reason for a website’s traffic drop. Google Webmaster Tools also provides information about a site’s SEO history and about multiple other SEO factors.

  • Nathan Brook

    Excellent post. You have explained a great point. The two biggest challenges generally involve lack of focus and lack of time.

  • Rebecca Boyle

    When we updated our site we used a new CMS (Drupal). Previously all our URLs had essentially been junk: the name of our website / the name of the CMS / lots of numbers, letters, and symbols. Now the URLs are the site name / page name, but our search traffic has dropped by half. Would this have been the cause, or should I be looking elsewhere?

    • Brian McDowell

      Rebecca, yes, this could definitely be the case, but a 50% drop in traffic is quite alarming and would require some deep forensics. The drop could be caused by orphaned pages no longer being served (now returning a 404 Page Not Found error) or, equally as bad, a large number of pages blanket redirecting to the home page. I would check the server headers for all of your old pages and make sure they are serving 301 permanent redirects. One last item to check for sanity purposes is your new robots.txt file. I am assuming that you are using the Drupal default with some modifications; however, I have seen examples of companies copying over a robots.txt file from their staging environment that caused some adverse performance issues. I would also recommend starting with a page-level audit in analytics to determine whether the issue can be isolated to specific pages or directories. Finally, do a sanity check for any manual actions against your site (in Google Webmaster Tools) and see whether the site relaunch coincided with any measurable algorithm changes.

      • Rebecca Boyle

        Thanks Brian, I’ve been checking the 404 errors and will be providing a comprehensive list of those still appearing to our web developers. I’ve also run some analytics to ensure that the Google code has been put in correctly and all the pages are being tracked. I personally think we’re getting the same traffic as before but for some reason Google isn’t seeing some of it.

  • Basant Patidar

    Hey Mr. McDowell, your post is excellent, but I can’t understand why my website is getting low traffic. Can you help me sort out my problem?

    • Charity Stebbins

      Hey Basant, sorry to hear about the trouble with your site! It could be any number of things, like Brian listed. Unfortunately we can’t help you diagnose specific problems (we would need access to data for that) but I’d recommend talking to an SEO agency or consultant about that. We’ve got some amazing agency partners that we’re always happy to recommend!
