How Penguin 2.0 Levels the Playing Field [Data]

If you’ve visited a search industry website in the last few weeks, odds are you’ve encountered a reference to the Google algorithm update dubbed ‘Penguin 2.0’. Among the other changes the update was to address, Matt Cutts, in a video released on his blog just prior to the rollout, described adjustments to host crowding, or clustering, in the SERPs. Host crowding occurs when multiple listings from the same domain appear throughout the natural search results for a particular keyword.

Google views this as a poor user experience and prefers to offer the user greater diversity in the search listings.

Adjusting Strategies Based on Large Data Sets

Here at Conductor, we are fortunate to have access to a rich database of natural search data in our Enterprise SEO Platform, Searchlight. As Penguin 2.0 has been rolling out, we’ve spent considerable time digging through the data to uncover its impact.

For One Large Branded Search Query, We See A 75% Reduction in Visibility

In working with the data in Searchlight, we’ve discovered that, true to his word, Matt Cutts has indeed included adjustments in Penguin 2.0 that impact clustering.

Consider a ‘before’ and ‘after’ view of the top 100 rankings for bedbathandbeyond.com on the branded term ‘bed bath and beyond wedding registry’. According to the Google AdWords Keyword Tool, this branded term has an estimated 9,900 global monthly searches. In the ‘before’ view, the domain bedbathandbeyond.com occupied 82 of the top 100 rankings. In the ‘after’ view, while the domain continues to rank for the first 7 results on page one, it has lost 75 additional listings throughout the first 10 pages, a whopping 75% reduction in coverage in the top 100 listings. Clearly, with this algorithm update, Google is making a big push to reduce the incidence of host clustering. We are consistently seeing a limit of 7 natural listings per root domain on branded terms.
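To make the measurement concrete, here is a minimal Python sketch of how one might count host clustering in a top-100 SERP snapshot. The sample URLs and the simple www-stripping domain logic are illustrative assumptions, not Searchlight’s internal implementation.

```python
from collections import Counter
from urllib.parse import urlparse

def root_domain(url):
    """Reduce a URL to a bare root domain, e.g. 'bedbathandbeyond.com'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def domain_count(serp_urls, domain):
    """Count how many of the ranked URLs belong to one root domain."""
    counts = Counter(root_domain(u) for u in serp_urls)
    return counts[domain]

# Hypothetical top-100 snapshots mirroring the numbers above:
before = ["http://www.bedbathandbeyond.com/registry/"] * 82 + ["http://other-site.com/"] * 18
after = ["http://www.bedbathandbeyond.com/registry/"] * 7 + ["http://other-site.com/"] * 93

b = domain_count(before, "bedbathandbeyond.com")
a = domain_count(after, "bedbathandbeyond.com")
print(f"before: {b}/100 listings, after: {a}/100, freed up: {b - a}")  # 82 -> 7
```

Run against daily rank exports, a count like this makes the clustering limit visible the moment it takes effect.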

The Impact

This reduction of host clustering for big brands opens the door for affiliates and partners to compete on branded terms. It also underscores the importance of brand management, especially through the use of third-party sites like Facebook and Twitter. In the remainder of this post, we will examine the impact Penguin 2.0 has on the SERPs, analytics, and marketing strategies, and provide insight on how to manage your content marketing in a post-Penguin 2.0 landscape.

Impact in the SERPs

Digging deeper into the data in Searchlight, we can report a widespread impact in our KPI reports for the number of unique ranking URLs. Some sites have lost over 75% of their secondary natural listings (even for branded terms). In most cases the top-ranking landing page has seen little to no movement. The removal of those secondary results opens the door for additional sites to jump in and compete for better rankings. This gives partner sites and affiliates a more level playing field in the branded-term landscape and lowers the barrier to entry. Think of it this way: if 75 rankings of the top 100 are removed, those spots are now filled by other domains. Big brands need to follow this very closely.

Impact on Analytics

The most important fact here is that overall visibility has gone down. Keep in mind that the landing pages impacted here are typically not the top-ranking URLs for the domain, and that most natural search clicks happen within the first 3 positions. The total number of natural search visits from Google will likely go down, but not by a massive amount: among the companies we have talked to, the impact on visits was in the single-digit percentages. There can also be a small spike in conversion rates. Higher-ranking pages tend to convert better than deeper secondary results, so shedding visits that do not convert will inflate conversion rates. To illustrate with hypothetical numbers: a site converting 500 of 10,000 monthly visits (5.0%) that loses 400 non-converting visits now converts 500 of 9,600, or 5.2%. The impact here is minor but still measurable. Keep these points in mind when discussing your SEO performance at your next business review or reporting period.

Impact on Marketing Strategies

Content marketing is a strategy in SEO that has been in play for years. Seasoned SEOs like to look at the overlapping keywords that rank for a URL and determine whether the company should build content around a group of similar keywords that are underperforming. For example, let’s say a page built for “widgets” ranks high on page 1 for “widget” terms, but the same page ranks on page 2 for all “widget installation” terms. Search marketing has taught us that this is a case where the company should build a page for “widget installation”.
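Here is a hedged sketch of that keyword-grouping analysis. The rank data, the “installation” modifier test, and the theme names are hypothetical; real inputs would come from a keyword export out of your SEO platform.

```python
from collections import defaultdict
from statistics import mean

rankings = [
    # (keyword, ranking_url, position) -- sample data for illustration
    ("blue widgets", "/widgets", 3),
    ("cheap widgets", "/widgets", 4),
    ("widget installation guide", "/widgets", 14),
    ("widget installation cost", "/widgets", 17),
]

themes = defaultdict(list)
for keyword, url, position in rankings:
    # Group keywords sharing a modifier ("installation") that all rank
    # with the same generic page -- a sign a dedicated page is needed.
    theme = "widget installation" if "installation" in keyword else "widgets"
    themes[theme].append(position)

for theme, positions in themes.items():
    avg = mean(positions)
    page = (avg - 1) // 10 + 1  # 10 results per SERP page
    print(f"{theme}: avg position {avg:.1f} (page {page:.0f})")
```

A theme whose keywords cluster on page 2 while its sibling theme sits on page 1 is exactly the “build a dedicated page” signal described above.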

Finding content opportunities in underperforming keywords has helped marketers build better customer-facing content. Using Google as a measure of relevancy helps marketers build content that converts better and is more closely suited to what the user is looking for. With the reduction of host clustering, it will now be harder to measure the immediate impact of that new content.

If we point our paid search efforts at this newly created “widget installation” page for the proper terms, we will see an increase in Quality Score and, as a result, a reduced cost per click. This typically leads to increased traffic and better conversions. From an SEO perspective, we will track this term in the SERPs and set the new page as the preferred URL within our SEO platform for a subset of related terms. Over time we will see the “widget installation” preferred landing page start moving up in the Google SERP and eventually outrank the page we did the initial analysis on. This process is monitored and tracked, and our efforts are modified appropriately as the new content creeps up.
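A minimal sketch of the check behind that tracking: has the preferred landing page become the domain’s top ranker for a term? The function, the field layout, and the example SERP are assumptions for illustration, not any platform’s real API.

```python
def preferred_is_top_ranker(serp, domain, preferred_path):
    """serp: list of (position, url) tuples sorted by position.
    Returns True once the preferred page is the domain's best listing."""
    for position, url in serp:
        if domain in url:
            # The first listing we reach for our domain is its top ranker.
            return preferred_path in url
    return False  # domain is not ranking at all

serp = [
    (1, "http://example.com/"),                      # home page still on top
    (5, "http://example.com/widget-installation/"),  # new content climbing
    (9, "http://competitor.com/widgets/"),
]
print(preferred_is_top_ranker(serp, "example.com", "/widget-installation/"))  # False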

In the post-Penguin 2.0 world, it will be harder to identify the impact of our efforts in natural search, because we may never see the new preferred landing pages climb in the SERPs until the day they take the top ranker position for our domain. This makes life difficult for companies that build content for a better user experience based on performance data. These are cases where 301 redirects or canonicals are not an option: we are taking a low-level theme on the page and building a strategy for our users to find more detailed information. How Google values this new page in natural search is now harder to gauge.

Moving Forward

This new handling of host crowding in the Penguin 2.0 world should not change how search marketers execute content management strategies.  It will change how these efforts are tracked and monitored.  Marketers need the ability to track and monitor preferred URLs in their platform in order to identify when their new content has matured.

An easy way to manage this is to create large categories of keywords whose preferred landing page is not yet their top-ranking URL. The categories should be grouped based on the keywords you want associated with the page, as well as a master category. Monitor these categories in a custom workspace so the pages are not forgotten. Once the preferred landing page takes the top ranking position, remove the keyword from the newly created category. This keeps the categories free of noise, and by watching the number of keywords remaining in a category over time, you will get a deeper understanding of how your efforts are progressing.

For example, in Searchlight we have 100 keywords in our “Widget Installation – preferred” category, each with a preferred URL set to content we published this morning. At this time the top ranker is our home page. Two weeks later, the preferred URL is the top ranker for 20 of our keywords, so those keywords are removed from the category. When we look at a ranking pipeline chart with unranked URLs displayed, we will see the total number of URLs in the category going down. This shows us the velocity at which our content is gaining relevant authority in Google and the impact of our execution.
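As a minimal sketch of that category-maintenance workflow: keywords stay in the category until the preferred URL becomes their top ranker, then drop out, and the shrinking count is what you would chart over time. All names and data here are hypothetical; a real version would pull ranks from your platform rather than a hard-coded dict.

```python
PREFERRED_URL = "http://example.com/widget-installation/"

category = {
    # keyword -> current top-ranking URL for our domain (hypothetical data)
    "widget installation": "http://example.com/",
    "widget installation cost": "http://example.com/",
    "how to install widgets": PREFERRED_URL,  # preferred page now on top
}

def prune_matured(category, preferred_url):
    """Remove keywords whose preferred page is already the top ranker."""
    matured = [kw for kw, top in category.items() if top == preferred_url]
    for kw in matured:
        del category[kw]
    return matured

matured = prune_matured(category, PREFERRED_URL)
print(f"matured: {matured}")                        # ['how to install widgets']
print(f"still maturing: {len(category)} keywords")  # 2 -- chart this count over time
```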

Conclusion: Keep Your Eyes Peeled and Adjust with the Changing Tide

We know that Google will continue to change. As a result, we need to change our strategies and the way we use platforms like Conductor Searchlight to better understand our landscape. Those who adapt to change quickly, rather than settling for being fast followers, will find themselves best positioned to succeed as the search landscape continues to evolve. Please leave a comment and let us know if you are seeing anything different.

About Brian McDowell

Brian McDowell is the Director of Search Intelligence at Conductor. He spends his days reverse engineering Google algorithms and is a frequent speaker at Pubcon, SMX, SearchExchange and Inbound Marketing Summit.
