Google Penguin Update
The Penguin update is an update to Google's ranking algorithm, first rolled out in 2012. At the time, the rollout had massive effects on many websites. Google intended the update to fight webspam.
What was the goal of the Penguin update?
Penguin was designed to combat webspam, above all manipulative link building such as purchased or otherwise unnatural backlinks, and thereby to improve the quality of Google's search results.
When did the Penguin updates take place?
- The first Google Penguin update was rolled out on April 24, 2012. It is also known as Penguin 1.0. Two data refreshes took place late that year; in a data refresh, it is not the algorithm itself that is updated, but only the data it draws on.
- On May 22, 2013, the second Penguin update, Penguin 2.0, was released. Almost four months later, Google once again carried out a data refresh.
- On October 17, 2014, Google began rolling out the Penguin 3.0 update. Up to that point, all updates had been executed manually.
- With the release of the Penguin 4.0 update in the fall of 2016, Penguin became part of the so-called Google core algorithm. Since then, Penguin adjustments, like modifications to the Panda update, have taken place continuously and are no longer manual.
Consequences of the algorithm adjustment for webmasters
Websites that do not adhere to Google's Webmaster Guidelines are affected by the Penguin update, for example because they buy links or manipulate them in some other way. In its earlier versions, the Google Penguin update devalued the entire domain if webspam was identified. This site-wide penalty could ultimately lead to exclusion from the Google index. Nowadays the Penguin filter works on a URL basis; if the filter comes across webspam, this can lead to a ranking loss for individual subpages.
With the introduction of continuous updates there is, however, always the chance that penalties imposed on affected websites are lifted more quickly. When the updates were still carried out manually, it was significantly harder to be released from a penalty; webmasters often had to wait until the following Penguin update.
The Penguin update had far-reaching consequences for the SEO world. Whereas a large part of SEO work previously consisted of active link building, for example obtaining backlinks via guest posts or link exchanges, link building has since become harder. Efforts now focus on using high-quality content to encourage voluntary linking to one's own website. At the same time, other marketing channels such as social media are being used to increase the reach of the website and thereby gain backlinks and traffic.
What forms of webspam does the Penguin update fight?
- Unnatural links: These backlinks can, for example, be generated by purchasing or renting links from link farms.
- Artificial linking: If the links pointing to a website consist mainly of keyword anchor texts, this is usually a case of webspam.
- Quick link growth: If a website gains many incoming links in a short time, Google may flag it as webspam.
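The patterns above can be sketched as simple checks on a backlink export. The field names, sample data, and thresholds below are illustrative assumptions, not Google's actual criteria:

```python
from collections import Counter
from datetime import date

# Hypothetical backlink export: (anchor text, month first seen).
# In practice this data would come from a backlink tool's CSV export.
backlinks = [
    ("cheap shoes online", date(2012, 3, 1)),
    ("cheap shoes online", date(2012, 3, 1)),
    ("cheap shoes online", date(2012, 3, 1)),
    ("example.com", date(2011, 6, 1)),
    ("click here", date(2011, 9, 1)),
]

MONEY_KEYWORDS = {"cheap shoes online"}  # assumed target keywords

def keyword_anchor_share(links):
    """Share of backlinks whose anchor text is an exact-match money keyword."""
    anchors = Counter(anchor for anchor, _ in links)
    keyword_hits = sum(n for a, n in anchors.items() if a in MONEY_KEYWORDS)
    return keyword_hits / len(links)

def max_monthly_growth(links):
    """Largest number of new links gained in a single month."""
    per_month = Counter(month for _, month in links)
    return max(per_month.values())

print(f"keyword anchor share: {keyword_anchor_share(backlinks):.0%}")
print(f"max links per month:  {max_monthly_growth(backlinks)}")
```

A high share of exact-match keyword anchors or a sudden spike in new links per month would both be warning signs in the sense of the list above; real risk assessment is, of course, far more nuanced.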
What can webmasters and SEOs do if their site is affected by the Penguin update?
Those who find their website affected by Penguin should immediately analyze its link structure. Links from web catalogs or incoming links from link farms can lead to a devaluation. A significant indicator that the Penguin filter has been activated is a warning about “unnatural links” that Google sends to webmasters via the Google Search Console.
In this case, prior link building efforts should be scrutinized and analyzed so that the consequences of Google Penguin are minimized for the affected website.
One option is for webmasters to contact the operators of the linking websites and ask for the removal of the backlinks. Beyond this, Google offers an alternative with the Disavow Tool, which can be used to declare backlinks invalid.
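Disavowed links are submitted as a plain text file, one URL or domain per line, with `#` lines as comments; a `domain:` prefix disavows a whole domain. A minimal sketch (the domains and comment details are placeholders):

```
# Links from a link farm; removal request went unanswered
domain:spammy-linkfarm.example

# A single unwanted page
http://directory.example/listing?id=123
```

The file is then uploaded via the Disavow Tool in the Google Search Console.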
Panda and Penguin – is there still a difference?
Since the two central updates directed against webspam have become part of the core algorithm, webmasters and SEOs are now scarcely able to tell precisely which element has just been updated. It is no longer worthwhile for webmasters and SEOs to rely on techniques that still worked just a few years ago.
Google has used Penguin and Panda to reach a point where webmasters and SEOs focus on user interests and optimization no longer takes place solely for search engines. Website operators are now more than ever motivated to create high-quality content and technically flawless websites. Short-term SEO measures and black hat techniques are becoming less and less effective or have already become ineffective.