Over the past few years, Google has updated its search algorithm with a few major updates to crack down on spam, black hat SEO techniques, and other forms of manipulation.
These updates have been widely covered by search engine blogs, and now even small changes are analyzed in detail so SEO professionals can stay on top of the latest adjustments Google makes to its search algorithm.
With Google making more changes to its Penguin update here in October, now is a great time to get a refresh on Panda and other updates.
If you’ve only recently joined the world of SEO and have encountered discussion around these updates, these explanations will help you better understand the direction Google is headed so you can start optimizing your site for the future.
Panda – First Released: Feb. 2011 / Last Updated: Sep. 2014
Google’s Panda update was one of the first major algorithm changes to target low-quality content and sites with poor user experience. Nicknamed the “Farmer Update” and “Scraper Update” in the days after its release, because it targeted sites with large amounts of low-quality content and sites that copy other sites’ content, Panda was reported to impact 11.8% of search queries in the U.S.
Unfortunately, the update ended up helping some scrapers, whose content would often get indexed before the original. Site owners began complaining about scraper sites ranking higher than the original, and Google has since rolled out updates to Panda. A tool to report scraped content was released in February 2014 to give publishers a way to tell Google about scraper sites that are outranking original content.
What Panda has taught site owners and professional SEO companies is that focusing on value, quality, and overall user experience will help boost rankings more than churning out a ton of low-quality pages.
Penguin – First Released: April 2012 / Last Updated: Oct. 2014
Unlike Panda, which targeted on-site content schemes, Google’s Penguin update went after inbound link manipulation. That included paid links, link exchanges, and link campaigns with keyword-heavy anchor text.
The primary purpose was to punish sites that pay for links (or host paid links) that pass PageRank, which is a violation of Google’s Webmaster Guidelines. For years, this was a fairly common technique among SEO companies, which probably led Google to take notice and act with Penguin.
With the update in effect, site owners are being pushed toward more natural link-building techniques, such as content marketing. As a result, sites have been producing higher-quality content over the past few years, and often earn better links than paid link campaigns ever delivered.
Much like its tool for reporting scraped content, Google released a tool to report paid links. It allows users to give details and let the company know who is buying and selling links. If you’ve researched your competitors’ backlinks and noticed a number of suspicious links in other sites’ footers or labeled “Sponsored Links,” it’s a good idea to report them.
That type of manipulation could be costing your site, and other sites that follow the rules, valuable organic traffic.
Hummingbird – Released Aug./Sep. 2013
From a content perspective, Hummingbird is the most important of all the recent updates to Google’s search algorithm. Its main goal was to make Google more intelligent in the way it returns results for conversational or long-tail search queries.
Instead of focusing on keywords, Hummingbird delves deeper into the content of a page to understand its context. That allows Google to deliver results capable of answering a user’s question without that user having to search multiple times.
In turn, it also made keywords, which had previously been a primary focus of successful SEO content, much less important. Hummingbird delivered the content Google felt best represented the user’s end goal, rather than pages that merely mentioned a specific keyword throughout.