How Google Algorithm Updates Can Affect Your Rankings

Each update to Google's algorithm (the way in which it decides how highly a website ranks for a search term) is designed to tackle a specific issue that Google sees as affecting its search results in an unwanted way. For example, if Google finds that a certain set of websites ranks more highly than websites it considers to be of greater quality, it may examine what those websites have in common and release an update that places more or less emphasis on a particular variable. "The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs," said Google's Matt Cutts. Here's a rundown of Google algorithm updates and how they impact search results:

February 2011 - January 2016: "Panda" Updates

First released in February 2011, the Panda update was designed to target websites with low-quality content, such as dynamically created pages where the only difference is a few keywords, or articles containing very low-quality text. Google wanted to improve the quality of the search results it returned, and in designing the update it used human "Quality Raters" to answer questions such as "Would I trust this site with my credit card?" when distinguishing between low- and high-quality websites.

Whilst Google's Matt Cutts said that sites affected by this update "are engaging in webspam tactics to manipulate search engine rankings", there is sufficient evidence that some legitimate websites, such as property portals that cannot avoid featuring the same property listings as other websites, were heavily penalized. And, according to multiple sources, Google's Panda 4.0 update in May 2014 resulted in some major sites, such as eBay, seeing a significant drop in rankings, presumably because the automated nature of their web pages failed to meet Google's quality requirements. To help owners of affected websites, Google released a list of ways in which website owners can ensure that their website is considered to be of high quality.

In September 2014 Google began rolling out another Panda update - one which Google described as featuring "a few more signals to help Panda identify low-quality content more precisely". According to SearchMetrics, some websites suffered a drop of up to 79% in their search visibility in the United States as a result of this update.

In late July 2015, Google announced the start of a Panda update that would be rolled out over several months and would affect up to 3% of English-language websites. According to Search Engine Land, this update would slowly remove penalties imposed by previous Panda updates, provided that the owners of the affected websites had stopped attempting to manipulate their search engine rankings.

In January 2016, Google integrated Panda updates into its "core" algorithm, signaling that changes to the way it prevents poor-quality websites from ranking well would now happen on an incremental, ongoing basis, rather than as significant changes to the rankings of many websites occurring simultaneously on the same date.

In October 2016, Gary Illyes of Google told Search Engine Land that the Panda algorithm analyzes all of the pages of a website and may demote its search engine rankings as a whole, based on the site's overall quality. He said that if Google figured out that a site was successfully "gaming our systems", it would adjust the site's rank and "push the site back just to make sure that it's not working anymore".

April 2012 Onwards: "Penguin" Updates

The Penguin update began rolling out in April 2012, and was designed to ensure that websites with artificial incoming links were penalized. Websites that were paying other websites to link to them, or that were involved in link schemes where vast numbers of reciprocal links were used, were heavily punished by this update. According to Google, approximately 3% of search results were affected.

In May 2012, Penguin 1.1 was released. While the original update concentrated on the links themselves, this update concentrated on the anchor text used in links. Even if both the linking website and the linked-to website were of high quality, and the links were not deemed artificial, the linked-to website could suffer a drop in search engine rankings if too many of its incoming links used the same anchor text. A good example of affected websites is those operated by web designers, who tend to have many incoming links that contain the same anchor text; this update served to devalue such links.
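
Google has never published a threshold for what counts as "too many" identical anchors, but the general idea can be illustrated with a short script. The following Python sketch measures how concentrated the anchor text of a set of incoming links is; the backlink data and the 50% warning level are entirely hypothetical.

    from collections import Counter

    # Hypothetical backlink export: (linking page, anchor text) pairs.
    # In practice this data would come from a backlink analysis tool.
    backlinks = [
        ("https://blog.example/post-1", "acme web design"),
        ("https://news.example/story", "acme web design"),
        ("https://forum.example/thread", "acme web design"),
        ("https://directory.example/listing", "Acme Ltd"),
    ]

    counts = Counter(anchor.lower() for _, anchor in backlinks)
    total = len(backlinks)

    for anchor, count in counts.most_common():
        share = count / total
        # The 50% level is an arbitrary illustration, not a documented limit.
        flag = "  <-- unusually concentrated" if share > 0.5 else ""
        print(f'"{anchor}": {count}/{total} ({share:.0%}){flag}')

A profile in which one phrase dominates, as "acme web design" does here, is exactly the pattern this update was designed to devalue.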

Around the middle of October 2014, Google began releasing another Penguin update, which rolled out over several weeks. Google's Pierre Far confirmed on his Google+ page that this update would give websites that had been penalized by earlier updates a chance to recover their rankings after cleaning up their artificial links (a process called "disavowing"), adding that it would also penalize websites with artificial links that had previously gone undiscovered. According to Google, this update would affect less than 1% of English-language search queries.

In December 2014, Google revealed that websites which have been punished for having low-quality incoming links can recover from Penguin-related penalties without "disavowing" all of their poor links. "We look at it on an aggregated level across everything that we have from your website. And if we see that things are picking up and things are going in the right direction, then that's something our algorithms will be able to take into account," said John Mueller on the Google Webmasters YouTube channel. What this means is that if your website has been punished in the past for having artificial incoming links, your rankings in Google can improve if your website starts to attract high-quality incoming links, even if you have not acted on the artificial ones.

Recovering from a Penguin-related penalty, whether by "disavowing" artificial links or by increasing the number of high-quality incoming links your website enjoys, requires time. You will have to wait for another update to the Penguin algorithm, or for its data to be refreshed by Google - a process that is not automated and is not carried out on a schedule. In June 2015, Google Webmaster Trends Analyst Gary Illyes said that a refresh was "still some months away", but that the ultimate aim was for Penguin and its associated data to update continuously. "Penguin is shifting to more continuous updates. The idea is to keep optimizing as we go now," said a Google spokesperson. This suggests that Google is now constantly striving to better identify websites engaged in artificial linking practices, and to punish them.
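
The "disavow" file mentioned above is a plain text file uploaded via Google's Disavow Links tool, with one URL or domain: entry per line and # marking comments. As a minimal sketch, assuming a hypothetical list of bad domains and URLs identified by a manual review, such a file could be generated like this:

    # Hypothetical sources of artificial links, judged by a manual review.
    bad_domains = ["paid-directory.example", "spammy-links.example"]
    bad_urls = ["http://old-link-partner.example/links.html"]

    lines = ["# Links disavowed following the October 2014 Penguin update"]
    lines += [f"domain:{domain}" for domain in sorted(bad_domains)]
    lines += sorted(bad_urls)

    # Google's tool expects a plain text file with one entry per line.
    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")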

In September 2015, Google issued a warning to website owners who attract new penalties after successfully going through the reconsideration process - for example, a webmaster who removed an artificial link, submitted a reconsideration request and, after successfully being reconsidered, re-instated the artificial link. A Google spokesperson said that such "repeated violations may make a successful reconsideration process more difficult to achieve", which strongly suggests that longer-term penalties will be applied to repeat offenders.

After a wait of nearly two years, Google finally released a new update to its Penguin algorithm in September 2016. This is the final update that will be released in this way, as the Penguin algorithm has now been integrated into Google's core search algorithm and will therefore run in real time. Sites that had previously been penalized by Penguin will have had to wait until this update to recover. Gary Illyes of Google told Search Engine Land that Penguin now "devalues or ignores spam links", as opposed to adjusting a website's ranking downwards.

September 2012 - October 2014: "Pirate" Updates

First released in September 2012, this update was designed to help Google take action against piracy on the internet by punishing websites that infringe on copyright or are the subject of filings with Google's DMCA process (more than 15 million URLs have been the subject of DMCA requests). Pirate Update 2 was released in October 2014 and has been extremely effective at targeting websites that, for example, host videos of copyrighted material such as films and TV shows, with some websites suffering a 98% drop in search visibility, according to SearchMetrics.

August 2013: "Hummingbird" Update

In around August 2013, Google released possibly the biggest update in its history. Google named this update "Hummingbird" to reflect the fact that it would result in a quicker and more accurate calculation of which search results are best for any given search term. This update was designed to favor "conversational" search queries, in part to reflect the fact that many queries are now made verbally via smartphones, and resulted in less emphasis being placed specifically on keywords and more emphasis being placed on factors such as context, location and intent. As an example, the search term "where to buy pizza bases" might return results for grocery stores local to you (if your device is set to allow Google to know your location), whereas previously the search might have concentrated only on the keyword "pizza" and offered websites that sell pizza.

June 2014: "Payday Loan" Update

In June 2014, Google announced a new update to its algorithm, referred to in the industry as the "Payday Loan" update. Google's Matt Cutts said that this update impacted only around 0.3% of U.S. queries and targeted websites connected to what Google called "spammy queries", such as payday loans, Viagra and pornography - queries whose results suffer from a lot of spam. Several additional updates have since been released.

July 2014 - December 2014: "Pigeon" Updates

Around the end of July 2014, reports started circulating that Google had updated its algorithm for search results with a localized element, affecting both regular search results and map search results (as of August 2014, this update only affected English-language results in the United States). This update primarily affects websites that want to rank highly for local searches - for example, when someone searches for "Italian food in London". As a result of this update, more importance seems to have been placed on directory websites that list local businesses; in some cases, such as the hotel industry, such directories seem to be ranking higher than the actual local businesses they list. But in other categories, such as pizza delivery, local businesses with high-quality websites seem to be ranking higher than before.

Towards the end of 2014, Google confirmed that the "Pigeon" update had now rolled out to all English-speaking locales except India.

July 2014 Onwards: "HTTPS" Update

In late July 2014, Google announced that its algorithms would begin slightly favoring websites that are secured using HTTPS (HTTP over TLS, or Transport Layer Security). Google's Gary Illyes described HTTPS as a "tie-breaker": when two websites are virtually identical in all other ranking signals, the website that uses HTTPS will be ranked higher. Google also announced plans to ultimately strengthen the importance of HTTPS, saying that it would like to "encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web".
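
A quick way to confirm that a site has completed the switch is to check that its plain-HTTP address redirects to HTTPS. Here is a minimal sketch using Python's requests library; the domain is a placeholder.

    import requests

    def check_https_redirect(domain: str) -> None:
        # Fetch the plain-HTTP URL, following any redirects, and report
        # whether the final destination is served over HTTPS.
        response = requests.get(f"http://{domain}/", timeout=10)
        if response.url.startswith("https://"):
            print(f"{domain}: redirects to HTTPS ({response.url})")
        else:
            print(f"{domain}: still served over plain HTTP ({response.url})")

    check_https_redirect("example.com")  # placeholder domain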

April 2015 - January 2017: "Mobile-Friendly" Update

In February 2015, Google announced that it would begin to consider whether a website was mobile-friendly when returning search results on mobile devices, in a change that Google itself described as "significant". From late April 2015, only websites that Google considers mobile-friendly have been able to attain high rankings in searches performed on mobile devices, with searches performed on other devices unaffected. In June 2016, at Search Marketing Summit Sydney, Gary Illyes also confirmed that Google was planning to give page speed more importance in searches performed on mobile devices, hinting that this would probably come into effect by the end of 2016. And in January 2017, Google announced a change that results in poorer rankings for web pages that obscure content using interstitials on mobile devices.
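
Google offers its own Mobile-Friendly Test for auditing pages, and that should be the authoritative check. As a rough first pass only, the sketch below simply looks for a viewport meta tag, one basic ingredient of a mobile-friendly page; the URL is a placeholder.

    import re
    import requests

    def has_viewport_meta(url: str) -> bool:
        # A responsive viewport meta tag is one of the basic signals of a
        # mobile-friendly page; its absence is usually a bad sign.
        html = requests.get(url, timeout=10).text
        return re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I) is not None

    print(has_viewport_meta("https://example.com/"))  # placeholder URL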

April 2015: "App-Indexing" Update

By April 2015, Google claimed to have indexed 30 billion links within apps that have "App Indexing" enabled. If you use an Android device, you may already have noticed that links related to apps you have installed were included in your search results, where relevant. Starting from April 16th, 2015, App Indexing began to be used as a ranking signal, with search results on Android devices including relevant links related to any app, including those that have not been installed.

"If you've implemented App Indexing, when indexed content from your app is relevant to a search done on Google on Android devices, people may start to see app install buttons for your app in search results. Tapping these buttons will take them to the Google Play store where they can install your app, then continue straight on to the right content within it," said a Google spokesperson. The stated aim of this change is to help app developers acquire new users and re-engage existing ones.

April 2015: "Doorway" Update

In March 2015, Google announced that it would begin to take more action against websites that contain "doorway" pages (pages that exist solely to rank in search engine results, without any clear, unique value). "We have a long-standing view that doorway pages that are created solely for search engines can harm the quality of the user's search experience," said a Google spokesperson, who went on to say that this ranking adjustment could have a "broad impact" on websites with a large number of doorway pages. Google confirmed in a Google+ Hangout in April 2015 that this algorithm change had been implemented.

May 2015: "Quality" Update

Following rumors that many websites had seen changes in their rankings early in May 2015, Google confirmed to Search Engine Land that it had changed how its core ranking algorithm processes quality signals - that is, how it determines the quality of a website. Some major websites, including HubPages, suffered a drop in rankings, but Google emphasized that no particular type of website was targeted by the update.

October 2015: "RankBrain" Update

In November 2014 we described how Google's search algorithms were adapting to accommodate an increasing trend towards queries that involve "natural language" instead of specific keywords (a practice called "semantic search"). Almost a year later, Google confirmed to Bloomberg that it is now using an artificial intelligence system, nicknamed "RankBrain", to understand the 15% of queries that its systems have never seen before - queries that, for example, contain complex natural-language questions such as "What's the title of the consumer at the highest level of a food chain?"

Google has confirmed to Search Engine Land that RankBrain began rolling out in early 2015, and that it has been affecting global search results for several months. Google also confirmed that RankBrain is not just involved in interpreting search queries: it also contributes to how highly pages rank for specific searches. Greg Corrado, a senior research scientist at Google, described RankBrain to Bloomberg as now the "third-most important signal contributing to the result of a search query", but for the moment the company is not disclosing the exact nature of this contribution.

January 2016: Core Algorithm Update

From early January 2016, changes to the way that Google ranks low-quality websites (previously known as "Panda" updates) became part of its "core" algorithm, which can change often and in real time. A large change in search engine rankings reported by many specialists in the same month was attributed by Google to several minor updates to its core algorithm being released at the same time, as it prefers not to release them over the holiday period.

September 2016: "Possum" Update

In early September, Google released its most comprehensive update to search results with a localized element since the "Pigeon" update of July 2014. This update has been called "Possum" by SEO professionals because of the way in which many business listings appear to have disappeared from localized search results, when in fact they are being hidden by filters.

Prior to this update, businesses located outside physical city limits often found it more difficult to rank for keywords that included that city's name, but many have now seen a significant increase in rankings; as a result, other businesses have seen their rankings drop. Another aim of this update appears to be to filter out more business listings that share the same address and category. For example, where multiple dentists share the same physical address, Google's localized search results will now show fewer of them in the same set of results.

There are also reports that this update has placed more importance on the searcher's actual location when it comes to the results returned. For example, people searching for "pizza in London" who are physically located in London will see different results to those who perform the search from New York. However, this and other aspects of the change still seem to be subject to fluctuation, as if Google has not yet finalized the update.

February 2017: "Groundhog" Update

It is believed that this update was released to specifically target a widely-used linking scheme in private blog networks. Such networks are usually all owned by the same entity and are designed to increase the rankings of member websites.

March 2017: "Fred" Update

An update to Google's algorithms took place in early March 2017 but was not confirmed by Google until March 23rd. The update was given its name after a Google official joked that all updates should be called "Fred". According to reports from Search Engine Land, most websites that suffered lower rankings after this update were engaged in so-called "black-hat" methods and primarily contained content that was spammy, irrelevant, duplicated, low quality or outdated. According to Site Pro News, websites affected by this update suffered a 50%-90% decrease in traffic from Google's search results.

Conclusion

Generally speaking, if your website uses legitimate SEO techniques and you avoid "black-hat" SEO methods, your website should not have been negatively affected by any of Google's algorithm updates. However, if your website experienced a significant drop in traffic that coincided with the dates of these updates, it is time to re-examine your website's structure, content and SEO methods.

Last updated: 16th August, 2017