
Google SEO History: Google Algorithm Updates

Let’s face it, life without Google would be tough; there’s no denying we all rely on it more than we care to admit, even turning our search behaviour into a verb: “Just Google it.”

The truth is, there’s a lot that goes into making this search giant tick: complicated technology, artificial intelligence and rules. Without all of this, we wouldn’t get the results and experience we expect from this everyday internet essential. 

So what exactly drives Google? There’s a lot wrapped up in that question, but at the core of the platform are its very own algorithms, filled with mystery and more changes than we can count on our fingers and toes. (Or even all of ours combined.)

In this in-depth guide, we explore everything to do with Google’s algorithm changes.

Grab a cup of coffee, this is a big one. 

 

What is a Google algorithm?

In essence, Google’s main goal is to serve you quality search results. If you weren’t able to find the answer to the burning question you’d gone to good old Google for, chances are you’d grow frustrated pretty quickly. 

So, to avoid this, Google has developed a large number of algorithms that play different roles in providing relevant search results to users. These algorithms are constantly evolving and being updated so that they can provide better quality outcomes, ensuring that no ball is dropped as the digital world continues to expand. 

In conjunction with this, remember that there are more than 1.6 billion websites on the net. That’s a lot. In order for you to get the most relevant information, Google has to sift through all of this to understand which of these sites are most likely to solve your query. Enter those nifty algorithms. 

These algorithms work by analyzing a variety of elements that form a website and its functionalities - ones that you may not even know play a crucial role in how you’re seen online. More on this further below. 

All of those factors then go into calculating a score for every web page in existence, which is then used by the algorithm to determine how high that site should rank in a user's search results. If its score is low, it’s likely the page will be buried in the shadows, while high-scoring ones have more of a chance of appearing at the top of the search engine results pages – known as SERPs.

While these algorithms are complex and constantly changing, they essentially boil down to two basic components: relevance and authority. But it’s all of the nitty-gritty, technical elements that form these two components that get far more granular. 

So to start off with the basics, let’s define these two at their very core.

  • The relevancy component helps determine how well your content relates to the user's query; for example, if you were searching for "pizza" you would not expect to find information about computers in response to that query. 

  • The authority component takes into account how trustworthy Google believes your website is in relation to the information in its index; this can be influenced by factors such as PageRank, incoming links, and general credibility.
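To make these two components a little more concrete, here’s a purely illustrative sketch – Google’s real scoring uses hundreds of non-public signals, so the weighting and the example.com pages below are made up – of how a relevance score and an authority score might be blended into one ranking score:

```python
# Toy illustration only: Google's actual scoring is far more complex and not public.
def toy_page_score(relevance: float, authority: float, relevance_weight: float = 0.6) -> float:
    """Blend a 0-1 relevance score and a 0-1 authority score into one ranking score."""
    return relevance_weight * relevance + (1 - relevance_weight) * authority

# Hypothetical pages being ranked for the query "pizza".
pages = {
    "pizza-recipes.example.com": toy_page_score(relevance=0.9, authority=0.5),
    "computer-parts.example.com": toy_page_score(relevance=0.1, authority=0.8),
}

# Sort highest score first: the relevant page wins despite its lower authority.
for url, score in sorted(pages.items(), key=lambda item: item[1], reverse=True):
    print(f"{score:.2f}  {url}")
```

In this toy model, the pizza page outranks the more “authoritative” computer-parts page for the query “pizza”, because relevance carries more weight.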

The transformation of the modern search experience

The modern search experience has changed dramatically over the years. Back in the “old days”, you’d type in a keyword and get a list of links to choose from. The order of those links was driven largely by an algorithm known as PageRank – a formula Google conjured up back in the late 90s.
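For the curious, the textbook version of that formula – the one published by Brin and Page, not whatever Google runs today – can be sketched in a few lines of Python:

```python
# The classic published PageRank formula (Brin & Page, 1998) – a simplified sketch,
# not Google's current ranking system.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    ranks = {page: 1 / n for page in pages}
    for _ in range(iterations):
        new_ranks = {}
        for page in pages:
            # Sum the rank flowing in from every page that links here.
            inbound = sum(ranks[p] / len(links[p]) for p in pages if page in links[p])
            new_ranks[page] = (1 - damping) / n + damping * inbound
        ranks = new_ranks
    return ranks

# Tiny example web: A links to B and C, B links to C, C links back to A.
print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))
```

The idea is simple: a page earns rank from the pages linking to it, weighted by how important those pages are themselves.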

Fast-forward to 2019, and that original patent expired, with newer, more intuitive algorithms having long since taken over much of the heavy lifting. What we see today in 2022 is far more complex, with an endless list of factors determining whether a certain URL is relevant and credible enough to show up in a user’s results.

Image: google algorithm (Source: Wikipedia)

 

In 2006, then-United States Senator Ted Stevens said: "The internet is not a big truck. It's a series of tubes." Turns out he was right — the internet is made up of hyperlinks and web pages connected to one another, creating a huge ‘web’ of information to sort through. When there’s so much of this connectivity going on, creating a hierarchy is crucial – which is exactly what Google continues to do with its many algorithm updates. 

The process of discovering, fetching and storing all of these links and pages is what we call ‘crawling’ (carried out by bots often referred to as ‘spiders’), with PageRank originally responsible for creating a hierarchy among the results served to people typing keywords into a search bar. Now, more modern algorithms do the same job, but in a far more granular and complex way. There’s no getting around them, and you can’t pull the wool over their eyes, either.
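To illustrate the crawling idea (and only the idea – Googlebot operates at an entirely different scale, with politeness rules, robots.txt handling and page rendering), here’s a bare-bones sketch that follows links from a starting page. It assumes the third-party requests and beautifulsoup4 packages:

```python
# A bare-bones crawler sketch to illustrate "spidering" – nothing like Googlebot's scale.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(start_url: str, max_pages: int = 10) -> set[str]:
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue  # Skip pages that fail to load.
        # Extract every hyperlink on the page and add it to the crawl queue.
        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            queue.append(urljoin(url, link["href"]))
    return seen

print(crawl("https://example.com"))
```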

As time passed, Google continued to tweak its algorithms, because while the original PageRank was a great way to measure link authority between web pages, it wasn't much use for measuring relevance within individual URLs. For example, once upon a time, if you typed "coffee mug" into Google hoping to find coffee mugs for sale on Amazon, Google would struggle to work out which individual Amazon pages were the relevant ones to show you.

A lot has changed since then. 

Where machine learning meets a Google algorithm update

Today, Google's search results are largely determined by machine-learned algorithms. However, there are still many human factors involved in the process, making it more advanced and hands-on than ever before. Errors in these formulas are rare, thanks to the huge amount of work put into building the algorithms and the technology behind them.

Google's search algorithm is constantly being updated to keep up with the changing nature of information and the web, combined with user behaviour. It has been tweaked over time to fix a variety of problems, including search results that are not relevant or useful (i.e. PageRank concerns), issues with quality and spam, and concerns about copyright protection and content ownership. As these matters pop up and evolve, Google adapts its algorithms to suit the nature of the times.

Google's algorithm particulars are never fully disclosed, but it is safe to say that the whole thing is a constantly evolving process. Google uses more than 200 signals or "clues" to determine the rankings of websites in its results. 

These factors include:

  • Content and words on the page

  • Links pointing back to the site (backlinks)

  • How much time people spend on a page

  • Location of users browsing a site (based on their IP addresses). 

  • Personal history of previous searches, which allows Google to predict what someone might search for next and adjust their results accordingly. You know — those almost-eerie suggestions that pop up in your search bar.

Note that these are only a handful of the many, many factors that contribute to your rankings success. (Remember, 200 of them.)

 


It’s important to note that human input is also a welcomed part of current algorithms. For example, Google now gives site owners a window into the process through Google Search Console (formerly Google Webmaster Tools), which lets them submit their pages for crawling and see how the search engine views their site. This was never a thing in the past, when the search engine was hugely secretive about its ways and kept mostly to itself.

To show that Google has really embraced openness and had a change of heart, you only need to look at its ‘How Search Works’ piece, which clearly and publicly discloses that its ranking systems are built on a foundation of:

  • Meaning of your query

  • Relevance of web pages 

  • Quality of content

  • Usability of web pages

  • Context and settings

In the next section, we’ll further examine and discuss how each of these elements work, ultimately dictating whether your site is seen in search results or not. 

 

How search algorithms work 

There’s a lot of information out there on the web, and what you know about and see is only a tiny fraction of what’s really floating around.

With around 60% of the world’s population now online, all contributing information to this massive space, it's no surprise that the volume of data is extraordinary.

And with that volume comes the need to create order, allowing users to discover what’s really important to their biggest problems.

Enter search indexing. 

In their simplest form, ranking systems are made up of a whole series of algorithms that bring order to the information we seek online. These days, the freshness and quality of content play a big role in answering queries about the latest trends and topics, rather than simply how active a site is overall. 

But to ensure that these algorithms continue to meet the bar for relevance and quality, Google employs a stringent process of continuous live tests and endless assessments from its teams. In fact, it uses a trained external body of ‘Search Quality Raters’ spanning the globe, all assessing whether its results live up to its guidelines. You can even view these guidelines, as they’re publicly available for all to see.

In the above section, we mentioned that Google has made an effort to publicly state what makes you rank. Now, we’ll delve into each of these factors on a deeper level, helping you to understand how this may apply to the performance of your own website. 

 

The meaning of your query

To ensure users receive relevant results to their query, it’s important that Google understands what information is being asked for in the first place – AKA, the intent. As it stands, search intent is a massive part of ranking systems in the modern search world, and ultimately dictates how language is understood by crawlers. 

Google works hard to create language models that adapt and evolve as dialogue does. Remember that the way we speak in everyday conversation changes over time, so search models need to reflect this, too. Google builds these models to decipher which "strings of words" need to be recalled from its massive index database (websites it has crawled and stowed away – like a big library).

The process behind this involves interpreting many complications in search queries, including spelling mistakes and even understanding how the latest research in natural language might reflect the intent of your search. 

For example, Google’s synonym system ensures that it can understand that multiple words mean the same thing. While it seems basic in nature, the system itself took more than five years to create, and now significantly improves the results returned to users across a multitude of languages. 
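As a crude illustration of what a synonym system does – Google’s is learned from data rather than hand-written, so treat the dictionary below as entirely made up – imagine expanding a query with known synonyms before matching it against pages:

```python
# A hand-written toy synonym table – Google's system is learned, not a lookup table.
SYNONYMS = {
    "cheap": {"affordable", "budget", "inexpensive"},
    "fix": {"repair", "mend"},
}

def expand_query(query: str) -> set[str]:
    """Return the query terms plus any known synonyms, for broader matching."""
    terms = set(query.lower().split())
    for term in list(terms):
        terms |= SYNONYMS.get(term, set())
    return terms

# A page about "laptop repair" can now match a search for "fix cheap laptop".
print(expand_query("fix cheap laptop"))
```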

Aside from synonyms, algorithms need to decipher what category of information you’re looking for as a user – e.g. are you looking for something specific or just broad information? Are you after a review, simple trading hours or some images? All of this dictates the results returned. And then there’s the language factor: do you want that in Italian? French, maybe? How about English?

Suddenly it’s a whole lot more complicated than we first thought, hey?

A big part of this form of categorisation comes down to the search giant’s ability to assess whether you (as the user) are looking for extremely fresh content or something that’s more evergreen. It’s for this reason that it has its own algorithm on ‘freshness’, which dictates whether a website offers timely, relevant content that can provide value to users overall. As an example, let’s say you were looking up current scores in real-time for a football match – you’d expect that these results are accurate as of the time the results are returned to you. It’s Google’s job to make sure that happens.

 

Relevance of web pages

Algorithms also need to examine whether the content on specific URLs is relevant enough to answer user questions. That means their systems need to work out whether the information on the page actually explains, in the most value-driven way, what the user wants to know – and that takes keywords. 

Yes, this is where search engine optimization (SEO) tactics come in. When your website uses the specific high-volume keywords people are searching for, Google picks up that you’re relevant to the user’s query and creates a connection between the two. 

However, it's important to note that spamming keywords throughout your content, with the rest being off-topic, won’t get you anywhere fast. In fact, you’ll only land in hot water – Google is far smarter than it used to be and can sense when you’re trying to take shortcuts; it now looks at the entirety of your content, not just these primary terms.
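To see why keyword stuffing is so easy to spot, here’s a naive keyword-density check. It’s purely illustrative – Google’s spam systems look at far more than density – but the intuition holds: unnatural repetition stands out.

```python
# A naive keyword-density check – purely illustrative, not how Google detects spam.
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

stuffed = "car car best car cheap car buy car car deals car"
natural = "Our guide compares this year's most popular cars on price, safety and running costs."

for label, text in [("stuffed", stuffed), ("natural", natural)]:
    print(label, round(keyword_density(text, "car"), 2))
```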

Once Google’s spiders have crawled through a website’s headings and content to see if keywords match up, it looks closer at whether all of the information is relevant. From there, it signals to its ranking systems that the page is relevant and should be tied to the query. 

To put it simply, let’s assume you’re looking for a new car. Entering the search query “car” into Google will give you a lot of broad information, so you want to narrow down your search. You wander onto a page and want to see whether there’s enough value there to answer your question about the "best cars for 2022". 

What you’re met with is a whole site spammed with the word “car” and not much else. Your question is not answered; the site has simply tried to claw its way into the results by plaguing its on-page content with this word. 

On another site, however, you find a list of current popular makes and models, their features and pros and cons. The page is well-structured, has keywords in its headings and even external links to official sites. Your question is answered and you leave your experience feeling satisfied. Google's job is done, and so is the website you visited.

See the difference?

Google picks up on whether a site creates more context and value in its content than just the presence of the keyword itself. Back in the ‘olden days’, the keyword was enough, but now it’s a combination of content marketing with search engine optimisation and the ability to create a positive user experience.

A note on poor-value sites that rank

As for how that first site example ranked in the first place: if a site manages to appear in the top 10 positions on Google for a keyword with nothing more than a stuffed keyword and thin content, chances are it took some shortcuts to get there – and it’s unlikely to remain in those SERPs for long. 

This is why so-called ‘experts’ claiming they can rank you on the first page in ‘X’ days is a red flag. They can, and they probably will. But it never lasts.

 

Quality of content

Going hand in hand with the above, algorithms need to pick up whether the source itself is reliable. While a page might look like great-quality content at first glance, whether the facts are correct is a whole other ball game.

To ensure it delivers the most accurate sites for a search query, Google uses systems that identify signals that are able to dictate whether a page shows:

  1. expertise;

  2. authoritativeness, and

  3. trustworthiness.

If a site continues to gain attention from users for the same (or similar) query, chances are it’s doing its job and is trusted. For example, people often rely on Wikipedia for answers to a whole range of things, and it’s known to be the go-to source for almost anything in the world. 

Google then uses aggregated feedback to understand whether the quality of information given on a page is where it needs to be. 

Beyond this, its intricate algorithm on spam plays a huge role in understanding whether a page is of low or high quality. It places a big emphasis on gatekeeping spam content and ensuring these sites don’t end up in search results by using deceptive tactics – otherwise known as ‘black-hat SEO’.

This is where Google’s webmaster guidelines are hugely important, and well worth reading up on when you have the time. These guidelines set the benchmark for what is deemed ‘quality’ and what is ‘deceptive or manipulative’. 

 

In addition to this, techniques like purchasing low-quality links to ‘boost’ rankings, hiding content on your website (such as white text on a white background) and even copying information word for word from another URL can all play a role in whether your site is defined as spam or of poor quality. 

We don’t need to remind you which end of the stick you need to be on. 

 

Usability of web pages

As it ranks sites, Google assesses whether specific URLs are easy to navigate and use. If there are pain points that hinder the user experience, that site will notice a drop in its rankings performance. 

In a nutshell, the search engine creates algorithms that promote “more usable pages over less usable ones”, including whether the site shows up correctly in various browsers. Even elements like device types, responsiveness and page load times all dictate whether a site will show up in search results. 

The good news is that any site owner can improve all of these elements on their own platform, or enlist the help of an SEO agency to do it for them. The bad news is that none of this can be ignored – it all has to be attended to in order to rank well.

Pro tip: If you’re not sure if your website is loading fast enough, sites like PageSpeed Insights can help you benchmark performance.
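If you’d rather script this check, PageSpeed Insights also has a public API. The sketch below assumes the free v5 endpoint and the requests package; the response field names reflect the documented v5 format, so double-check them against the current docs before relying on this:

```python
# Query the PageSpeed Insights v5 API for a URL's Lighthouse performance score.
# Field names reflect the documented v5 response; verify against current docs.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(url: str, strategy: str = "mobile") -> float:
    response = requests.get(API, params={"url": url, "strategy": strategy}, timeout=60)
    response.raise_for_status()
    data = response.json()
    # Lighthouse reports the performance category as a score between 0 and 1.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(performance_score("https://example.com"))
```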

 

Context and settings

This is where – as a user – you’re most likely to say “So THAT’S how I get creepy search suggestions”. 

Google uses details like your location, previous browsing history and search settings to deliver you the most relevant results and useful information. If you’re located in Florida, it will give you results that are tailored to this location. If your search settings dictate that you speak Italian and live in Florida, it will match the results to suit your needs. Nifty, huh?

Additionally, if you have your own Google account (like Gmail), it will also use information collected through this platform to further tweak the delivery of your returned results. Whether you deem this as convenient or ‘creepy’ is entirely up to you.

Note: You can actually control what kind of activity Google collects for your search experience through your Google account’s Activity controls. Simply disable ‘Web & App Activity’ if you’d like to stop sharing activity details with the company. Keep in mind that your search results may be less accurate, though.

Additional search factors to consider

While these categories form the primary considerations worth keeping in mind when you’re trying to rank your website, there are a few additional details that you’ll need to optimize. These include:

  • Mobile-friendliness: Whether your site is well-optimized for mobile devices – from smartphones to tablets of all kinds. 

  • HTTPS: Is your site secure and backed by a HTTPS status and SSL certificate?

  • Low bounce rates: Do your users stick around on your page or do they bounce off as soon as they load your website? 
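On that last point, bounce rate under its classic definition is simply the share of sessions that viewed a single page (GA4 now measures engagement differently, so treat this as the traditional formula):

```python
# Bounce rate, classic definition: single-page sessions divided by total sessions.
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    return single_page_sessions / total_sessions if total_sessions else 0.0

print(f"{bounce_rate(420, 1000):.0%}")  # 420 of 1,000 sessions bounced -> 42%
```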

How many times does Google update its search algorithms?

This is one of those questions that experts will weigh in on and argue over, but it’s estimated that Google changes its algorithms approximately 500 to 600 times every year. That equates to around one or two changes a day.

Most of these amendments don’t play a big role on how your SEO performs, with only larger algorithms (like Panda or Penguin) really changing up the game. 

If you’re interested in seeing a timeline of how many changes have occurred over the years, Moz maps it out nicely in its Google Algorithm Update History. Note how recent years have seen thousands of updates, while earlier ones only saw a few hundred.

Note that ‘launches’ refers to a collection of algorithm ‘changes’, while ‘changes’ are individual tweaks, and improvements are minor fixes.

Confusing? It’s not, really. Here’s what Google’s Matt Cutts said in 2009:

"We might batch [algorithm changes] up and go to a meeting once a week where we talk about 8 or 10 or 12 or 6 different things that we would want to launch, but then after those get approved ... those will roll out as we can get them into production."


 

How many of these SEO algorithm updates matter?

If you’re overwhelmed, never fear – the majority of Google’s algorithm changes are minor, adjusting things like the user interface. That’s why they’re known as ‘improvements’ or simply ‘algorithm updates’.

But as time goes on, user behaviour shifts and SERPs become more congested, so algorithms need to evolve accordingly. This is where bigger algorithm updates are brought to fruition.

As an example, think about the rise of local SEO. Businesses started honing in on audiences in their local vicinity, using this hyper-targeted form of optimisation to reel in more qualified website traffic. Google’s local algorithm prioritises relevance, proximity and prominence to rank websites in this category. 

Note that in December 2021, Google updated its local algorithm to weight proximity more heavily than it had before, making this a massive ranking factor for those targeting specific postcodes. 

Similarly, 2015’s Mobilegeddon algorithm update shifted the search world entirely, making it a crucial point for businesses to ensure their site was optimised for mobile devices. Today, this is still one of the most defining ranking factors of all. 

Whether an update is important to your business also depends on your relationship with the algorithm’s finer details and whether it affects your users’ demands. For example, if your brand is focused on speaking French and an update refines how synonyms are picked up in this language, then this update is a significant one for you to focus on. But for general brands in the United States, it’s barely something to bat an eye at.

A good rule of thumb is to consistently check over your metrics – via Google Analytics, SEMrush, Ahrefs or the like – and monitor whether your rankings are fluctuating. If they take a massive hit, an algorithm change has likely slapped you hard on the wrist, and you’ll need to investigate why and what has happened. More on this later.

 

How to keep up with Google algorithm changes

Staying up to date with Google's changes is possible in several ways. You can monitor your site's web traffic and search rankings for your target keywords regularly with your analytics platform. For the most part, Google has also gotten better at outlining updates they’ve rolled out by publicly disclosing the details on their website. 

Pro tip: Follow Google Search Liaison on Twitter for real-time notifications on core updates. We also recommend using Google Alerts, which you can easily set up. (And on absolutely anything you want to see news for, too.)

When Google implements bigger changes, though, you’ll need to exercise your own investigation and analysis to assess whether – and how – the update has affected you. As noted before, looking at your metrics is what you’ll need to do here, examining where and when your rankings plummeted and for which keywords and pages. 

If you feel like you've been slapped by an algorithm fluctuation, here are the best first steps to take:

  1. Open up your Google Analytics and look for immediate changes that coincide with an algorithm roll-out. 

  2. Look at your traffic and rankings in particular and see whether they’ve dropped or increased. (Yes, it’s not always bad news.)

  3. Use an analytics tool designed to track algorithm changes, like Accuranker’s Grump rating.

  4. Check Google Search Central’s documentation and work through the common performance problems it describes. If red flags pop up and you’ve noticed a decline in campaign performance, it’s likely the algorithm has pinpointed these aspects in particular.

  5. Head to Search Console (Google’s free search performance tool) to review your queries, clicks and impressions, and check for manual actions or coverage issues that coincide with the drop.

  6. Create a checklist of issues you’ve found during these steps and look toward fixing them as soon as you can.
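If you export keyword positions regularly (from Search Console, SEMrush, Ahrefs or similar), a small script can help with steps 1 and 2 by flagging where an update may have hit you. The file names and the 'keyword'/'position' CSV columns below are hypothetical – adapt them to whatever your tool actually exports:

```python
# Compare two keyword-position exports and flag big drops around an update date.
# File names and the "keyword"/"position" columns are hypothetical – adjust to your export.
import csv

def load_positions(path: str) -> dict[str, float]:
    with open(path, newline="") as f:
        return {row["keyword"]: float(row["position"]) for row in csv.DictReader(f)}

def flag_drops(before_path: str, after_path: str, threshold: float = 5.0) -> None:
    before, after = load_positions(before_path), load_positions(after_path)
    for keyword in before.keys() & after.keys():
        drop = after[keyword] - before[keyword]  # position numbers grow as rankings fall
        if drop >= threshold:
            print(f"'{keyword}' fell from #{before[keyword]:.0f} to #{after[keyword]:.0f}")

flag_drops("rankings_before_update.csv", "rankings_after_update.csv")
```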

Easy ways to optimize your site for Google’s algorithms

It’s hard to pinpoint an exact method to create SEO success, but the good news is there are always some sure-fire ways to get on the right track. If you haven’t optimized your site yet, a few of our key recommendations for avoiding penalties from Google algorithm updates include:

  1. Optimize your site for mobile-friendliness. Use the Mobile-Friendly Test to see whether your site is where it should be in this respect.

  2. Assess your internal links. Ensure there are no broken links hindering your site performance and user experience, and double-check they are linking to relevant content that’s up-to-date and accurate (a minimal link checker is sketched after this list). The better the quality of the links, the better your rankings.

  3. Aim to improve user engagement. High website traffic is one thing, but high bounce rates will only drag your rankings down. Try to identify why users are immediately leaving your site when they visit.

  4. Fix up load times: If your pages load slowly, this is a big warning sign for user experience, conversions and rankings. Try to fix anything holding your load speed back by using a tool like PageSpeed Insights.

  5. Eliminate duplicated content: Even a few sentences of copied content can influence your search performance. Use a tool like Screaming Frog or Siteliner to see where duplicate content may be causing issues, then tidy it all up by turning it into unique material.

  6. Work on your content strategy: If someone heads to your site, they want answers. If you’re getting traffic, Google is rewarding you for having quality-focused content on your pages in the first place. If your rankings are poor, it’s likely this isn’t the case, so work on building out value-driven content that answers the biggest questions in your space. 

  7. Stay away from keyword spamming: Nothing is more frustrating to Google crawlers and users than keywords blanketing a page – throughout the content, design and headings. Try to use your phrases naturally and in the most organic way possible, rather than forcing them in for the sake of SEO.

  8. Never over-optimize: While it can seem like a good idea to go all-out on keyword optimization and SEO tidbits, try to avoid going too far. Less is more, in some cases. If it looks like there’s too much going on, reel it back.

  9. Improve navigability: Users need to flow through your site easily, and so do Google’s spiders. Always keep user experience in mind when building out menus, site maps and links. These all need to connect seamlessly. If something’s broken, fix it.

  10. Opt for HTTPS: An SSL certificate is essential, not a nice-to-have. Your site host will give you the option to snatch one up, and we recommend it’s actually the first thing you do if you haven’t got one already. In essence, it lets Google (and users) know that your website is safe, secure and trustworthy.
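As promised in point 2 above, here’s a minimal internal-link checker. The URLs are placeholders for your own pages, and it assumes the third-party requests package:

```python
# A minimal internal-link checker: report anything that doesn't return HTTP 200.
# The URLs below are placeholders – swap in your own internal links.
import requests

INTERNAL_LINKS = [
    "https://www.example.com/",
    "https://www.example.com/blog/google-algorithm-updates",
    "https://www.example.com/old-page-that-may-be-gone",
]

for url in INTERNAL_LINKS:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None  # Request failed entirely (DNS, timeout, etc.)
    if status != 200:
        print(f"Check this link: {url} (status: {status})")
```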

A timeline of Google’s core algorithm updates

Our guide wouldn’t be complete without an interesting look back at the biggest changes to hit the search world, courtesy of the algorithm wizards at Google. Here are the primary updates to date, and the most important ones for you to assess and keep an eye on, even all these years later.

This timeline is mapped out from the latest Google algorithm update to the oldest: 

Product Reviews Update — December 1, 2021

Rolling out from December 1, Google's Product Reviews Update aimed to reward high-quality product reviews and acted as a refresh of the April 2021 update. It was reportedly implemented over three weeks. 

November 2021 Core Update — November 17, 2021

Google announced the rollout of a core update in November. The rollout officially lasted around two weeks, and most tracking sites showed a strong volatility spike on November 17. The overlap with the Black Friday sale season sparked some debate in the SEO community.

November 2021 Spam Update — November 3, 2021

This was another spam-focused update in November, with Google confirming an eight-day rollout, though not much detail was released on the sites and tactics targeted.

Link Spam Update - July 26, 2021

This update was a big attempt by Google to devalue spammy links across the web, in multiple languages. To date, the search giant has always preached high-quality content and user-driven material over buying links or using dodgy link schemes. Those with authentic, naturally gained backlinks saw a rise in rankings here.

 

Broad Core Algorithm Update – July 2021 

Between June and July 2021, Google rolled out two big updates one after the other. The last core update before these was all the way back in December 2020, so this was a primary one for the year. 

Just like most core updates, no one thing was targeted; rather, Google updated the entirety of its algorithm with smaller tweaks.

 

Broad Core Algorithm Update – June 2021

The first of the duo, this update was also broad and far-reaching, touching on multiple aspects of search performance.

 

Page Experience Update – May 2021

In the books of Google, this one has been called ‘revolutionary’. The roll-out saw Google officially define user experience as part of its ranking formula. Sites with poor Core Web Vitals took a hit, particularly those with slow load times, annoying advertising and distracting content.

 

Passage Ranking – February 2021

Another big one: Google initially said this change could affect around 7% of search queries. As part of its passage ranking system, Google gave its bots the ability to surface information buried deep within long pages of ‘web passages’ and use it to answer a short search query. This is where semantics really started to become important, and it still plays a massive role in search performance today.

 

Broad Core Algorithm Update – December 2020

A larger one of its kind, this one was again wide-reaching and influenced all aspects of websites and SEO in multiple languages.

 

BERT – October 2020

This one didn’t directly hit SEO, but it is still important to touch on in the history of things. During this period, Google announced that BERT was being used on almost every English-language search query. This wasn’t news in the sense of BERT’s existence – that was announced earlier, in October 2019 – but the scale of its capabilities became the key focus. 

Fast-forward to the time of writing, and BERT powers almost all search results in the English language, as well as results in around 70 other languages. For machine learning, that’s a big win. 

Fun fact: BERT stands for Bidirectional Encoder Representations from Transformers.

 

Broad Core Algorithm Update - January 2020

A small fish in the sea, this update didn’t hit any particular websites hard, but rather primed the space for the May 2020 core update that followed.

 

BERT Natural Language Processing Update – October 2019

As noted previously, Google launched BERT to influence around 1 in 10 English-language searches, using a neural network to comprehend natural language more like the human brain does. At its most basic level, BERT understands how natural language works and relates it back to search queries, paving the way for more value-driven content tactics in SEO.

 

Broad Core Algorithm Update – September 2019

Another broad one, this update was the first of its kind since June 2019 and again looked at almost all aspects of search.

 

Broad Core Algorithm Update – June 2019

The unusual thing about this Core Update was that Google announced it beforehand – which they are known not to do. This core update was aimed at enhancing user experience in a variety of ways.

 

Broad Core Algorithm Update – March 2019

Murkier than other updates, this one briefly unsettled the SEO space because Google didn’t give it a name upon release. After recognizing that marketers were having trouble understanding its nature and intent, Google later backtracked and announced it would be called the March 2019 Core Update. 

 

Medic Update (also known as August 2018 Core Update)

Another change that shook the SEO realm, this update fixed major issues in the algorithm in favour of websites that had previously been under-rewarded when they shouldn’t have been. For businesses, rankings both rose and fell, depending on the nature of their performance and tactics. 

For many sites that dropped, there was no quick fix – only a long-term improvement in quality could win those rankings back.

 

Broad Core Algorithm Update – March 2018 

An all-around and across-the-spectrum update that looked at Google’s query results, this change was just another blanket tweak.

 

Broad Core Algorithm Update - January 2018

A comprehensive update that saw all aspects of a website’s SERPs analysed. 

 

Broad Core Algorithm Update - December 2017

Because this one occurred right before people set off for the holidays, it was both unexpected and unwelcome. It had a big impact on mobile SERPs and reportedly hit sites lacking schema markup the hardest, with the Hobbies & Leisure, Auto & Vehicles, and Science categories worst affected.

Pro tip: You can check your schema validity with Google’s Rich Results Test or the Schema Markup Validator.

 

Local August 2017 Hawk Update 

Also known as ‘The Hawk’, this algorithm tweak corrected issues from the previous Possum roll-out (2016). In a nutshell, local sites were being filtered out by Google because their business was deemed as being ‘too close’ to other competitors that already ranked. The Hawk fixed this issue, allowing local brands that competed with another ranking one to also be seen. 

 

Fred Update – March 2017

It’s important to note that this update is still unconfirmed, but the search world speculates that Google released it on the down-low. The SEO community dubbed it ‘Fred’ in response to a joke that a Google employee made that all future updates should be called that.

 

Local Possum Update – September 2016

Affecting local listings, this update was aimed at changing the way the search giant’s filters functioned, eliminating websites that are deemed redundant. As we noted before, this didn’t come without its issues, which were then fixed in The Hawk.

 

RankBrain Google Algorithm – 2016

Across this period, Google rolled out its RankBrain system (first confirmed in late 2015), changing how search queries would be processed for the future. This machine-learning system helps the search engine interpret queries more accurately and efficiently, improving its methods as time goes on.

 

Mobile-Friendly Algorithm – May 2016

Not quite as big as ‘Mobilegeddon’, this update boosted rankings for sites that prioritized mobile-friendliness. 

 

Mobile-Friendly Update – April 2015

The primary mobile-friendly update, this change boosted sites that had gone the distance in optimizing their pages for mobile devices. You may also hear it referred to as ‘Mobilegeddon’, or even ‘Mobilepocalypse’.

Pro Tip: Check if your site is mobile-friendly using Google’s Mobile-Friendly Test.

 

Pigeon – July 2014

Focusing on U.S. English search results, Pigeon was a local search update intended to deliver more relevant and valuable local results by tying them more closely to Google’s traditional web ranking signals.

 

Hummingbird – September 2013

Known as the new search platform that Google started using as of September 2013, this quick and precise system revolutionized how Google could interpret the meaning behind the words in a query. 

Hummingbird looks at every word in a search query to take into account the entire meaning and conversation behind it, rather than individual or specific words in the query itself. It is designed to apply that meaning technology to billions of pages across the web, delivering more accurate and valuable results to users. 

 

Payday – June 2013

Targeted at tidying up results for ‘spammy queries’ – think payday loans and pornographic content.

 

Pirate – August 2012

A filter introduced to prevent sites with copyright infringements from ranking. This update also caught guilty sites that had previously slipped through the gaps. 

 

Google EMD – September 2012

Short for Exact Match Domain, this filter was launched to prevent poor-quality websites from appearing in ranked search results just because they had the right keywords in their domain names. 

 

Penguin – April 2012

Aimed at improving the way Google catches sites trying to spam search results, this update penalised those caught buying links or acquiring them through link networks (a big no-no, even today). 

 

Top Heavy – January 2012

Google rolled out this update to prevent sites that were intentionally ‘top-heavy’ with advertising from ranking in its search results. 

 

Panda – February 2011

This update saw the roll-out of a new filter to stop poor-quality sites with duplicated content from appearing in search results. Sites that had managed to slip through loopholes in this regard were hit hard. 

 

Key takeaways

  • You can be penalized at any time by an algorithm update, so it’s important to assess your search performance regularly to check why you may have been penalized.

  • Using ‘black hat’ tactics like link spam, duplicate content, hidden content and more can wind you up in an algorithm’s bad books.

  • Google has more than 200 ranking factors that contribute to your campaign performance. These all relate back to relevancy and authority.

  • Most changes are minor, but if you find your rankings have taken a massive dive, a Core Update has likely occurred and you will need to assess why you were influenced in the first place.

  • Use Google Alerts and Google’s Search Liaison account to keep track of any potential updates.

  • Read our comprehensive SEO guide to learn more about how the world of search operates.


Let's increase your sales.

Claim your $2,000 Audit for FREE by telling us a little about yourself below. No obligations, no catches. Just real revenue results.