Your website’s traffic has tanked.

You check your analytics, and your organic search has plummeted overnight.

You check Google, and your website doesn’t appear at the top anymore — hang on, no, it’s not even on the first page anymore!

P A N I C.

This is serious.

No traffic = no leads and no sales = no revenue = no business.

Sounds horrid, doesn’t it?

Hopefully, this isn’t the position you find yourself in this morning. But if it is, don’t panic. There’s a cause. We just need to find it.

In this article, I’ll explain exactly what you can do to check why your Google ranking has gone down and what you can do about it if it has.

First, Don’t Panic

Don’t be scared.

You’re (rightfully) concerned about your lost ranking and traffic, but there are blue skies ahead. Ranking drops are rarely permanent. If anything, they’re a great opportunity to take a step back and review your Search Engine Optimisation (SEO) strategy, which was often created years earlier and then followed without any tweaks for the changes Google has since made to its algorithm.

With the correct analysis and revised strategy, it’s completely possible to reverse a ranking drop, and by the end of this article, you’ll know a bit more about how to make that happen.

So let’s look at why your Google ranking has dropped.

LIVE Ranking Drop Advice

Watch the replay of our “What To Do If Your Google Ranking Drops” live stream, where I was joined by our Head of SEO, Andy Tuxford.

Why Your Google Ranking Has Dropped

There are three reasons why your keyword rankings have dropped:

  1. Google updated their algorithm
  2. Your competitors have improved their SEO game
  3. An accident has happened

Let’s take a look at each of those in more detail.


Google Has Updated Their Algorithm

Google’s search algorithm isn’t just one algorithm; it’s a series of algorithms. Over the years, every SEO consultant with a keyboard has speculated about how many ranking factors there are within it.

Search for “google ranking factors”, and you’ll see more than 25,000 results that mention Google’s (supposed) “200 ranking factors”. But the reality is that between all the algorithms Google has within its “series of algorithms”, the total number of “ranking factors” probably totals in the thousands.

It’s a bit of fun

Whilst it’s impossible to accurately list the thousands of probable algorithm factors, this “SEO Periodic Table” by the Search Engine Land team is quite fun.

A periodic table of SEO factors by Search Engine Land

You can download the full SEO Periodic Table report here.

With that many algorithms and “factors” in play, it’s not surprising that Google is testing and making changes to any number of these daily — sometimes just for a few days, and sometimes only for a handful of countries or languages.

Your website might have been caught up in a test.

If that’s the case, then you might see your ranking return within a week or so.

On the other hand, there could have been a broad core update.

Screenshot of Google announcing a broad core update on Twitter

What’s a Broad Core Update?

A broad core update is a significant update to Google’s algorithm(s).

Google has long made periodic big changes to their algorithm. The two most notable of these were 2011’s Panda update and 2012’s Penguin update. However, since August 2018 (when the “Medic Update” dropped), Google has been referring to their larger algorithm updates as “Broad Core Updates”.

Learn about the Medic Update and E-A-T

In 2018 I did a talk at an exclusive Thinkplus event where I explained what the Medic Update was, how it changed the landscape of SEO, and what E-A-T was (plus, how to optimise for it).

You can watch that talk in full here, or you can read more about the Medic Update and E-A-T in our two blog posts covering each topic.

Unlike with the Panda and Penguin updates, Google is pretty tight-lipped about what they’re trying to achieve with each broad core update, which leaves a big gap for speculation by the SEO community.

Broad core updates have been seen to impact entire industries (as the Medic Update did in 2018) or affect specific elements, such as backlinks, content quality, or the trustworthiness of the publisher or writer of the published content.

These updates tend to drop every three to six months or, as was the case in the summer of 2021, split into two separate updates spread over two months. They can take up to two weeks to fully roll out.

We’ve seen from broad core updates that, should a website see decreased (or increased) ranking, it’s not uncommon for these ranking changes to be reversed when the next broad core update comes around.

This makes it all the more important that your website is always well optimised and that, if you have seen a ranking decrease, you follow the ranking drop checks later in this article.

If Google hasn’t updated their algorithm(s), then your ranking drop might be a consequence of your competitors stepping up their SEO game.

Image listing Google's multiple algorithm updates in 2021

Image via huckabuy.com

Your Competitors Have Improved

It’s easy to be at the top if nobody else is competing. But if your competitors enter the fight and start investing in their SEO, the top spot is no longer guaranteed.

It was easy to rank at the top of Google for a long while because fewer businesses relied upon Google to earn new leads and sales. But as people have become more digitally native and enforced lockdowns have pushed shoppers online (the proportion of retail spending happening online increased from 19.5% in January 2020 to 35.2% in January 2021), businesses are taking digital marketing more seriously and adopting a digital-first approach.

Screenshot of the Office for National Statistics online retail sales graph

Taken from the Office for National Statistics website

With the right investment and experience-backed SEO strategy, your competitors could significantly improve their search engine rankings within six-to-twelve months.

If you see the same competitor websites appear at the top of Google for a number of your primary search query targets (i.e. keyword phrases), this may be the reason why.

If Google hasn’t updated its algorithm and your competitors aren’t running strong SEO campaigns, then the other main cause of a ranking drop is human error.

Screenshot of SE Ranking's Competitor Rank Tracking tool

You can easily track your competitors’ Google rankings using rank trackers like SE Ranking

An Accident Happened

We all make mistakes. None of us is perfect.

Messi doesn’t score a goal with every shot he takes = not perfect.

Serena Williams doesn’t ace every opening serve (although she does hold the record for the most aces in a tournament) = not perfect.

And I don’t always write industry-defining blog posts (although I do try) = not perfect.

We all make mistakes, and we all have accidents too — and, unfortunately, accidents on a website can make it drop down Google’s rankings.

I’ll cover how to check for some of these accidents later on, but two of the worst you can have are:

  1. Blocking Google from crawling your website
  2. Telling Google not to index pages of your website (or the entire thing, in some instances)

Fortunately, you can reverse these accidents, and rankings (usually) return pretty quickly afterwards.

Screenshot of the Searchmetrics tool showing the ranking of techopedia.com and trustedreviews.com

This graph from the SEO tool Searchmetrics shows a fantastic example of a dramatic overnight ranking drop

How to Check if an Algorithm Update Has Impacted You

Google updating their algorithm is the most common cause for drops in keyword rankings, so I’ll start by showing you how you can tell if you’ve been impacted by one.

The first thing you should do is check with Google directly.


Check With Google’s Public Liaison

Google introduced a Public Liaison for Search role towards the end of 2017. Their role is to keep the public updated on how search works and how Google continually improves to be more valuable and effective for its users.

The Twitter account for the role, @searchliaison, is always the first official channel to announce when notable algorithm updates have started. If you want to know if a big algorithm update has happened, their account is the first place you should check.

However, there are several unofficial ways algorithm updates are detected and discussed.

Screenshot of Google's Search Liaison's Twitter account

You can follow all of Google’s Search Liaison’s updates on Twitter

The first way is to check Barry Schwartz’s Twitter account.

Barry reports on every algorithm update, even if the initial chatter about them is by website owners and managers on SEO forums.

If you feel your ranking isn’t behaving normally, Barry’s account is an excellent place to check first to see whether an algorithm update has been detected.

Screenshot of a Barry Schwartz tweet about Google's June 2021 broad core update

A typical example of Barry’s algorithm update coverage

The next place, or places, to check are algorithm trackers.

Check Algorithm Trackers

Algorithm trackers monitor Google’s search results for changes in ranking for thousands of keyword phrases. Some even track how many HTTPS addresses are included or whether the design of the results pages has changed in any way.

You can use dozens of trackers to check whether the algorithm is in a state of flux or not, with Semrush’s Sensor being our agency’s preference.

Alternatively, you can try any of the other trackers listed below.

  • Moz’s Mozcast
  • RankRanger’s Rank Risk Index
  • Accuranker’s Grump Rating
  • Advanced Web Ranking’s Algorithm Change Tracker
  • CognitiveSEO’s Signals
  • Dejan Marketing’s Algoroo

If your analytics software shows that your website’s organic search traffic dropped on the same day that one of these trackers reported Google’s algorithm in a state of high fluctuation, it’s pretty safe to say that an algorithm update has impacted your website.

How To Check Older Algorithm Updates

You may spot a decrease in search traffic further back in your analytics software than the period the above algorithm trackers cover (typically 30 days). In this case, you can cross-reference the date of the decrease against algorithm history logs kept by several different SEO software providers, consultants, and agencies.

We recommend Marie Haynes’ algorithm history log, which is updated regularly.

Screenshot of Marie Haynes Consulting's google algorithm update history tracker

Marie Haynes Consulting’s Google Algorithm Update History Tracker

Alternatively, you can cross-reference Marie’s log against the other algorithm histories that SEO software providers and consultants publish.


But the best way to check whether a historical algorithm update has impacted your website is to use a tool called Panguin Tool (which I adore very, very much).

Panguin Tool overlays your organic traffic against algorithm updates that it records in its database. If you see a drop in organic visits immediately after one of the tool’s indicator lines, then you can safely assume that the update was the cause of the drop.

And, if Google hasn’t been open about what changed in the algorithm update, you can rely upon several SEO analysts and software companies to speculate on what may have happened.

Glenn Gabe produces some of the best algorithm analysis available, and the software companies Sistrix and Searchmetrics also regularly publish excellent data-based analysis.

Screenshot of Glenn Gabe's algorithm analysis

An example of one of Glenn’s fantastic broad core update analyses

What Should I Do if My Ranking Drops?

As with any software update, Google might roll back some of their changes within a few weeks of changing the algorithm. So, if you have had a ranking decrease (or increase), you might see it reversed before any lasting damage is done.

Having said that, since Broad Core updates became a thing, it’s become increasingly common for ranking changes not to be reversed until the next Broad Core update (or the one after that).

Every time there’s a new Broad Core update, we read new analyses of websites that have been heavily penalised during a big update six or twelve months prior, which highlights two things:

  1. It can take a long time to reverse ranking drops
  2. The best time to take action to reverse a drop is now

If your website’s ranking has dropped, you need to take immediate action to reverse it.

You may not have done anything “wrong” (i.e. something forbidden by Google’s webmaster guidelines), like Black Hat SEO, but it’s quite probable that Google has detected something they don’t like, and you’ll have to put that right.

You need to review your website and make sure there are no significant issues that could have caused a drop.

Let’s take a look at the twelve things you should check first.

Heading image with the text, "What To Do If Your Google Ranking Drops"

What to Check on Your Website

  1. Is your website indexed?
  2. Is your website crawlable?
  3. Is Googlebot’s IP being blocked?
  4. Are there accidental noindex or nofollow Tags?
  5. Are there erroneous HTTP(S) redirects?
  6. Are there incorrect canonical tags?
  7. Are there incorrect hreflang tags?
  8. Is your website mobile-friendly?
  9. Does your website have crawl errors?
  10. Is your website healthy?
  11. Has your website had a manual action?
  12. Have you lost backlinks?

1. Is Your Website Indexed?

If your ranking has dropped for one search query, it could be symptomatic of all your pages dropping out of the index entirely.

The best way to check is via a site: search in Google.

To do a site: search, type site:yourdomainname.com into Google, and the results will either show you all your pages or show you nothing at all.

If you see nothing at all, your website has been pulled from the index.

It can take some time for URLs to drop out of Google’s index, but it’s not unheard of for a website to drop out of the index entirely because of some fault on the website.

One of the most common faults is a website incorrectly telling Google not to crawl it, which is covered next.

Screenshot of a Google site: search for the Thinkplus website

2. Is Your Website Crawlable?

If Google’s web crawling robot (Googlebot) isn’t allowed to crawl your website, they’re not going to know that you want it listed in their index.

Logical, right?

If your website, intentionally or accidentally, tells Google not to crawl it, they normally won’t. They tend to respect rules like that (but, if we’re honest, it’s not guaranteed).

Most websites have a robots.txt file, especially those built with Content Management Systems, like WordPress.

This simple text document tells web crawlers (robots) whether they can crawl a website or not and whether they should ignore any pages on it.

However, whilst this file is incredibly simple, it’s also incredibly sensitive to mistakes.

When edited incorrectly, the robots.txt file can severely impact a website and make entire website sections — or the entire website — disappear from Google’s search results.

We recommend that it’s not “played” with by anyone other than someone who has experience doing so, such as an SEO Specialist (you know, like the ones on our team).

If you absolutely must update it yourself, then I’d recommend reading extensively about it beforehand and that you test it with a robots.txt testing tool, like this fantastic one built by the team at Merkle.

 

Screenshot of Merkle's robots.txt validator and testing tool

Merkle’s robots.txt testing tool is incredibly useful

 

An alternative robots.txt testing tool is Screaming Frog’s SEO Spider web crawling tool (here’s their handy guide).
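If you’re comfortable with a little scripting, you can also run the same check yourself: Python’s standard library includes a robots.txt parser that will tell you whether Googlebot is allowed to fetch a given URL. Here’s a minimal sketch (the example.com addresses are placeholders for your own domain and pages):

```python
from urllib import robotparser

# Point the parser at your live robots.txt file (placeholder domain).
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Test the pages that matter most to you against Googlebot's rules.
for url in ["https://example.com/", "https://example.com/services/"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED for Googlebot'}")
```

If any of your key pages come back as blocked and you didn’t block them on purpose, that’s your smoking gun.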

It’s also worth reading a few good guides on how the robots.txt file works before you make any changes.


3. Is Googlebot’s IP Being Blocked?

Googlebot visits websites in the same way you and I do. It visits a page through a “browser” (of sorts) and expects to see what you and I see. The only difference is that it runs from a server in a warehouse and not from an office desk.

Primarily, Googlebot runs from servers in the US, but it also runs from servers worldwide. It does this for a specific reason: If your website changes what it shows to a visitor, based on the location they’re visiting from, Googlebot needs to know that and see the difference.

For example, if you own a website that serves both the US and European markets, you’ll probably show different products and prices to each market based upon the IP address of the shopper. Google needs to see those differences, so they visit your website from a European-based server instead of a North American-based one.

It makes sense, right?

Now let’s say that your website’s server blocks visitors from Europe because of GDPR (ick). Not only would those visitors be blocked, but so would Googlebot.

If Googlebot can’t see your European products and prices, then it might choose to do one of two things:

  1. Show the US products and prices in the search results instead. That’d be bad for customer experience because people visiting from Europe would most likely see an “I’m sorry, because of GDPR we cannot show you this website” warning page
  2. Not show your website in the search results at all

Google has a very short page about serving “locale-adaptive pages” and how Googlebot crawls them from different locations/IP addresses. It’s worth checking out.

So you really shouldn’t block IP addresses, especially Google’s IP addresses.

If you want to check whether your website is blocking them, you should speak to your developer and possibly your hosting company. They can do checks to see whether IP blocks are in place, including doing Googlebot verification checks.
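For context, Google documents a verification method for Googlebot: a reverse DNS lookup on the visiting IP address should return a hostname ending in googlebot.com or google.com, and a forward lookup on that hostname should resolve back to the same IP. Here’s a rough sketch of that check in Python, in case your developer wants a starting point:

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Verify an IP using Google's documented reverse/forward DNS check."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
    except socket.herror:
        return False
    # Genuine Googlebot hostnames end in googlebot.com or google.com.
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname should resolve back to the same IP.
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False

# An address from Google's crawl range, used here purely as an example.
print(is_real_googlebot("66.249.66.1"))
```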

And if you’re sad, like me, you can even do an IP location check and see where the servers are located.

Screenshot of Googlebot's IP location

Here we can see Googlebot’s location in Kansas via whatismyipaddress.com

4. Are There Accidental Noindex or Nofollow Tags?

Noindex and nofollow tags are useful for telling Google when to ignore pages and links on your website, but, used incorrectly (or accidentally), they can have disastrous consequences for your website.

Noindex tags tell search engines not to list a page (or website) in their index (see Google’s guide).

Nofollow tags tell robots not to follow links on a page to the linked-to page. There are two primary reasons this tag is used:

  1. To stop search engines from finding pages a website owner doesn’t want indexed in the search results
  2. To tell search engines that the linking website doesn’t want to be associated with the linked-to website

These tags are rarely added to a website or page by default. If they’re added, it’s done deliberately by changing a setting on the backend.

For example, if you wanted to noindex a page using the Rank Math SEO plugin, you’d have to do so manually.

But accidents can happen, and noindex and nofollow tags get added from time to time.

What you need to do is check whether your website has any of these accidentally added. In my opinion and experience as an SEO consultant, the best tool for this is Screaming Frog.


Take a quick start lesson in Screaming Frog auditing

Screaming Frog is an essential SEO tool, both for quick checks and for detailed site auditing.

I’d highly recommend learning how to use it, even if you’re using it alongside another site auditing tool (like Sitebulb).

All you need to do is crawl your website and, once it’s done, use the noindex filter to see which pages carry the tag. You may have added some purposefully, but there might be others that you didn’t. If so, look into why they were added and remove them.

The same goes for nofollow tags. Ensure that you haven’t accidentally set all of your internal links to your most important pages to nofollow.

It’s worth noting that Google now treats nofollow only as a “hint” as to whether they should follow a link or not. That means they might ignore the tag and crawl a link anyway. You might get lucky with any accidental ones, but it’s better not to leave it to chance.
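If you only need to spot-check a handful of key pages rather than run a full crawl, a short script can flag any robots directives found in a page’s meta tags or its X-Robots-Tag response header. A minimal sketch, assuming the requests and beautifulsoup4 packages and placeholder URLs:

```python
import requests
from bs4 import BeautifulSoup

def robots_directives(url):
    """Return any indexing directives found on a page."""
    resp = requests.get(url, timeout=10)
    found = []
    header = resp.headers.get("X-Robots-Tag")  # directives can live in headers too
    if header:
        found.append(f"header: {header}")
    soup = BeautifulSoup(resp.text, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        found.append(f"meta: {meta.get('content', '')}")
    return found

# Placeholder URLs: swap in your most important pages.
for url in ["https://example.com/", "https://example.com/services/"]:
    directives = robots_directives(url)
    if any("noindex" in d or "nofollow" in d for d in directives):
        print(f"WARNING {url}: {directives}")
```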

What’s the difference between Indexable and Non-indexable?

Here’s a brief video by the Screaming Frog team explaining how indexable and non-indexable pages can be found by using their tool:

5. Are There Erroneous HTTP(S) Redirects?

HTTPS stands for Hypertext Transfer Protocol Secure. It’s a fundamental building block of how the internet works.

You might be more used to seeing HTTP than HTTPS, but HTTPS should now be the default every website uses.

The objective of HTTPS is to pass data between a server and a browser more securely than the classic HTTP protocol does.

Adding the secure part to the protocol means URLs have to change from http://example.com to https://example.com. When done incorrectly, this change breaks websites or, in some cases, causes severe ranking problems.

A ranking drop may occur when the website’s developer hasn’t added redirects to send people (and robots) from the HTTP address to the HTTPS address. The result is that the website can be accessed via both protocols.

For example, if you go to http://example.com/ you end up on a non-secure page. But if you go to https://example.com/ it’s secure. Same page — one secure and one not.

The problem is that, to Google, those are two different pages. Two different websites.

  1. http://example.com
  2. https://example.com

When you start to include the www. prefix, things get worse. You end up with four pages (or websites).

  1. http://example.com
  2. http://www.example.com
  3. https://example.com
  4. https://www.example.com

So rather than having one website to crawl, Google crawls four.

Now imagine your website has 10,000 URLs. Without the redirects in place, Google would need to crawl 40,000 URLs and decide which one is the right one to index.

The last thing we want is for Google to decide which page to index. We want as much control as possible over which pages are appearing in the search results.

So one reason your ranking may have dropped is that Google can’t determine which URL to rank, all because of a few missing redirects.

The best way to check whether you have the right redirects in place is to use an SEO spider tool, like Screaming Frog or Sitebulb, and see whether the URLs are accessible via the different URL prefixes (HTTP or HTTPS and with or without the www.).

In some cases, you might need to create a record of all your URLs, change them in Google Sheets (or Excel, eww), and then upload them into your SEO spider to crawl. It’s laborious, and it takes a bit of time, but sometimes it can be worthwhile.

My preferred starting point is to check the home page URL via a free-to-use online tool called HTTP Status, built by Sander Heilbron. If there’s a problem with the home page, then it’s likely that the rest of the website also has problems.
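If you’d like to replicate that check yourself, the sketch below (using the requests package and a placeholder domain) fetches all four homepage variants and prints each redirect chain. Ideally, three of the four should 301 to the single canonical version:

```python
import requests

# The four ways a homepage can typically be reached (placeholder domain).
variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in variants:
    resp = requests.get(url, timeout=10)  # follows redirects by default
    chain = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
    print(f"{url}\n  {chain or 'no redirect'} -> final: {resp.url} ({resp.status_code})")
```

If all four return a 200 with no redirect hops, you’ve found the missing-redirects problem described above.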

6. Are There Incorrect Canonical Tags?

A canonical tag is a code that tells search engines that one page is similar or near-identical to another.

Canonical tags are great for when you have multiple pages similar to each other, such as the identical running shoe in multiple colours. They tell Google that the pages are similar, but not copies and that only one of the variants should be listed in their search results.

Without canonicals, Google might presume that all of the pages are duplicates and elect to rank none of them at all.

One historical use of canonical tags was for websites designed for mobile and hosted on a separate subdomain, such as https://m.example.com.

Another reason that they’re used is to avoid Google wasting resources crawling unimportant pages of your website, such as those created by UTM parameters.

Parameters are very common on some e-commerce platforms, where they’re used to change or filter category and product pages by colour, size, shape, material, etc.

Your website might have dropped down the rankings because canonical tags have been incorrectly added to it.

One example might be that all of your key money pages, such as your service or category pages, have been canonicalised back to your homepage. This would effectively tell Google that none of your other pages are important and that they should rank the homepage instead. Far from ideal.

If I’m searching for a new pair of walking boots, then the best shopping experience isn’t me arriving on your homepage and then having to find the right page hidden within your menu.

The best shopping experience is me searching for “walking boots” in Google, clicking on your website, and being taken directly to your walking boots category page so I can start browsing.

Just imagine if I could walk into a supermarket right now and be transported from the entrance doorway directly to the right aisle of the supermarket and directly in front of the grocery item I want to buy. Wouldn’t I immediately pick it up and take it to the checkout? You bet I would.

For this reason, you mustn’t incorrectly (or accidentally) canonicalise your most important pages to the wrong URL or your homepage.

Google’s algorithm is getting smarter and has been known to pick the best URL to rank from a range of similar pages without the need for canonical tags. However, every SEO specialist in the world will tell you never to leave Google to decide about anything you can control.

You’ll need to check your canonical tags, and Screaming Frog and Sitebulb are the best tools for doing so.

Some websites will automatically add and self-reference them on every page (if you check the source code of this page, you’ll see that WordPress has automatically added one), so don’t worry if that’s the case. You need to look out for whether your canonical tags are pointing to another page and then determine if that’s the right thing to do.
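For a quick spot check on a few important pages, you can pull out each page’s canonical tag and compare it against the page’s own URL; anything that doesn’t self-reference deserves a closer look. A minimal sketch, again with placeholder URLs and the requests and beautifulsoup4 packages:

```python
import requests
from bs4 import BeautifulSoup

def canonical_url(url):
    """Return the href of the page's canonical tag, if it has one."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag.get("href") if tag else None

# Placeholder URLs: use your key money pages.
for url in ["https://example.com/walking-boots/", "https://example.com/running-shoes/"]:
    canonical = canonical_url(url)
    status = "self-referencing" if canonical == url else "CHECK: points elsewhere"
    print(f"{url} -> {canonical} ({status})")
```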


What are canonical tags?

This video, by Google’s web genius John Mueller, is a great explainer on how canonicals work and how Google uses them:

7. Are There Incorrect Hreflang Tags?

Hreflang tags are similar to canonical tags. They tell search engines that there are other near-identical pages to the one being reviewed, but written in a different language.

Hreflang tags are common on international websites, especially e-commerce sites.

They’re simple to add but easy to mess up, like the other tags mentioned above. If these are poorly implemented, they can severely tank a website’s ranking in one country or several countries all at once.

The last thing you need is for your website to suddenly start telling Google that it shouldn’t rank your South American, Australian, or Indian subdirectories anymore and should instead only rank your Canadian store (which none of the people in those countries can buy from).

Screaming Frog and Sitebulb are the best tools for checking how your hreflang tags are set up, if you have them (or if they have been added accidentally).

If your website is quite large, it might be worth considering an enterprise-level tool like Deepcrawl or ContentKing, as they’ll audit your website at scale. ContentKing, especially, will notify you when tags are suddenly added or changed on your website.
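Between full crawls, you can also list the hreflang annotations on any single page and eyeball whether each language and region points at the URL you’d expect. A rough sketch with a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder: use one of your own pages
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# hreflang annotations live on <link rel="alternate" hreflang="..."> tags.
for link in soup.find_all("link", rel="alternate", hreflang=True):
    print(f"{link['hreflang']:>8} -> {link.get('href')}")
```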

How to add, test, and validate your Hreflang tags

This video by former Search Personality of the Year, Aleyda Solis, takes a really deep dive into what hreflang tags are, how to add them, and how to test and validate them:

8. Is Your Website Mobile-friendly?

54.8% of all global internet traffic in Q1 2021 was on a mobile device. For this reason, Google has switched their search index from desktop-first to mobile-first, which means that they rank websites on their mobile experience, not their desktop one.

If your website is not mobile-friendly, then that might be the reason your rankings have dropped.

There are circumstances where Google will rank a non-mobile-friendly page for a search query if it’s the best answer available, but where possible they’ll rank a mobile-friendly result instead.

You need to check whether your website is mobile-friendly, and the first place you can check is via your phone. Not every smartphone is the same size, so your experience of your website might differ from mine, but it’s a good starting point.

The best way to check is via Google’s mobile-friendly testing tool. It is more convenient than using your smartphone and it also gives you better feedback on why your page or website isn’t mobile-friendly.

It’s useful for testing a few pages at a time, such as when one specific page has dropped down the search result rankings, but less so when looking at thousands of pages at once.

If you do have a problem with mobile-friendliness, then you’ll need to speak with a website developer (or our team) to discuss a redesign of your website or migration from any mobile subdomain setup you may have in place.

How to create a profitable mobile-first website

You can hear our founder, Tim Cameron-Kitchen, discuss profitable mobile-first websites in this podcast from 2019.

9. Does Your Website Have Crawl Errors?

First things first. What’s a crawl error?

Well, a crawl error is an error that happens when a web crawler like Googlebot crawls your website. One common error is a 404, which means that a page is missing. But there are hundreds, if not thousands, that can occur on a website — especially those which aren’t maintained regularly.

If you’re new to SEO and don’t really know your 404 from your 301, I’d suggest looking at the Coverage report in your Google Search Console account.

The Coverage report will tell you which pages Google has experienced problems with when crawling your website, but it won’t tell you how to fix them.

Get to know the index coverage report in Google Search Console

This short video by Daniel Waisberg, a Search Advocate at Google, explains how the coverage report within Google Search Console works.

Instead, I’d suggest crawling your website with a different crawler, such as Sitebulb.

I’d pick Sitebulb in particular because it’s built with many “Hints” and explainer documentation covering why errors occur and how you can fix them.

Take a peek inside of Sitebulb

We use a lot of site auditing tools here at Thinkplus, each with its own best uses, but all of them essential.

Unfortunately, some errors will be caused by the CMS or plugins you’re using on your website, meaning there will sometimes be little you can do to fix them.

Where possible, you should fix as many as you can, but not every fix will result in better page performance or an increase (or recovery) in ranking.

To help you understand which errors are critical to your ranking (especially if you’ve had a drop), I’d recommend using Sitebulb’s inbuilt grading system (from Critical through to Low) to prioritise your errors.

Prioritise your errors based on what you believe will improve visitors’ experience of your website (such as increased site speed or mobile-friendliness), but deal first with any errors that could stop efficient web crawling.
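If you want a quick first pass before (or between) full audits, you can also pull the URLs from your XML sitemap and flag anything that doesn’t return a 200 status code. A minimal sketch, assuming your sitemap lives at the conventional /sitemap.xml path:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://example.com/sitemap.xml"  # placeholder path
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:  # 3xx/4xx/5xx responses all deserve a look
        print(status, url)
```

It’s no substitute for a proper crawler, but it will surface the most obvious broken pages in seconds.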

If you’re working with multiple people, including developers, and need help prioritising the work, I highly recommend this advice from Papier’s Head of SEO, Areej AbuAli, on how to prioritise and get Technical SEO done.


How to get your technical SEO fixes implemented

You can watch Areej’s great presentation about prioritising Technical SEO in this video hosted by Jo Turnbull’s TurnDigi online conference.

Thumbnail for the TurnDigi conference video with Areej AbuAli


10. Is Your Website Healthy?

This is a holistic view of your website: Is your website healthy?

Does it have errors? Is it easy to use? Is it fast?

Across hundreds of client SEO campaigns, we’ve always found that the healthier the website, the easier it is to rank it higher in Google’s search results — which means the opposite is true too.

The more broken and unhealthy a website is, the harder it is to rank at the top of Google and the easier it falls down the search results when algorithm updates happen.

Some website crawling and auditing tools have a health score included, typically a 0-100 score.

Now, I don’t necessarily think these scores are the best thing to follow.

Why? Because they lead us to get fixated on having a “perfect” website when we should be focused on creating the best content and website experience possible.

However, if you want to give yourself a target to work towards (or you have multiple stakeholders involved in your SEO or marketing), having a visual goal can be really useful.

The site auditors built into Semrush and SE Ranking have these scores, and both use the same “Errors” (high priority) versus “Warnings” (low priority) flagging system and descriptions of each issue.

Whatever tools or score you’re using, knowing that there are problems on your website is just the first step up the mountain. Getting them fixed is the real big challenge (another good reason to read Areej’s article).


11. Has Your Website Had a Manual Action?

Hopefully, a manual action is the last thing you’ll have to worry about, but it’s worth covering, just in case.

A manual action is when someone at Google manually removes a page or website from their search results. When they do, search visibility and traffic drop off a cliff.

Because of how the algorithm works, Google shouldn’t need to remove anything manually, so you know that when they do have to, it’s because there was something really not cool about the page or website in question.

There are lots of reasons why Google might serve a manual action, including:

  • Your website has too much spammy content
  • There are too many dodgy links pointing at your website
  • There are sneaky redirects on your website
  • There are violations of Google’s usage policy
  • Google are nasty and spiteful

Typically, Google doesn’t hand out manual actions unless it really thinks a website deserves one, i.e. you’re doing something you shouldn’t be. But accidents happen, and there might be malicious content on your website that you’re not even aware of, perhaps because someone hacked it.

The solution for manual actions is to resolve the issue(s) that Google flagged and submit your website for review.

Once submitted, reviews can take anywhere from days to weeks to be completed and approved, and rankings restored. In recent years, it’s become common for manual action ranking drops not to be reversed until the next broad core update.

You can check your manual action status via this link to your Search Console account. Then start reviewing your website against Google’s Webmaster Guidelines documentation and Google’s Search Quality Evaluator Guidelines (which are used by Google’s testing team).

Reversing a manual action is never a five-minute job. There’ll be a lot of work involved, so ensure that you’re getting the right support from your in-house SEO specialist or SEO agency partner.

12. Have You Lost Backlinks?

As covered before, Google’s algorithm is complicated and multifaceted, but we can pretty much guarantee that backlinks are a big part of it.

With that certainty, we can assume that losing backlinks has a negative impact on ranking.

Link attrition (backlinks getting removed or deleted) is normal. Links die because websites die, get updated, or are redirected to new domains. It’s expected. But if the wrong links are lost (i.e. juicy ones from high-value domains) and not restored (or replaced with equally strong ones), then a core pillar of strong ranking is lost.

Fortunately, almost every backlink tool (like Majestic) records when links are lost so you can automate your detective skills and see when any of your backlinks go missing.

What you need to do next is to either create new links or reclaim those lost links.

Most of the time, links get removed purely by accident. You update an old blog post with some new information and, whilst some copying and pasting is going on, the link gets cut. Whoops.

So your ranking might have dropped because you lost your juicy links. If that’s the case, reclaim them or replace them with newer, stronger links (don’t buy dodgy ones).

How to reclaim your lost backlinks

This video from 2013, whilst quite old in SEO terms, is still as relevant and applicable for reclaiming lost backlinks (and brand mentions) as it was then.

What You Need to Do Next

  1. Check that your content matches search intent
  2. Improve your content
  3. Earn new backlinks
  4. Be patient

1. Check That Your Content Matches Search Intent

When you and I search for something online, we have intent. We want to find the answer to our question or the solution to our problem. For that reason, a business website’s content should match the intent of people searching for their products or services.

There are four different types of search query, each with its own intent:

Navigational queries — These queries happen when people want to go somewhere online. For example, “Gmail” when you want to check your email or “Facebook” when you want to see what your aunt had for lunch.

Informational queries — These are research queries. When you want to be smarter with your money, you might search for “what is an ISA?” or “how to sell my kidney for cash”.

Commercial queries — Commercial queries relate to researching how to spend your money. For example, when comparing blenders, you’d probably search for “best blender under £100” or “nutribullet pro 900 review 2021”.

Transactional queries — You use transactional queries when you’re ready to buy the product you’ve researched, like “buy nutribullet 900” or “buy nutribullet 900 free delivery”.

The problems with ranking occur when the content written for a page doesn’t match the intent of the person searching.

For example, if I search for “buy queen size mattress” and the first result is a mattress comparison article, that article doesn’t match my intent. My intent is to buy, not to research about buying.

Knowing the right intent for your content can be tricky, especially as you become incredibly used to your products or services. What you think is the right copy for a page might not be right for the search queries that bring people to that page.

E-commerce product category pages are another example. Because of old SEO myths that “pages need a minimum of 500+ words”, horrendous copy gets written for category pages about the history of the product, who invented it, where in the world the materials are sourced from, and so on.

What people really want to see are the products themselves.

They want to know:

  1. Which products are available
  2. If they come in different colours, shapes, and sizes
  3. If they’re in stock
  4. How much they cost

Linking to a “How To Choose The Right Running Shoes” guide from a category page makes for a much better shopping experience: I’d have the option to use it if it applies to me, whereas if I’ve already done my research, all I want to see are the products themselves.

The intent of the copy should match the intent of the search. A good way to find that out is to talk to your customers (both those who haven’t yet converted and those who have).

Your target market will have problems they’re trying to overcome, and they’ll know which search queries they used to bring themselves to your website.

If they used Google to find your website, you’ll be able to use the Performance report in Search Console to see which queries they used to find each page of your website. You can then compare those queries against those URLs and update each page accordingly.

If they didn’t use Google to find you, no problem. Get them on the phone for 15 minutes (in return for a gift card/voucher) and quiz them on how they know about your business and, most importantly, the problems they have or had.

Next up, you’ll have to improve your content.

How to analyse searcher intent

The Ahrefs team have hundreds of great videos about SEO, including this one about what search intent is.

2. Improve Your Content

It’s not enough to have your content match search intent. Your content also has to better match that search intent than all the other URLs in the search results.

I use a really, really simple process for improving our content. Here’s how I do it:

  1. Search for the search queries I want my article to rank first for
  2. Make a list of the top twenty ranking results
  3. Take note of the recurring themes, headings, and questions in all of them
  4. Reformat all of that info into my existing article outline

That’s it. Super simple.

You look at what’s ranking and make your content an upgrade on the other pages.
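Step three of that process (noting the recurring themes and headings) is easy to semi-automate once you’ve copied the top-ranking URLs out of the search results. This sketch (placeholder URLs, with the requests and beautifulsoup4 packages assumed) counts the subheadings that recur across competing pages:

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Placeholder list: paste in the top-ranking URLs you noted in step two.
competitors = [
    "https://example.com/what-is-bounce-rate/",
    "https://example.org/bounce-rate-guide/",
]

headings = Counter()
for url in competitors:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for h in soup.find_all(["h2", "h3"]):
        headings[h.get_text(strip=True).lower()] += 1

# The most common headings hint at the themes your outline should cover.
for text, count in headings.most_common(20):
    print(count, text)
```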

The catch is that you shouldn’t make your content longer just for the sake of being the most complete and in-depth. There’s no point writing incredibly dense content that’s 10,000 words long if a simple 200-word article answers the question better.

For example, if I’m writing about “what bounce rate is” I could go into depth. I could talk about what it is, how to optimise for it, how to troubleshoot all the different errors there can be, what a good average bounce rate should be, what the average bounce rate by industry is, and so on.

But if I’m searching for just “what is a good bounce rate” I might not care (yet) about how to troubleshoot it. I might like a link to an article about troubleshooting bad bounce rates that I can read later. But for now, I’m only focused on answering my current question (“what’s a good bounce rate?”).

Your content should be best for the search queries and intent you’re trying to answer. It doesn’t have to be the best content for every possible search query.

Caveat: if it fits the context of the piece, it makes sense to include information about other queries. More than anything else, you want to give people the best experience possible, even if that means ranking second or third and not first.

If you’re looking for something a little more advanced regarding content outlines, this Twitter thread takes a really interesting look at how you can use AI within content research and creation.


How to improve your content with SEO in mind

This video by the team at Surfer offers fantastic insight into how you can improve your content with SEO in mind (whilst still writing for humans first).

3. Earn New Backlinks

SEO and link building are never one-and-done activities. Great links built today help your website today, but tomorrow they’ll be old backlinks, and they’ll no longer have their shine.

It’s best to earn new backlinks constantly through great content creation and promotion.

There are lots of great ways to earn new backlinks, and I very much mean earn. You can do outreach and let other website owners know about the new content you’ve created and see if they’d be interested in making it known to their readers, too (by linking to it). That’s perfectly fine. It’s buying low-value backlinks that you need to avoid.

50 Free Ways To Build Backlinks

There are hundreds of ways you can earn backlinks for your website, but if you’re limited for time or money, I recently ran a live stream where I outlined fifty different ways you can build new links.

4. Be Patient

I hate to say it, but sometimes the best thing you can do is wait it out.

There might be nothing wrong with your website at all. Google could have updated their algorithm and, through some error, your website got unfairly dropped down the ranking. It can happen.

Sometimes a website might have to wait until the next broad core algorithm update to see a restoration of lost rankings, sometimes up to a year after the drop.

The right response to a ranking drop is positive, immediate action:

  • Check everything about your website
  • See what needs improving or replacing entirely
  • Check your onsite SEO and offsite SEO
  • Check your backlinks (and build new ones)

Do every check outlined above and work to improve your content and link building strategy continuously, and you’ll see your rankings improve, guaranteed.

How Long Does It Take to Recover from a Ranking Drop?

We’ve seen ranking drops reversed overnight, but this is mostly the case where a website has mistakenly blocked Googlebot or has accidentally been noindexed sitewide.

Historically, we’ve seen websites receive ranking corrections within two weeks of a ranking drop (or increase). However, since the Broad Core update name was coined in 2018, we’ve found that ranking drop recovery sometimes doesn’t occur until the next Broad Core update (three to six months later).

So it can take days, weeks, (mostly) months, and (sometimes) years to recover from a ranking drop — but that’s only going to be possible if you fully audit your website and make the improvements necessary to earn that ranking reversal.

The Importance of Diversified Traffic Channels

We adore SEO because we’ve proven time and again that it’s the best source of highly qualified revenue-increasing traffic for businesses.

But, even so, we still strongly advise that all businesses diversify their traffic sources so that, should a ranking drop occur, they still have a steady stream of traffic to convert into leads and sales.

If you’ve not yet diversified your traffic, I recommend adding Pay-per-Click Advertising and Social Media Marketing to your digital marketing strategy.

Other channels, like Affiliate Marketing, can be great too, but we’d recommend adding PPC, Social Media, and Email Marketing to your mix first.

Create a Multi-channel Marketing Strategy in 5 Steps

This video by Tim outlines how you can turn all of your marketing ideas and tasks into an actionable digital marketing strategy.

Still Not Sure Why Your Ranking Dropped?

If you’ve been through all the possible reasons why your Google rankings have dropped, and you’re still not sure why, then the best thing to do is to speak to a qualified SEO expert. Fortunately, our team is full of them.

If you’d like our team to take a look at your website, all you have to do is ask. Submit your website for a review, and we’ll not only take a look at your website’s ranking, but we’ll also highlight how else you can increase your traffic, leads, and sales.

Don’t hesitate. Request a free website and marketing review now.