Blog

How to Recover From a Sudden Drop in Website Traffic

22 March 2024
Matt Kohli
Last updated: 20 May 2024
Download my Traffic Recovery Checklist

Experienced a sudden drop in website traffic? There are many potential causes and it can sometimes be a bit daunting to know where to start.

In my role as an SEO Consultant I’m tasked with this challenge fairly regularly, and despite how often I look at this kind of thing, the causes and fixes are rarely the same. “It depends” is the SEO go-to conclusion/reason/excuse, although I’ve got to stand up for my colleagues here because they’re not wrong!

The aim of this guide is to give you some clear steps to follow so you can validate and fix traffic drops. Now I’m not saying this will contain solutions for every case, although it should at the very least give you some direction. The steps I typically go through (summarised) are as follows:

  1. Check that the drops are real
  2. Understand where the drops have occurred 
  3. Identify and resolve the issue(s)
  4. Give the fixes time to work
  5. Test and refine

To help you navigate through this hefty article, I’ve put together a downloadable Website Traffic Recovery Checklist that lays out all the checks – anyway, let’s get started!

1. Check that the website traffic drops are real

The first step is to validate the drops. The list of tools that measure website traffic is endless, with third-party tools such as Semrush and Ahrefs often taking centre stage due to their ease of use and popularity – although my recommendation is to use Google Search Console (GSC) and Google Analytics (GA) as much as possible as they provide you with first-party data.

For this initial check, open up Google Search Console and select the correct property, then open up the performance report:

[Screenshot: the performance report in Google Search Console]

For established websites with a few years of data I would filter the report to a 2 year date range as this allows you to factor in seasonal drops and spikes more accurately. For new sites, just use whatever you have at your disposal – a few months of data can still teach you a lot.

Here’s a GSC screenshot of a traffic nosedive in May 2023:

[Screenshot: Google Search Console performance report showing a traffic nosedive in May 2023]

This quick check confirms that the drop is real and, given its severity, should be addressed as a priority. Most drops are less obvious, although you get the idea. If you are lucky then you will see no large drops, and if you are really lucky you may even see growth! If that’s the case, take a screenshot of the report and share it with the relevant people to alleviate their concerns. If not, then keep going…

2. Understand where the sudden drops have occurred

I stick with Google Search Console to understand which keywords and pages are growing, and which are slowing. In situations where big traffic drops are experienced, this is a quick way to see where exactly the drops have occurred. 

Open up your property and select “full report” on the right of the main performance chart:

[Screenshot: the “full report” link to the right of the main performance chart]

This will open up the performance report. Update the date range to “Compare last 6 months to previous period” and hit “Apply”:

[Screenshot: the date range filter set to “Compare last 6 months to previous period”]

This will help you highlight the biggest page and keyword culprits. From here you can investigate the specific page or keyword and try to understand why the drops have occurred:

[Screenshot: pages and queries compared across the two periods]
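
If you prefer to work outside the GSC interface, you can export the Pages report for both periods and compare them with a few lines of Python. This is a minimal sketch, not a definitive implementation: the filenames and column names are placeholders, so rename yours to match whatever your export produces.

```python
# A minimal sketch: compare two Google Search Console "Pages" exports to find
# the URLs that lost the most clicks. Assumes each CSV has "page" and "clicks"
# columns (GSC export headers vary, so rename yours to match).
import pandas as pd

current = pd.read_csv("pages_last_6_months.csv")       # hypothetical filename
previous = pd.read_csv("pages_previous_6_months.csv")  # hypothetical filename

merged = current.merge(previous, on="page", how="outer",
                       suffixes=("_current", "_previous")).fillna(0)
merged["click_change"] = merged["clicks_current"] - merged["clicks_previous"]

# The 20 pages that shed the most clicks between the two periods.
print(merged.sort_values("click_change").head(20)[
    ["page", "clicks_previous", "clicks_current", "click_change"]])
```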

3. Identify the cause(s) – Diagnosing traffic drops

Here comes the long bit… understanding why the drops have occurred. As I mentioned, there are many reasons that drops occur. Let’s take a look at some of the common things I check for:

Algorithm updates

Algorithm updates are changes that Google makes to its search engine’s algorithm, which determine the ranking of sites on search engine result pages (SERPs). These updates can have a significant impact on your website traffic and conversions. 

In the early days of SEO, updates were feared by everyone, and often resulted in industry-wide volatility. To add insult to injury, two of the most famous and impactful updates were named after adorable creatures – Penguin and Panda.

Whoever made this was clearly not a fan of Panda:

[Image: an illustration taking aim at the Panda update]

Fast forward to now. The ground isn’t shaking as violently as in the early days. That’s not to say it’s completely stable – Google still rolls out updates, but they tend to fine-tune the algorithm rather than overhaul it.

May 2024 Update – AI Overviews, More Updates, More Sites Being Penalised
It would be tone deaf of me to ignore the fact that many sites in recent months have experienced the full wrath of Google (depicted in the terrifying graphic above). The advice in this section still applies, and you’re definitely not in the ‘minority’ if your site has been hit recently. Wishing you a speedy recovery – Matt

To keep on top of Google algorithm updates I would advise bookmarking the Google Search Status Dashboard and the Semrush Sensor. They both provide details and documentation on recent and upcoming updates. 

To quickly see if an update has affected your site, match up the dates. Check the Google Search Status Dashboard or Semrush Sensor for the dates of recent updates, and compare them to when your site had drops. If there’s a correlation, then the update might be your culprit.
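
If you want to make that date-matching a little more systematic, here’s a rough Python sketch. It assumes you’ve exported daily clicks from GSC to a CSV with “date” and “clicks” columns (rename yours to match), and the update dates shown are illustrative placeholders to be replaced with the real ones from the Google Search Status Dashboard.

```python
# A minimal sketch: flag sharp week-on-week click drops that land within a few
# days of a known Google update. The update dates below are placeholders; take
# the real ones from the Google Search Status Dashboard.
from datetime import timedelta
import pandas as pd

updates = pd.to_datetime(["2023-03-15", "2023-08-22", "2023-10-05"])  # example dates
daily = pd.read_csv("daily_clicks.csv", parse_dates=["date"]).sort_values("date")

daily["weekly_clicks"] = daily["clicks"].rolling(7).sum()   # 7-day rolling total
daily["wow_change"] = daily["weekly_clicks"].pct_change(7)  # week-on-week change

drops = daily[daily["wow_change"] < -0.20]  # weeks down 20%+ on the prior week
for _, row in drops.iterrows():
    if any(abs(row["date"] - u) <= timedelta(days=7) for u in updates):
        print(f"{row['date'].date()}: clicks down {abs(row['wow_change']):.0%} "
              f"week on week, within 7 days of a known update")
```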

So, you can see that you have been impacted by an update. The next step is to understand why and build a roadmap to recovery. The general guidance is to review your existing content and optimise or remove any pages with repetitive copy, excessive ads, or just a general lack of quality – a common culprit of this can be low-value local landing pages. 

To recover from update hits, you should prune or update low quality pages (as mentioned above) and focus on clearly demonstrating Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T). This is your key for showing users and Google that you are a reliable source in your industry or topic area. 

Google has highlighted several questions website owners should be asking themselves when reviewing existing content. Review their documentation and keep yourself close to any new information that they release – it’s clear that E-E-A-T is only going to become more important, and rightfully so!

Tracking errors

Sudden drops in traffic could be due to problems with your tracking setup. Data inaccuracy can mislead website owners into a false sense of security or a false sense of trouble – it’s always worth keeping on top of this.


Check that the website in question has a Google tag (gtag.js) or Google Tag Manager code snippet on every page, and in the correct place. The gtag.js script should be placed in the <head> section of every page on your website to ensure it captures all user interactions that take place.

  • Browser Inspection: Open your website, right-click and select “Inspect” or “Inspect Element”. Navigate to the “Elements” or “Sources” tab and search (Ctrl + F or Cmd + F) for gtag.js or the Google Tag Manager ID (e.g., GTM-XXXXX).
  • Google Tag Assistant: This is a Chrome extension by Google that helps identify and troubleshoot tag installations. Once installed, it will show which tags are on a page and if they’re working correctly.
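
If you’d rather script this check across a batch of pages, a few lines of Python will do the job. This is a minimal sketch with placeholder URLs and markers; note it only confirms the snippet is present somewhere in the HTML, not that it sits in the <head>.

```python
# A minimal sketch: fetch a handful of URLs and check whether a Google tag
# (gtag.js) or Google Tag Manager snippet is present in the HTML.
# The URL list is a placeholder - swap in your own pages.
import requests

pages = ["https://www.example.com/", "https://www.example.com/contact/"]
markers = ["googletagmanager.com/gtag/js", "googletagmanager.com/gtm.js", "GTM-"]

for url in pages:
    html = requests.get(url, timeout=10).text
    found = [m for m in markers if m in html]
    print(f"{url}: {'OK, found ' + ', '.join(found) if found else 'no Google tag found'}")
```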

The cleanest solution is to add your main Google Analytics Tag with Google Tag Manager. Here are the steps you need to follow:

In GTM, navigate to your account and container, then select “Add a new tag”:

[Screenshot: “Add a new tag” in Google Tag Manager]

Click on “Tag Configuration” and choose Google Analytics: GA4 Configuration:

[Screenshot: Tag Configuration with “Google Analytics: GA4 Configuration” selected]

Enter your “Measurement ID”:

[Screenshot: the Measurement ID field]

Typically, you’d want the box checked to “Send a page view event when this configuration loads.”

Choose where you want the tag to fire on your site under the “Triggering” box.

Save your tag and publish your GTM container.

After ensuring the code is on your site, visit your website from a different device or browser. Check the “Real-time report” in Google Analytics 4 to see if your visit is recorded. If it is, the code is likely working:

[Screenshot: the Real-time report in Google Analytics 4]

Robots.txt errors

Robots.txt is a text file placed in the root directory of a website. It instructs search engine crawlers which pages or sections of the site they should not crawl.

This is what a basic Robots.txt file looks like:

[Image: an example of a basic robots.txt file]

There are a number of robots.txt issues to watch out for. Perhaps the most common when it comes to traffic loss is areas of your site being unintentionally blocked from crawling.

Sometimes developers or website managers unintentionally leave the robots.txt file unchanged after moving from a development or staging website. Your website may have a robots.txt file configured like the following example; you want to ensure this rule is not present:

[Image: a robots.txt file containing a site-wide Disallow rule]

The difference is subtle, although a rule like “Disallow: /” would prevent search engines from crawling any of your content. You’ll need to remove the Disallow rule and resubmit your robots.txt file through Google Search Console and the robots.txt tester.
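
A quick way to sanity-check your live robots.txt is with Python’s built-in parser. This is a minimal sketch with placeholder URLs; swap in your own domain and a handful of important pages.

```python
# A minimal sketch: use Python's built-in robots.txt parser to confirm that
# Googlebot is allowed to crawl a few important URLs.
# The domain and URL list are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED by robots.txt'}")
```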

As mentioned above, there are plenty of Robots.txt errors to be aware of. Here’s a list to refer back to:

  • Robots.txt Not In The Root Directory: The robots.txt file should be placed in the root directory of your website. If it’s in a subdirectory, search engines will ignore it. To fix this, move the robots.txt file to the root directory.
  • Poor Use Of Wildcards: The robots.txt file supports two wildcard characters: asterisk (*) and dollar sign ($). Misuse of these wildcards can lead to unintended restrictions. To fix this, ensure that wildcards are used correctly.
  • Blocked Scripts And Stylesheets: Blocking crawler access to external JavaScripts and stylesheets can lead to issues, as Googlebot needs access to these files to view your pages correctly. To fix this, ensure that necessary CSS and JavaScript files are accessible.
  • No Sitemap URL: Including the URL of your sitemap in your robots.txt file can help search engine crawlers understand the structure of your site. While not an error, it’s beneficial for SEO purposes.
  • Access To Development Sites: It’s essential to block crawlers from indexing pages under development and to remove the block once the site is live. Forgetting to do so can prevent your site from being indexed correctly.

Redirect errors

Many websites, especially larger ones, use redirects. These are often set up using a .htaccess file or, for WordPress users, with a plugin for convenience. HubSpot has published a guide to the best redirect plugins for exactly this purpose.

Regardless of how you deploy your redirects, it is of key importance to test them. This is even more important if you’re adding high volumes of redirects at once. 

To understand whether there are broken redirects on a website I use Screaming Frog, a well-known website crawler. There are many web crawlers to choose from, although my recommendation is either Screaming Frog or Sitebulb.

In Screaming Frog, you select “spider” mode and type the URL into the search bar at the top. Once you have configured your crawl you press “start”, and when SF is finished it will give you a list of “issues” to review – this is where you will discover any broken redirects:

[Screenshot: the issues list in Screaming Frog]

If you are adding redirects and would like to check if they work correctly first, you switch SF to list mode (Mode > List), paste the URLs you’re redirecting, and then check their response codes and final destinations.
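
If you’d rather script that check, a short Python snippet can follow each redirect and report the chain for you. This is a minimal sketch with placeholder URLs.

```python
# A minimal sketch: request each old URL, follow the redirects, and print the
# final destination, status code, and number of hops - the same information
# Screaming Frog's list mode gives you. The URL list is a placeholder.
import requests

old_urls = ["https://www.example.com/old-page/", "https://www.example.com/old-category/"]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [r.url for r in resp.history] + [resp.url]
    hops = len(resp.history)
    print(f"{url} -> {resp.url} ({resp.status_code}), {hops} hop(s)")
    if hops > 1:
        # More than one hop means a redirect chain worth simplifying.
        print(f"  chain: {' -> '.join(chain)}")
```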

Fortunately, the latest version of Screaming Frog (if this is your tool of choice) provides a description of the issue at hand and instructions for resolving it. Here are some common things to check with redirects:

  • Review Your Redirects: Examine your .htaccess file and any redirection plugins to verify that your redirects are correctly configured.
  • Remove Unnecessary Redirects: Eliminate any superfluous redirects. For instance, if a redirect points to a page that no longer exists, update it to redirect to a relevant page or remove it altogether.
  • Simplify Redirect Chains: If there are multiple redirects in a chain, aim to simplify the process. Direct the initial URL to the final destination directly.
  • Check for Plugin Conflicts: Occasionally, plugins can clash with your redirection setup. Temporarily deactivate plugins and check if the issue persists.
  • Avoid Using Relative URLs: Instead of using relative URLs, always employ absolute URLs in your redirects. This practice helps prevent unintended chains or loops.
  • Test and Monitor: After implementing changes, rigorously test your website to confirm that the redirects are functioning correctly. Continuously monitor for any potential new redirect issues.
  • Utilise a Caching Plugin: A caching plugin can enhance your site’s performance and diminish the necessity for redirects, thereby improving the overall user experience.

Indexing errors

Using Search Console, open up the Page indexing report (formerly the Index Coverage report) and check for any URLs that have an Error. These pages are being prevented from appearing in search results due to the affiliated error.

You should always verify the indexing errors you see in Google Search Console. To do this, simply type “site:www.page-in-question” into Google Search and you will see whether the page is visible in results or not. This is what should appear if it is not indexed:

[Screenshot: a site: search returning no results]

These are the most common errors and statuses in Google Search Console’s page indexing report:

  • Server error (5xx): 500-level error returned by the server.
  • Redirect error: Issues like long redirect chains, redirect loops, or bad URLs in the redirect chain.
  • URL blocked by robots.txt: Page blocked by robots.txt, but might still be indexed in rare cases.
  • URL marked ‘noindex’: Page has a ‘noindex’ directive and wasn’t indexed.
  • Soft 404: Page returns a user-friendly “not found” message without a 404 HTTP code.
  • Blocked due to unauthorised request (401): Page blocked to Googlebot due to an authorisation request.
  • Not found (404): Page returned a 404 error without any explicit request.
  • Blocked due to access forbidden (403): Googlebot didn’t provide credentials, resulting in this error.
  • URL blocked due to other 4xx issue: 4xx error not covered by other issue types.
  • Blocked by page removal tool: Page blocked by a URL removal request.
  • Crawled – currently not indexed: Page crawled but not indexed yet.
  • Discovered – currently not indexed: Page found but not crawled.
  • Alternate page with proper canonical tag: Page is an alternate to another indexed page.
  • Duplicate issues: Pages that are duplicates of others and issues related to canonical tags.
  • Page with redirect: Non-canonical URL redirecting to another page.
  • Warning: Issues that don’t prevent indexing but reduce understanding of pages.
  • Indexed, though blocked by robots.txt: Page indexed despite being blocked by robots.txt.
  • Page indexed without content: Page indexed but content couldn’t be read.

Due to the sheer volume of errors in the page indexing report, I feel that detailing every potential solution could overwhelm this article, sorry! For more insights and guidance on how to tackle each issue, I instead recommend you review Google’s Documentation.
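
That said, one of the most common causes in the list above, a stray ‘noindex’, is quick to spot-check yourself. Here’s a minimal Python sketch with a placeholder URL; it’s a simple pattern match, so it may miss unusual markup.

```python
# A minimal sketch: check whether a URL is served with a 'noindex' directive,
# either in a robots meta tag or in an X-Robots-Tag response header.
# The URL is a placeholder and the regex is a simple check, not exhaustive.
import re
import requests

META_ROBOTS = r"""<meta[^>]+name=["']robots["'][^>]+content=["']([^"']*)["']"""

def noindex_status(url: str) -> str:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    meta = re.search(META_ROBOTS, resp.text, re.IGNORECASE)
    directives = (header + " " + (meta.group(1) if meta else "")).lower()
    return "noindex present" if "noindex" in directives else "indexable (no noindex found)"

print(noindex_status("https://www.example.com/important-page/"))
```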

Sitemap errors

A sitemap is a file that lists the pages on a website. It’s used to inform search engines about the structure of the site, ensuring that they can find and index all of the pages efficiently. Issues with your sitemap(s) can result in large traffic drops.

To see the status of your sitemaps, go to the “Sitemaps Report” in Google Search Console:

[Screenshot: the Sitemaps report in Google Search Console]

Here are the main Sitemap errors to be aware of, according to Google’s documentation:

Sitemap Fetch Errors:

  • Google can’t retrieve the sitemap.
  • Blocked by robots.txt.
  • Site has a manual action.
  • Incorrect sitemap URL.
  • General errors preventing retrieval.

Sitemap Parsing Errors:

  • Inaccessible URLs.
  • URLs not followed.
  • URLs not allowed due to hierarchy or domain differences.

Sitemap Specific Errors:

  • Compression issues.
  • Empty sitemap.
  • Sitemap exceeds size limit.
  • Invalid attributes, dates, tags, or URLs.
  • Path mismatch with “www”.
  • Incorrect namespace.
  • Leading whitespace.
  • HTTP errors.
  • Video thumbnail size issues.
  • Video URL discrepancies.
  • Excessive news URLs.
  • Missing <publication> tag for news.
  • Blocked by robots.txt.

And here’s how to go about fixing them:

For Fetch Errors:

  • Ensure the sitemap is not blocked in robots.txt.
  • Resolve any manual actions on the site.
  • Verify and correct the sitemap URL.
  • Check for server availability or other technical issues.

For Parsing Errors:

  • Ensure all URLs are accessible.
  • Avoid excessive redirects and use absolute links.
  • Ensure sitemap URLs match the domain and hierarchy of the sitemap location.

For Specific Errors:

  • Recompress the sitemap using tools like gzip.
  • Ensure the sitemap contains URLs.
  • Split large sitemaps or use a sitemap index.
  • Correct any invalid entries or attributes.
  • Match the “www” in the sitemap path and URLs.
  • Use the correct namespace for the sitemap type.
  • Remove leading whitespace.
  • Ensure the sitemap URL is correct and accessible.
  • Resize video thumbnails to 160 x 120 px.
  • Differentiate between video content and player URLs.
  • Limit news URLs to the specified maximum.
  • Include a <publication> tag for each news URL.
  • Modify robots.txt to allow Googlebot access.
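
If you’d like to spot-check a sitemap yourself before digging into the report, here’s a minimal Python sketch. The sitemap URL is a placeholder, and it only samples the first ten URLs.

```python
# A minimal sketch: download a sitemap, parse its <loc> entries, and spot-check
# that the first few URLs respond with 200 and live on the same host as the
# sitemap. The sitemap URL is a placeholder.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
import requests

sitemap_url = "https://www.example.com/sitemap.xml"
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]
print(f"{len(urls)} URLs listed in the sitemap")

site_host = urlparse(sitemap_url).netloc
for url in urls[:10]:  # spot-check the first 10 entries
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    same_host = urlparse(url).netloc == site_host
    print(f"{url}: HTTP {status}{'' if same_host else ' (different host to the sitemap!)'}")
```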

Manual actions

Manual actions can considerably impact your website traffic and should be investigated as a priority if you notice that you have one!

Manual actions are when a human reviewer determines that a site’s pages violate Google’s spam policies. These are not algorithmic actions but are manually imposed penalties on sites that don’t adhere to Google’s quality guidelines.

Their purpose is to ensure the integrity of search results. Since the inception of search engines, some individuals have tried to use black hat or spammy tactics to mislead the algorithm. Manual actions aim to combat this, to ensure users find accurate answers and legitimate sites gain proper visibility.

Open Search Console, select your property, and open the Manual actions report. You will see the following:

[Screenshot: the Manual actions report showing “No issues detected”]

As you can see in this example, no issues have been reported, meaning there is no required fix. If you do see an error, however, you need to investigate further:

  • Review the Issue: Understand the specific reason for the manual action by reviewing the notification from Google. It will provide details on the violation.
  • Address the Issue: Make the necessary changes to your site to rectify the violation. This could involve removing spammy content, fixing technical issues, or adhering to Google’s guidelines.
  • Request a Review: After making the necessary changes, submit a reconsideration request to Google. Provide a detailed explanation of the changes made to address the issue.

Regularly check the Manual Actions report and stay updated with Google’s guidelines to prevent and manage manual actions.

Keyword Cannibalisation

Keyword Cannibalisation is where multiple pieces of content on a website inadvertently compete against each other for the same keyword or search term. This internal conflict can lead to unpredictable fluctuations on search engine results pages (SERPs). 

For instance, a piece of content that previously secured a top spot on the first page of search results might suddenly drop in rankings. While Google doesn’t impose penalties for duplicate content, it has grown increasingly sensitive to content of a similar nature. 

The outcome of this sensitivity is a potential decline in website traffic, as the search engine struggles to determine which piece of content is the most relevant for a given search query.

Monitor SERPs for your targeted keywords. If you notice that multiple URLs from your website appear and fluctuate over time for the same keyword, it’s a clear sign of potential cannibalisation.

Use the site: search operator. By inputting site:www.yoursite.com intitle:”your targeted keyword” into Google, you can identify all pages from your site that Google has indexed with that keyword in the title. I ran this check for Yoast, using the keyword “readability score”:

[Screenshot: site: search results showing multiple Yoast pages targeting “readability score”]

As you can see there are a number of pages targeting the same topic, which could be having an impact on traffic performance.
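
You can also build a cannibalisation shortlist from your own data. Here’s a minimal Python sketch that assumes you’ve exported query and page data together from GSC (or pulled it via the API) into a CSV; the filename and column names are placeholders, so rename yours to match.

```python
# A minimal sketch: from a GSC export containing both query and page, flag
# queries where several URLs are picking up impressions - a useful
# cannibalisation shortlist. Filename and column names are assumptions.
import pandas as pd

df = pd.read_csv("gsc_query_page_export.csv")  # hypothetical filename

by_query = (df.groupby("query")
              .agg(pages=("page", "nunique"),
                   clicks=("clicks", "sum"),
                   impressions=("impressions", "sum"))
              .reset_index())

# Queries answered by more than one URL, with enough impressions to matter.
candidates = by_query[(by_query["pages"] > 1) & (by_query["impressions"] > 100)]
print(candidates.sort_values("clicks", ascending=False).head(20))
```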

Preventing and rectifying cannibalisation requires a strategic approach:

  • Content Audit: Before introducing new content, assess your existing content. Determine if there’s any current content that might conflict with the new addition.
  • Merge Similar Content: If older content closely resembles new content, consider merging them. This not only consolidates the information but also prevents internal competition. Ensure you set up a 301 redirect from the old content to the new merged content.
  • Downgrade the Theme: If a piece of content is causing conflict but is essential, consider altering its title or theme slightly to reduce direct competition with other content.
  • Internal Linking: Strengthen the authority of your primary content by linking to it from other related content using relevant anchor text.
  • Landing Page Creation: If you have multiple pieces of content around a similar theme, consider creating a hub page that acts as a central point, linking out to all related content. This establishes a clear hierarchy and signals to search engines which page is the primary focus.
  • Stay Updated: Regularly review and update your content. This ensures that it remains relevant, reduces redundancy, and minimises the risk of cannibalisation.

By understanding, identifying, and taking proactive measures against cannibalisation, you can safeguard your website’s visibility and maintain traffic.

Jon Earnshaw led a Whiteboard Friday for Moz discussing cannibalisation in more depth. I’d give it a watch if you’d like to take your cannibalisation knowledge to the next level:

Identifying, Fixing, and Preventing Cannibalization — Whiteboard Friday

SERP layout changes

Google’s SERP layouts on Mobile and Desktop are constantly evolving, with additional paid ads, new features being added, and legacy features being removed. This is a common cause of traffic drops and can often be overlooked. An example of this is the change that Google has made to HowTo and FAQ schema, as broken down by Search Engine Journal. This change received industry pushback because of the traffic these features had been bringing in for site owners.

Firstly, look to reliable sources to discover any new SERP layout changes – for example, Google themselves and trusted professionals such as Lily Ray and Aleyda Solis. You can also set up or bookmark an RSS feed to keep you updated with the latest SEO news; I use one called SEO News Pro.

You can also use most SEO tools (SE Ranking, Semrush, Ahrefs, Moz) to check which SERP features you are ranking for and understand any improvements or declines in performance. Here’s an example from Moz, whose SERP features chart allows you to filter by date range and SERP feature:

[Screenshot: the Moz SERP features chart]

Google’s advice? To prepare for structured data changes in general – as mentioned above, the landscape is constantly changing and overreliance on a certain feature can backfire down the road:

Ranking updates, structured data, and more! – Google Search News (October ‘23)

This means you should produce your content with the user in mind – how can you give them what they need as quickly and clearly as possible? Don’t create pages for the sole purpose of optimising for SERP features and you will be less likely to get caught out when there are changes. As mentioned earlier, a reliable source to refer to when creating content is “Creating helpful, reliable, people-first content” from Google themselves.

Industry trends

Analysing industry trends can be a quick way to see whether there is something out of your control causing drops in traffic. There are many potential causes for industry wide drops, such as seasonality, general changes in interest, and big industry players disrupting the norm. 

To expand: major industry leaders have the ability to reshape consumer habits through establishing new standards, introducing new terminology, and launching attractive products that shift attention away from traditional interests.

There are a few ways to check industry trends. One of the most reliable ways is with Google Trends and the Glimpse add on (this adds trendlines and highlights key stats). 

Here’s an example of a 5 year performance trendline for the keyword “house removals”:

[Screenshot: a 5 year Google Trends trendline for “house removals”]

I would advise checking the trends for a “Topic” rather than a “Keyword”, as this will provide a more thorough view. You can adjust the dates to analyse trends at the period you experienced drops. If there’s a clear correlation then it is likely that the industry declines have played a role.
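
If you want to pull this data programmatically, pytrends is an unofficial Google Trends wrapper that many people use. It isn’t supported by Google and its interface can change, so treat this as a rough sketch rather than a supported workflow.

```python
# A minimal sketch using pytrends (unofficial Google Trends wrapper): pull five
# years of interest for a term so you can line the trendline up against your
# traffic drop. Install with `pip install pytrends` first.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-GB", tz=0)
pytrends.build_payload(["house removals"], timeframe="today 5-y", geo="GB")
interest = pytrends.interest_over_time()  # weekly interest scores, 0-100

# Rough year-on-year comparison: last 12 weeks vs the same 12 weeks a year ago.
recent = interest["house removals"].tail(12).mean()
last_year = interest["house removals"].iloc[-64:-52].mean()
print(f"Recent 12-week average: {recent:.0f}, same period last year: {last_year:.0f}")
```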

To validate this further, check competitor performance during the same timeframe. If your main competitors have experienced drops as well, then it’s safe to assume that this is the cause – or at least one of the causes. 

There’s no one-size-fits-all approach to dealing with industry downturns. Take seasonal trends – they’re pretty much expected in some cases. For example, a Christmas tree seller isn’t going to worry about summer drops. It’s all about being aware of these trends and factoring them into your reporting. This way, you can tell whether it’s time to act or sit back.

For other types of industry dips, there may be room to make a difference. If your business/website covers various products, services, or topics, you’ve got options. It might be smart to shift your energy from areas that are dipping in popularity towards those that are growing. 

Keep in mind, though, that switching focus depends on what your products, services, or topics are worth to you at the time, as well as what’s trending. Although, if you’re seeing steady declines, it’s worth looking into.

4. Give the fixes time to work

It’s worth noting that recovering traffic to your site won’t be instant. Quick results are possible (depending on the issue at hand), although improvement over time is much more common. 

For businesses where traffic is responsible for a large portion of revenue, this may not be what you want to hear, although it’s important nonetheless. 

Starting SEO on your website for the first time is akin to starting exercise. It’s about putting in the effort and staying consistent. The results won’t be immediate, but over time, they’ll not only show, they’ll likely exceed your expectations.

Now, if your site has suffered a significant drop, think of it as returning to the gym after an injury. You’ll need a period of recovery and careful management before you’re back to building strength. The approach here is the same – through hard work and consistency, you can guide your site back to peak performance. 

Remember, setting realistic goals and expectations is key during this process.

5. Test and Refine

You won’t need to run through every check here each time something crops up – that would be massively time consuming. The aim is to continually build on your understanding of what makes your website tick. This way, integrating fixes becomes a seamless part of your workflows.

Logging the checks you’ve done and the insights you’ve gained is helpful practice for managing traffic performance moving forward. Here are a few reasons:

  • Noting down seasonal drops will prevent you from panicking that time each year. 
  • Keeping in tune with industry-related changes will help you remain agile and competitive. 
  • Knowing which technical issues your website has experienced in the past, and whether it is more susceptible to certain issues, will help you preempt and manage them. 
  • The same applies to Google updates and SERP changes. Keeping up to date with these will allow you to pivot into new areas that have not been impacted, or even better – will allow you to create an evergreen strategy that is less likely to be hit by new changes. 
  • Training your team on how to diagnose and resolve these issues in advance will allow them to quickly fix problems that could take attention away from other parts of the business.

Concluding remarks

Thanks for sticking with me! As mentioned at the start, I’ve put together a Website Traffic Recovery Checklist that lays out all the checks – feel free to use this at your convenience.

As an experienced SEO Consultant, this research and the steps provided reflect my day-to-day workflows. I offer training and workshops tailored to these tasks, plus much more. Equipping your team with the know-how to diagnose and tackle traffic issues promptly is hugely beneficial, for both resource and revenue.

If you think your team could benefit from customised training, don’t hesitate to get in touch. We can craft a bespoke plan that fits your business needs!

If there’s anything you think I’ve missed or if there’s something you disagree with, drop me a line at info@matt-kohli. I’m always open to feedback that sharpens my skills and enriches what I offer.
Matt Kohli
I’m Matt, a UK-based SEO Consultant. I’ve worked in-house, agency side, and I’ve freelanced for a number of years now. My goal is to help businesses like yours unlock their potential by crafting tailored strategies that drive lasting results!
Let's talk