15 Reasons Why Your Website Traffic Dropped


Every website owner wants their traffic to grow and their site to reach as many people as possible.

There’s a specific scene in “The Social Network” that comes to mind, in which IT techs realize that Harvard’s network is about to crash under the traffic Mark Zuckerberg is generating. That’s the kind of traffic everybody should be striving for.

To achieve high traffic, it is critical that you build an audience of users who visit your site regularly.

However, we’re all vulnerable to dropping traffic, and a drop always has a negative impact on the website’s outlook and on the brand the website represents.

Here, I’ll discuss the most common reasons why website traffic declines, and what can be done about each of them.

1. Search Engine Penalty

How you set up your website can have an effect on search engine results.

Keep your website fresh, active, and targeted with the most searched keywords to ensure that it stays atop the ranking list.

Search engine algorithms such as Google’s can effectively weed out websites that use black-hat SEO to boost their rankings.

The most common offense is buying backlinks from spam networks in an attempt to boost SEO.

Other reasons for a search engine penalty include:

  • duplicated content,
  • a large number of backlinks that were paid for (from suspicious sites).

2. Missing 301 Redirects

Search engines work by indexing web pages, and they use each page’s URL to rank it correctly.

If a URL changes, a 301 redirect must be set up to send users and search engines from the old URL to the new one.

The problem arises when the 301 redirect breaks (for whatever reason), and the old URL cannot be redirected to the new one.

These broken URLs will display a 404 error when they’re visited.

Check Google Search Console’s crawl reports to browse through your website and find issues like this one.
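
If you want a quick way to verify redirects outside of Search Console, here’s a minimal sketch in Python using the requests library; the old URLs listed are placeholders, so swap in your own retired paths.

import requests

# Old URLs that should 301-redirect to live pages; these are placeholders.
old_urls = [
    "https://www.example.com/old-page",
    "https://www.example.com/old-category/old-post",
]

for url in old_urls:
    # First request without following redirects, to see the initial status code.
    first_hop = requests.get(url, allow_redirects=False, timeout=10)
    # Second request following redirects, to confirm the destination works.
    final = requests.get(url, allow_redirects=True, timeout=10)
    print(url, "->", first_hop.status_code, "-> final:", final.status_code, final.url)
    if first_hop.status_code != 301 or final.status_code != 200:
        print("  WARNING: missing or broken 301 redirect")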

3. Algorithm Updates

Google is known for regular updates throughout the year. Some updates are more important than others.

However, it’s quite difficult to get solid details on the updates.

Try to find correlations between known updates and the drop in traffic (if there is one), and address whatever those updates target so the same doesn’t happen to your site again.

4. Tracking Errors

Some site owners pull their tracking code from their sites, which is one thing you are never supposed to do.

However, that’s a mistake that can be easily fixed.

The bad news is that you’ll be missing out on data for as long as the code is gone, so it’d be best to spot this and resolve it as quickly as possible.

If you notice that Google Analytics is recording no sessions, the code may be missing or broken.

If so, contact your developers; they can resolve the issue.

5. Ranking Losses

Not all declines in traffic have to be caused by errors and mistakes. There are organic reasons for why your traffic may have dropped.

You’re most likely tracking your traffic through a rank tracker (and if you’re not, you should be), and you’ve noticed a drop in traffic.

Now, you can use this tool to work out why your ranking changed.

Identify exactly when the rank started to drop, and export the ranking keywords from before and after the drop.

Use Excel to paste the data side by side for comparison. Compare the change in positions and retarget dropped terms with keyword research and keyword mapping.
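
If you’d rather script the comparison than do it in Excel, here’s a minimal sketch using pandas; the file names and the “Keyword”/“Position” column names are assumptions, so adjust them to whatever your rank tracker actually exports.

import pandas as pd

# Two exports from your rank tracker: one taken before the drop, one after.
before = pd.read_csv("rankings_before.csv")
after = pd.read_csv("rankings_after.csv")

merged = before.merge(after, on="Keyword", suffixes=("_before", "_after"))
merged["change"] = merged["Position_after"] - merged["Position_before"]

# A positive change means the keyword now ranks lower (a larger position number).
dropped = merged[merged["change"] > 0].sort_values("change", ascending=False)
print(dropped[["Keyword", "Position_before", "Position_after", "change"]].head(20))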

There are also alternative tools you can use to identify and solve this issue, tools like SISTRIX, for example.

6. Keyword Cannibalization

This is actually a sort of ‘reverse issue’, and it occurs when you create a lot of new content surrounding the same (or just a similar) topic.

Say you’ve just written and posted three similar articles, all three roughly revolving around the same keyword, and you took keyword targeting into account while writing them.

Keyword cannibalization is an occurrence in which a website appears for a keyword with multiple URLs.

This means that even though a user was searching for a specific article you’ve posted on the topic, they were shown all three articles.

If traffic spreads across several competing pages, each page ranks lower than a single consolidated page would, and you lose valuable organic traffic.

To fix this, you can use BigMetrics.io and its cannibalization report.
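
If you’d rather check for cannibalization yourself, here’s a minimal sketch that flags queries served by more than one URL, assuming a Search Console performance export with “Query” and “Page” columns; adjust the file name and column names to match your own export.

import pandas as pd

data = pd.read_csv("search_console_export.csv")

# Count how many distinct URLs receive impressions for each query.
pages_per_query = data.groupby("Query")["Page"].nunique()
cannibalized = pages_per_query[pages_per_query > 1].sort_values(ascending=False)

print("Queries served by multiple URLs:")
print(cannibalized.head(20))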

7. SERP Layout Changes

Google has recently changed the way they display organic results, so it’s crucial to adapt to these changes if you want to remain within the top results.

Featured Snippets, Knowledge graphs, and ads have become more prominent and are considered a priority. This has, understandably, frustrated SEO professionals.

Before you see any sign of an organic result, you need to compete with ads, knowledge graphs, snippets, and Google’s suggestions.

To resolve this, you need to analyze the keywords you’re targeting – it’s possible that they weren’t triggering SERP features, but they’re doing so now.

That means that the keywords from your site may be triggering features that don’t include your site, so you’re losing valuable visits.

This way, your content is basically used as bait to attract interest in someone else’s site.

8. URL De-indexing

This issue has only come to wider attention lately, with Google reporting a ‘de-indexing’ bug via Twitter.

The bug caused important pages to disappear from the index, seemingly overnight.

However, despite only recently becoming a public issue, de-indexing was a problem even earlier.

When investigating this, it’s crucial to find the URLs that are no longer available in search engine results.

Check the Index Coverage report in Search Console for any errors, and use the URL Inspection tool to make sure that your important pages are still properly indexed. If they’re not, use the ‘Request Indexing’ option in Search Console.

9. Manual Penalties

A penalty may be issued against your site if its content goes against Google’s guidelines. Google actually employs reviewers, actual people, not robots.

Their job is to review websites against these guidelines.

Here are Google’s official principles and guidelines:

“Make pages primarily for users, not for search engines. Don’t deceive your users. Avoid tricks intended to improve search engine rankings.

A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee.

Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”

Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.”

10. XML Sitemap Changes

A change in your XML sitemap may be the reason for the recent drop you’ve witnessed in your traffic.

URLs that return a 200 response and are indexable should be visible in your XML sitemap; other URLs shouldn’t be (unless you’ve left them there on purpose, for redirection).

To see if there have been any changes that might be harmful, crawl the sitemap URLs and make sure that each one returns a 200 response.
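
Here’s a minimal sketch of that crawl in Python; the sitemap URL is a placeholder, and the script only checks status codes, nothing more.

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch the sitemap and pull out every <loc> entry.
sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

# Request each URL without following redirects and report anything that isn't a 200.
for url in urls:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{url} returned {status} - consider removing it from the sitemap")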

If there’s an issue of this kind, use the Search Console to regenerate and resubmit the sitemap.

11. Crawl Errors

Using the Search Console, open the Index Coverage Report and check for URLs that have an error.

Any URL included in the coverage report that has an error associated with it won’t be included in the index.

Typical errors include:

  • server errors,
  • redirect errors,
  • URLs blocked by robots.txt,
  • URLs marked with a noindex tag,
  • soft 404 errors,
  • URLs returning an unauthorized request,
  • unlocatable URLs (they usually return a 404 error),
  • crawling errors.

You can find more on these errors on Google’s official site, as well as solutions to these problems and a more detailed list.
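
For a quick sanity check outside Search Console, here’s a minimal sketch that looks for an accidental noindex on a handful of important URLs, either in the X-Robots-Tag header or in the page body; the URL list is a placeholder and the meta-tag check is deliberately rough.

import requests

# Pages that should always be indexable; these are placeholders.
important_urls = [
    "https://www.example.com/",
    "https://www.example.com/key-landing-page",
]

for url in important_urls:
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    body = response.text.lower()
    # Rough check: a proper HTML parse would be more reliable than substring matching.
    has_meta_noindex = '<meta name="robots"' in body and "noindex" in body
    if "noindex" in header.lower() or has_meta_noindex:
        print(f"{url} appears blocked from indexing (noindex found)")
    else:
        print(f"{url} looks indexable ({response.status_code})")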

12. Incorrect robots.txt Rules

There’s always a possibility that your robots.txt file is blocking search engines from crawling your site.

Developers often carry the robots.txt file over unchanged when migrating from a development or staging website to the live domain. Usually, it’s by accident.

With just one line of robots.txt code, you can instruct search engine bots to disallow an entire domain, removing it from search results.

The block usually takes effect a day or two later, but its effect on your website can be devastating.

The rules in the file usually look as follows:

User-agent: *

Disallow: /

Sitemap: https://www.example.com/sitemap.xml
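
To confirm that the live site isn’t carrying a rule like that, here’s a minimal sketch using Python’s built-in robots.txt parser; the domain and paths are placeholders.

from urllib.robotparser import RobotFileParser

# Point the parser at the live site's robots.txt; the domain is a placeholder.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

for page in ["https://www.example.com/", "https://www.example.com/blog/"]:
    if not parser.can_fetch("Googlebot", page):
        print(f"Googlebot is blocked from {page} - check your robots.txt")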

13. Low-quality Content

Content is king. This saying isn’t so popular amongst internet businesses without reason.

Without great content, it’s impossible to engage your users and Google won’t rank you highly.

This principle is very well known, but despite that, there are still websites producing underdeveloped, basic, 500-word articles that ultimately have no value.

You need to determine your content’s value.

Ideally, articles should be written by experts, free of mistakes, original, genuinely informative for users, and complete and comprehensive.

The structure of your content is vastly important, as well. Readers rarely have the time to read the whole article, so they’ll just scan it for the content they’re looking for.

14. Low-quality Website

This one is pretty straightforward, if you ask me.

We’ve all come across terribly designed websites, or sites built at the beginning of this century that are still active today but have never been redesigned.

This has a massive effect on your SEO, which automatically affects your rankings and Google traffic, and very importantly – your conversion rate.

The latter is due to how much users trust your business based on your website’s visual appeal, ease of use, and authoritative content.

Your website’s quality is determined by usability, experience, approachability, design, information architecture, and most importantly – content.

This is why hiring professionals to build your website is usually the best way to go; there are whole teams dedicated to analyzing websites, determining what’s wrong with them, and fixing those issues.

15. Tracking Code Errors

It’s always possible that there’s a problem with the code.

It can happen that there’s just a piece of code missing.

You need to make sure to have your Google Analytics tracking code implemented on every page of your website.

You can use Gachecker to check your entire site and make sure you’re not missing any code.

Incorrect snippets are another problem. There’s the possibility that you’re using the wrong snippet for your property.

To find your GA tracking code: Admin -> Tracking Info -> Tracking Code.

Also, there’s the possibility of having extra characters or whitespace.

To fix this, make sure that you’ve copied your GA code and pasted it directly onto your website using an editor that preserves code formatting. If you do it any other way, you may be risking this problem.

Make sure you’re not running multiple instances of the classic GA tag (ga.js) on a website.

In addition, make sure you’re not running multiple Universal Analytics (analytics.js) codes with the same UA-ID on your website.

Running duplicate tags can result in inaccurate data collection, processing, or reporting, and you’ll see an extremely low bounce rate (less than 10-15%) in Google Analytics.
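
If you want a quick way to spot duplicate tags, here’s a minimal sketch that fetches a page and counts the analytics snippets and UA-IDs it contains; the page URL is a placeholder, and anything beyond a simple substring count (for example, tags injected by a tag manager) won’t be caught this way.

import re
import requests

page = requests.get("https://www.example.com/", timeout=10).text  # placeholder URL

# Count references to the classic and Universal Analytics libraries.
classic_tags = page.count("ga.js")
universal_tags = page.count("analytics.js")
ua_ids = re.findall(r"UA-\d{4,10}-\d{1,4}", page)

print("classic ga.js references:", classic_tags)
print("analytics.js references:", universal_tags)
print("UA-IDs found:", ua_ids)
if universal_tags > 1 or len(ua_ids) != len(set(ua_ids)):
    print("Possible duplicate tracking - double-check your snippets")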

Here are some other issues that may occur but didn’t make the list: over-optimization, no transition plan, link spam, blocking your own website during a redesign or update, and Google’s internal issues.

Of course, it’s impossible to predict every problem that may occur. The Internet has become borderline infinite, and new issues are reported daily and tackled by developers.
