15 Reasons Why Your Website Traffic Dropped
Every website owner wants their traffic to grow and to reach as many people as possible.
There's a specific scene in "The Social Network" that comes to mind, when IT techs predict that Harvard's network is about to crash because of the traffic Mark Zuckerberg is generating, and that's the kind of traffic everybody should be striving for.
To achieve high traffic, it's critical that you develop a base of users who visit your site regularly.
However, we're all vulnerable to traffic drops, and a drop always hurts the website's outlook and the brand the website represents.
Here, I'll be discussing the most common reasons why website traffic declines, and what can be done about it.
1. Search Engine Penalty
How you set up your website can have an effect on search engine results.
Keep your website fresh, active, and targeted with the most searched keywords to ensure that it stays atop the ranking list.
Search engine algorithms such as Google's can effectively weed out websites that use black hat SEO to boost their rankings.
One of the most common black hat tactics is buying backlinks from spam networks.
Other reasons for a search engine penalty include:
- duplicated content,
- a large number of paid backlinks from suspicious sites.
2. Missing 301 Redirects
Search engines index web pages and use their URLs to rank them correctly.
If a URL changes, a 301 redirect must be set up to send users (and search engines) from the old URL to the new one.
The problem arises when a 301 redirect breaks (for whatever reason) and the old URL no longer points to the new one.
These broken URLs will display a 404 error when they're visited.
Check Google Search Console's crawl reports to find issues like this across your website.
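If you keep a list of your old URLs, a small script can also verify that each one still returns a 301 and lands where you expect. Here's a minimal sketch using the requests library; the URL pairs are placeholders, not real pages.

import requests

# Hypothetical old -> new URL pairs; replace with your own redirect map.
REDIRECTS = {
    "https://www.example.com/old-page": "https://www.example.com/new-page",
    "https://www.example.com/old-post": "https://www.example.com/blog/new-post",
}

for old_url, expected_url in REDIRECTS.items():
    # Don't follow redirects automatically, so the first response can be inspected.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code != 301:
        print(f"{old_url}: expected 301, got {response.status_code}")
    elif location != expected_url:
        print(f"{old_url}: redirects to {location}, expected {expected_url}")
    else:
        print(f"{old_url}: OK")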
3. Algorithm Updates
Google is known for rolling out regular updates throughout the year, and some updates are more important than others.
However, it's quite difficult to get solid details on what each update changes.
Try to find correlations between an update and a drop in traffic (if there is one), and don't let the same thing happen to your site again.
4. Tracking Errors
Some site owners pull the tracking code from their sites, which is one thing you should never do.
However, that's a mistake that can be easily fixed.
The bad news is that you'll be missing data for as long as the code is gone, so it's best to spot and resolve this as quickly as possible.
If you notice that Google Analytics is recording no sessions at all, there's a good chance the code isn't present and correct.
If so, contact your developers; they will resolve the issue.
5. Ranking Losses
Not all declines in traffic are caused by errors and mistakes; there are organic reasons why your traffic may have dropped.
You're most likely tracking your rankings through a rank tracker (and if you're not, you should be), and you've noticed a drop in traffic.
Now you can use this tool to figure out why your ranking changed.
Identify exactly when the rank started to drop, and export the ranking keywords from before and after the drop.
Paste the two exports side by side in Excel and compare the change in positions, then retarget the dropped terms with keyword research and keyword mapping.
There are also alternative tools, such as SISTRIX, that can help you identify and solve this issue.
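If you'd rather script the comparison than do it in Excel, here's a minimal sketch. It assumes two hypothetical CSV exports from your rank tracker, each with "keyword" and "position" columns; adjust the column names to match your tool's actual export.

import csv

def load_positions(path):
    # Read a rank-tracker export into a {keyword: position} dict.
    with open(path, newline="", encoding="utf-8") as f:
        return {row["keyword"]: int(row["position"]) for row in csv.DictReader(f)}

before = load_positions("rankings_before.csv")  # hypothetical export file names
after = load_positions("rankings_after.csv")

for keyword, old_pos in sorted(before.items()):
    new_pos = after.get(keyword)
    if new_pos is None:
        print(f"{keyword}: dropped out of the tracked rankings (was #{old_pos})")
    elif new_pos > old_pos:
        # A higher position number means a worse ranking.
        print(f"{keyword}: fell from #{old_pos} to #{new_pos}")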
6. Keyword Cannibalization
This is actually a sort of "reverse issue", and it occurs when you create a lot of new content around the same (or a very similar) topic.
Say you've just written and posted three similar articles, all roughly revolving around the same keyword, and you kept keyword targeting in mind while writing them.
Keyword cannibalization occurs when a website appears for one keyword with multiple URLs.
This means that even though a user was searching for one specific article you've posted on the topic, they were shown all three.
When clicks spread across several of your own competing pages, each page ranks lower and you lose valuable organic traffic.
To fix this, you can use BigMetrics.io and its cannibalization report.
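As a rough first pass, you can also spot cannibalization from a rank-tracker export by grouping rankings by keyword and flagging keywords that rank with more than one of your URLs. This sketch assumes a hypothetical CSV with "keyword" and "url" columns; a dedicated report like the one above is more thorough.

import csv
from collections import defaultdict

keyword_urls = defaultdict(set)

# Hypothetical export: one row per ranking, with "keyword" and "url" columns.
with open("rankings.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        keyword_urls[row["keyword"]].add(row["url"])

for keyword, urls in keyword_urls.items():
    if len(urls) > 1:
        print(f"Possible cannibalization for '{keyword}':")
        for url in sorted(urls):
            print(f"  {url}")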
7. SERP Layout Changes
Google has recently changed the way organic results are displayed, so it's crucial to adapt to these changes if you want to remain within the top results.
Featured Snippets, Knowledge Graphs, and ads have become more prominent and are treated as a priority. This has, understandably, frustrated SEO professionals.
Before a user sees any organic result, you have to compete with ads, knowledge graphs, snippets, and Google's suggestions.
To resolve this, analyze the keywords you're targeting: it's possible that they weren't triggering SERP features before, but they're doing so now.
That means the keywords from your site may be triggering features that don't include your site, so you're losing valuable visits.
In effect, your content is used as bait to attract interest in someone else's site.
8. URL De-indexing
This issue has only been noticed recently, with Google reporting a "de-indexing" bug via Twitter.
The bug caused important pages to be removed from the index, seemingly overnight.
However, despite only becoming a public issue recently, de-indexing was a problem even earlier.
When investigating this, it's crucial to find the URLs that no longer appear in search results.
Check the Index Coverage report in Search Console for errors and use the URL Inspection tool to make sure that your important pages are still properly indexed. If they're not, use the "Request Indexing" option in Search Console.
9. Manual Penalties
A penalty may be issued against your site if its content goes against Google's guidelines. Google actually employs human reviewers, not robots.
Their job is to manually review websites for exactly these kinds of violations.
Here are Google's official principles and guidelines:
"Make pages primarily for users, not for search engines. Don't deceive your users. Avoid tricks intended to improve search engine rankings.
A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee.
Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field."
10. XML Sitemap Changes
A change in your XML sitemap may be the reason for the recent drop you've witnessed in your traffic.
Only URLs that return a 200 response and are indexable should be listed in your XML sitemap; other URLs shouldn't be there (unless you left them in on purpose, for example to help search engines pick up redirects).
To see whether any potentially harmful changes have crept in, crawl the sitemap URLs and make sure they all return a 200 response.
If you find an issue of this kind, regenerate the sitemap and resubmit it through Search Console.
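Crawling the sitemap can be as simple as the sketch below: fetch the XML, pull out every loc entry, and report anything that doesn't return a 200. The sitemap URL is a placeholder, and this assumes a plain sitemap rather than a sitemap index.

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap, then collect every <loc> entry.
sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code != 200:
        print(f"{url}: returned {response.status_code}")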
11. Crawl Errors
Using the Search Console, open the Index Coverage Report and check for URLs that have an error.
Any URL in the coverage report that has an error associated with it won't be included in the index.
Typical errors include:
- server errors,
- redirect errors,
- URLs blocked by robots.txt,
- URLs marked with a noindex tag,
- soft 404 errors,
- URLs returning an unauthorized response,
- URLs that can't be found (they usually return a 404 error),
- general crawling errors.
You can find a more detailed list of these errors, along with solutions, in Google's official documentation.
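A quick, rough way to catch some of these issues on your most important pages is to request each URL and look at the status code, the X-Robots-Tag header, and the meta robots tag. This is only a sketch with placeholder URLs; Search Console remains the authoritative source.

import requests

# Hypothetical list of important URLs to spot-check.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in PAGES:
    response = requests.get(url, timeout=10)
    problems = []
    if response.status_code != 200:
        problems.append(f"status {response.status_code}")
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        problems.append("noindex in X-Robots-Tag header")
    if "noindex" in response.text.lower():
        # Crude check: flags any occurrence of "noindex" in the HTML.
        problems.append("possible noindex meta tag in the HTML")
    print(f"{url}: {', '.join(problems) if problems else 'looks OK'}")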
12. Incorrect robots.txt Rules
There's always a possibility that your robots.txt file is blocking search engines from crawling your site.
Developers often leave robots.txt files unchanged after migrating from a development or staging website, usually by accident.
With just one line of robots.txt code, you can disallow crawling of an entire domain, removing it from search results.
This usually takes effect within a day or two, but the impact on your website can be devastating.
It usually happens when the website is transferred from a hidden staging domain to the main domain and the staging robots.txt file is carried over by accident.
The rules in the file usually look like this:
User-agent: *
Disallow: /
Sitemap: https://www.example.com/sitemap.xml
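To make sure the live robots.txt isn't the staging one, you can check whether Googlebot is still allowed to fetch your key pages. This sketch uses Python's built-in robotparser; the domain and paths are placeholders.

from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder
PAGES = ["https://www.example.com/", "https://www.example.com/blog/"]

# Download and parse the live robots.txt.
parser = RobotFileParser(ROBOTS_URL)
parser.read()

for url in PAGES:
    if parser.can_fetch("Googlebot", url):
        print(f"{url}: crawlable by Googlebot")
    else:
        print(f"{url}: BLOCKED by robots.txt")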
13. Low-quality Content
Content is king: this saying isn't popular among internet businesses without reason.
Without great content, it's impossible to engage your users, and Google won't rank you highly.
This principle is very well known, yet there are still websites churning out underdeveloped, basic, 500-word articles that ultimately have no value.
You need to determine your content's value.
Ideally, articles should be written by experts, free of mistakes, original, genuinely informative for users, and complete and comprehensive.
The structure of your content matters as well. Readers rarely have the time to read a whole article, so they'll scan it for the content they're looking for.
14. Low-quality Website
This one is pretty straightforward, if you ask me.
We've all witnessed terribly designed websites, built at the beginning of this century and still active today, that have never received a redesign.
This has a massive effect on your SEO, which in turn affects your rankings and Google traffic, and, very importantly, your conversion rate.
The latter is because how much users trust your business depends on your website's visual appeal, ease of use, and authoritative content.
Your website's quality is determined by usability, experience, approachability, design, information architecture, and, most importantly, content.
This is why hiring professionals to build your website is usually the best way to go; whole teams are dedicated to analyzing websites, determining what's wrong with them, and fixing those issues.
15. Tracking Code Errors
It's always possible that there's a problem with the tracking code itself.
It can happen that a piece of the code is simply missing.
You need to make sure to have your Google Analytics tracking code implemented on every page of your website.
You can use GAchecker to scan your entire site and make sure you're not missing any code.
Incorrect snippets are another problem: there's the possibility that you're using the wrong snippet for your property.
To find your GA tracking code: Admin -> Tracking Info -> Tracking Code.
There's also the possibility of extra characters or whitespace ending up in the pasted snippet.
To avoid this, copy your GA code and paste it directly onto your website using an editor that preserves code formatting. If you do it any other way, you're risking this problem.
Make sure you're not running multiple instances of the classic GA tag (ga.js) on a website.
In addition, make sure you're not running multiple Universal Analytics (analytics.js) snippets with the same UA-ID on your website.
Duplicate tags can result in inaccurate data collection, processing, or reporting, and you'll typically see an extremely low bounce rate (less than 10-15%) in Google Analytics.
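A rough way to spot missing or duplicated tags is to fetch a handful of pages and count how many times the analytics script appears in each one. This sketch only looks for the classic ga.js and Universal Analytics analytics.js snippets mentioned above, and the page URLs are placeholders.

import requests

# Hypothetical pages to spot-check; in practice, pull these from your sitemap.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about/",
]

TAG_MARKERS = ["ga.js", "analytics.js"]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    counts = {marker: html.count(marker) for marker in TAG_MARKERS}
    if sum(counts.values()) == 0:
        print(f"{url}: no Google Analytics tag found")
    for marker, count in counts.items():
        if count > 1:
            print(f"{url}: {marker} appears {count} times (possible duplicate tag)")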
Here are some other issues that may occur but haven't made the list: over-optimization, the lack of a transition plan, link spam, blocking your own website while redesigning and updating it, and Google's internal issues.
Of course, it's impossible to predict every problem that may occur; the internet has become borderline infinite, and new issues are reported and tackled by developers daily.
