15 Reasons Why Your Website Traffic Dropped

It’s in the best interest of every website owner to have their traffic grow and be as large as possible. In the film “The Social Network”, there’s a specific scene that comes to mind, when IT techs predict that Harvard’s network is about to crash from too much traffic caused by Mark Zuckerberg – that’s the kind of traffic everyone should be aiming for.

Developing a network of loyal users who visit your website regularly is absolutely crucial for achieving high traffic. That kind of growth compounds on itself – more visitors means more people sharing your website – which leads to even more popularity and, ultimately, more profit.

However, not everyone is a technological genius, and every site is vulnerable to a drop in traffic. A drop naturally hurts your website’s outlook and the brand it represents. It’s important to diagnose the problem and get your traffic back on track, though it’s understandable to sometimes feel powerless in front of the screen.

In this article, we’ll tackle the problem of dropping website traffic by describing the most common reasons it happens. Let’s get started.

1. Search Engine Penalty

Search engine results are affected by the way you set your website up. Through SEO (Search Engine Optimization), search engines rank your site higher or lower depending on how well it’s optimized. Keeping your website fresh, active, and aligned with the keywords people actually search for will help it stay near the top of the rankings.

Google, most notably, has become very good at weeding out websites that inflate their rankings artificially. These websites typically rely on unnatural SEO techniques and thin content that leaves searchers unhappy when it shows up in their results. There are many ways to inflate rankings artificially, the most notable being paid backlinks on spam networks.

Some other reasons why you might suffer a search engine penalty are:

– poor or duplicated content (most often auto-generated text from article generators) – Google is very good at spotting this, as article generators are no substitute for human writing

– a large number of deliberately placed external backlinks

– keyword stuffing within content – content writers often do this to raise a page’s visibility for a specific search term, but Google recognizes it as spam and will issue a penalty

2. Missing 301 Redirects

This is pretty simple. Search engines work by indexing web pages and use each page’s URL to rank it correctly. When a URL changes, you need to set up a 301 redirect to forward users from the old URL to the new one. Problems occur when the 301 redirect is broken (for whatever reason) and can’t send visitors from the old URL to the new one.

These broken URLs return a 404 error when visited. To sweep your website for issues like this one, try Google Search Console’s crawl reports.
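If you want to check this yourself before digging into crawl reports, you can request each old URL and confirm it answers with a 301 pointing at its new address. Here’s a minimal sketch in Python (using the requests library); the URL pairs are hypothetical placeholders for your own migration mapping:

import requests

# Hypothetical old-to-new URL pairs - replace with your own migration mapping.
redirects = {
    "https://www.example.com/old-page": "https://www.example.com/new-page",
}

for old_url, new_url in redirects.items():
    # Don't follow redirects, so the first response can be inspected directly.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code == 301 and location == new_url:
        print(f"OK: {old_url} -> {location}")
    else:
        print(f"CHECK: {old_url} returned {response.status_code} (Location: {location or 'none'})")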

3. Algorithm Updates

Google is known for rolling out updates regularly throughout the year, some more important than others, and it’s often quite difficult to get solid details about them. The problem arises when an update causes unexpected issues for your website. Not all websites are built the same, so the same update can hit one site hard while leaving another working just fine.

You can’t opt out of a Google algorithm update, so the best you can do is monitor each one closely using a tool such as Mozcast or Algoroo. With these, you can analyze the updates and see how they have affected other sites. Look for correlations between an update and a drop in traffic (if there is one), and adjust your site so the same doesn’t happen to it.

If you suspect that a recent algorithm update is behind your drop in traffic, take a look at Google’s confirmed update announcements and check whether the timing lines up.

4. Tracking Errors

It’s surprising that there are still site owners who remove their tracking code from the site – one thing you’re definitely not supposed to do – and then wonder why their reported traffic took a head-first dive.

However, that’s a mistake that’s easily fixed. The bad news is that the data from the period the code was missing is gone for good, so it’s best to spot and resolve the issue as quickly as possible.

If you notice that Google Analytics is recording no sessions at all, there’s a good chance the code isn’t present and correct. If so, contact your developers and they will resolve the issue.
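If you want a quick sanity check before involving your developers, you can fetch a few key pages and look for the Analytics script reference in the HTML. A minimal sketch in Python (using the requests library); the page URLs are hypothetical placeholders, and the markers cover the common gtag.js and analytics.js script references:

import requests

# Hypothetical pages to check - replace with your own key URLs.
pages = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]
# Script references used by common Google Analytics implementations.
markers = ("googletagmanager.com/gtag/js", "google-analytics.com/analytics.js")

for url in pages:
    html = requests.get(url, timeout=10).text
    if any(marker in html for marker in markers):
        print(f"OK: tracking snippet found on {url}")
    else:
        print(f"MISSING: no tracking snippet found on {url}")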

5. Ranking Losses

Not all declines in traffic have to be caused by errors and mistakes. There are organic reasons why your traffic may have dropped.

You’re most likely tracking your traffic through a rank tracker (and if you’re not, you should be) and you’ve noticed a drop in traffic. Now you can use this tool to work out why your ranking changed.

You should identify exactly when the rankings started to drop and export the ranking keywords from before and after the drop. Use Excel to paste the data side by side for comparison. Compare the change in positions and retarget the dropped terms with keyword research and keyword mapping.
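If you’d rather script the comparison than paste it into Excel, the same side-by-side view is easy to build with pandas. A minimal sketch, assuming two hypothetical CSV exports from your rank tracker with ‘keyword’ and ‘position’ columns (file and column names are placeholders):

import pandas as pd

# Hypothetical exports from your rank tracker - file and column names are assumptions.
before = pd.read_csv("rankings_before.csv")  # columns: keyword, position
after = pd.read_csv("rankings_after.csv")    # columns: keyword, position

merged = before.merge(after, on="keyword", suffixes=("_before", "_after"))
merged["change"] = merged["position_after"] - merged["position_before"]

# A positive change means the keyword dropped (its position number grew).
dropped = merged[merged["change"] > 0].sort_values("change", ascending=False)
print(dropped.head(20))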

There are also alternative tools you can use to identify and solve this issue, tools like SISTRIX, for example.

6. Keyword Cannibalization

This is actually a sort of ‘reverse issue’, and it occurs when you create a lot of new content around the same (or a similar) topic. Say you’ve just written and posted three similar articles, all roughly revolving around the same keyword, each written with keyword targeting in mind.

Keyword cannibalization occurs when a website appears for one keyword with multiple URLs. Even though a user was searching for one specific article you’ve posted on the topic, they may be shown all three articles competing with each other.

When ranking signals spread across multiple pages like this, each page ranks lower than a single consolidated page would, and you lose valuable organic traffic. To identify the affected queries, you can use BigMetrics.io and its cannibalization report.
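You can also spot cannibalization yourself from a Search Console performance export. A minimal sketch in Python with pandas, assuming a hypothetical CSV export with ‘query’, ‘page’, and ‘clicks’ columns (file and column names are placeholders):

import pandas as pd

# Hypothetical Search Console export - file and column names are assumptions.
data = pd.read_csv("search_console_performance.csv")  # columns: query, page, clicks

# Count how many distinct URLs appear for each query.
pages_per_query = data.groupby("query")["page"].nunique()
cannibalized = pages_per_query[pages_per_query > 1].index

# Show the competing URLs for each affected query.
report = data[data["query"].isin(cannibalized)]
report = report.sort_values(["query", "clicks"], ascending=[True, False])
print(report[["query", "page", "clicks"]])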

7. SERP Layout Changes

Google has recently changed the way they display organic results, so it’s crucial to adapt to these changes if you want to remain within the top results. Featured Snippets, Knowledge graphs, and ads have become more prominent and are considered a priority. This has, understandably, frustrated SEO professionals.

Before users ever reach an organic result, your listing has to compete with ads, knowledge graphs, snippets, and Google’s own suggestions.

To resolve this, analyze the keywords you’re targeting – it’s possible they weren’t triggering SERP features before but are doing so now. That means your keywords may be triggering features that don’t include your site, so you’re losing valuable visits – in effect, your content is used as bait to attract interest in someone else’s result.
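One rough way to spot this pattern is to look for queries where impressions held steady while clicks fell sharply, which often points to a SERP feature soaking up attention. This is a simplified heuristic, sketched in Python with pandas and assuming two hypothetical Search Console exports with ‘query’, ‘clicks’, and ‘impressions’ columns:

import pandas as pd

# Hypothetical Search Console exports for two comparable periods - names are assumptions.
before = pd.read_csv("queries_before.csv")  # columns: query, clicks, impressions
after = pd.read_csv("queries_after.csv")    # columns: query, clicks, impressions

merged = before.merge(after, on="query", suffixes=("_before", "_after"))
# Flag queries whose impressions held up but whose clicks fell sharply.
suspects = merged[
    (merged["impressions_after"] >= 0.8 * merged["impressions_before"])
    & (merged["clicks_after"] <= 0.5 * merged["clicks_before"])
]
print(suspects.sort_values("clicks_before", ascending=False).head(20))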

8. URL De-indexing

This issue drew attention recently, when Google reported a ‘de-indexing’ bug via Twitter that caused important pages to drop out of the index seemingly overnight. That said, pages being silently de-indexed was a problem even before it became a public issue.

When investigating this, it’s crucial to identify which URLs no longer appear in search results.

Check the Index Coverage report in Search Console for errors and use the URL Inspection tool to make sure your important pages are still properly indexed. If they’re not, use the ‘Request Indexing’ option in Search Console.

9. Manual Penalties

A penalty may be issued against your site if its content goes against Google’s guidelines. Google actually employs human reviewers – real people, not robots – whose job is to review websites for exactly this reason.

Here are Google’s official principles and guidelines:

“Make pages primarily for users, not for search engines. Don’t deceive your users. Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?” Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.”

Google also explicitly lists techniques to avoid:

“Automatically generated content

Participating in link schemes

Creating pages with little or no original content

Cloaking

Sneaky redirects

Hidden text or links

Doorway pages

Scraped content

Participating in affiliate programs without adding sufficient value

Loading pages with irrelevant keywords

Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware

Abusing structured data markup

Sending automated queries to Google”

10. XML Sitemap Changes

A change to your XML sitemap may be the reason for the recent drop in your traffic. Only URLs that return a 200 response and are indexable should be listed in your XML sitemap; other URLs shouldn’t be there (unless you’ve left them in on purpose, for example during a redirection).

To see whether any potentially harmful changes have crept in, crawl the sitemap URLs and make sure they return a 200 response. If you find problems, regenerate the sitemap and resubmit it through Search Console.
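Here’s a minimal sketch of such a check in Python (using the requests library), assuming your sitemap lives at the hypothetical address below:

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Parse the sitemap and pull out every <loc> entry.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    # Don't follow redirects: sitemap URLs should answer 200 directly.
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"CHECK: {url} returned {status}")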

11. Crawl Errors

Using Search Console, open the Index Coverage report and check for URLs that have an error. Any URL in the coverage report with an error associated with it won’t be included in the index.

Typical errors include server errors, redirect errors, URLs blocked by robots.txt, URLs marked with a noindex tag, soft 404 errors, URLs returning an unauthorized request, unlocatable URLs (they usually return a 404 error), crawling errors.
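Some of these, such as a stray noindex directive, are easy to check for yourself. A minimal sketch in Python (using the requests library), with hypothetical URLs as placeholders; the meta-tag check is a rough substring match, not a full HTML parse:

import requests

# Hypothetical URLs to verify - replace with pages you expect to be indexed.
urls = ["https://www.example.com/", "https://www.example.com/important-page"]

for url in urls:
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    # Rough heuristic: look for a robots meta tag mentioning noindex.
    body = response.text.lower()
    has_meta_noindex = 'name="robots"' in body and "noindex" in body
    if "noindex" in header.lower() or has_meta_noindex:
        print(f"CHECK: {url} appears to carry a noindex directive")
    else:
        print(f"OK: {url} has no obvious noindex directive")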

You can find more on these errors on Google’s official site, as well as solutions to these problems and a more detailed list.

12. Incorrect robots.txt Rules

There’s always a possibility that your site is blocking search engines from crawling it via the robots.txt file. This usually happens when the website is moved from a hidden development or staging domain onto the main domain and the staging robots.txt file is carried over by accident.

With just one line in robots.txt, you can instruct search engine bots to stay away from an entire domain, removing it from search results. The change usually takes effect within a day or two, and the impact on your traffic can be devastating. The offending rule looks like this:

User-agent: *

Disallow: /

Sitemap: https://www.example.com/sitemap.xml
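If you want to confirm whether search engines are actually allowed to crawl your site under the current rules, Python’s standard library can evaluate the live robots.txt for you. A minimal sketch, with example.com standing in for your own domain:

from urllib import robotparser

# Hypothetical domain - replace with your own.
parser = robotparser.RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# Check whether Googlebot may crawl the homepage under the current rules.
if parser.can_fetch("Googlebot", "https://www.example.com/"):
    print("OK: Googlebot is allowed to crawl the homepage")
else:
    print("CHECK: robots.txt is blocking Googlebot from the homepage")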

13. Low-quality Content

Content is king – this saying isn’t so popular among internet businesses without reason. Without great content, it’s impossible to engage your users, and Google won’t rank you highly. The principle is well known, yet there are still websites churning out underdeveloped, basic, 500-word articles that ultimately have no value.

You need to assess your content’s value. Ideally, articles are written by experts; you need to make sure they contain no mistakes; the content must be original and bring genuine information to users; and the articles need to be complete and comprehensive.

The structure of your content is vastly important, as well. Readers rarely have the time to read the whole article, so they’ll just scan it for the content they’re looking for.

14. Low-quality Website

This one is pretty straightforward. We’ve all come across terribly designed websites, or sites built at the beginning of this century that are still active today but have never received a redesign. This has a massive effect on your SEO, which in turn affects your rankings and Google traffic, and, very importantly, your conversion rate.

The latter is due to how much users trust your business based on your website’s visual appeal, ease of use, and authoritative content. Your website’s quality is determined by usability, experience, approachability, design, information architecture, and most importantly – content.

This is why hiring a developer to create your website is usually the best way to go, as there are whole teams of professionals dedicated to analyzing websites, determining what’s wrong with them, and fixing those issues.

15. Tracking Code Errors

It’s always possible that there’s a problem in the code.

Sometimes a piece of the code is simply missing. Make sure your Google Analytics tracking code is implemented on every page of your website. You can use Gachecker to scan your entire site and confirm that no pages are missing the code.

Incorrect snippets are another problem. There’s the possibility that you’re using the wrong snippet for your property. To find your GA tracking code: Admin -> Tracking Info -> Tracking Code.

Also, there’s the possibility of having extra characters or whitespace. To fix this, make sure that you’ve copied your GA code and pasted it directly onto your website using an editor that preserves code formatting. If you do it any other way, you may be risking this problem.

Make sure you’re not running multiple instances of the classic GA tag (ga.js) on a website. In addition, make sure you’re not running multiple Universal Analytics (analytics.js) codes with the same UA-ID on your website. It can result in inaccurate data collection, processing, or reporting and you’ll have an extremely low bounce rate (less than 10-15%) in Google Analytics.
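A quick way to spot duplicate tags is to count how many times each tracking library is referenced in a page’s source. A minimal sketch in Python (using the requests library), with a hypothetical page URL and a rough substring count rather than a full HTML parse:

import requests

# Hypothetical page to inspect - replace with your own.
url = "https://www.example.com/"
html = requests.get(url, timeout=10).text

# Count references to the classic, Universal Analytics, and gtag libraries.
for script in ("ga.js", "analytics.js", "gtag/js"):
    count = html.count(script)
    if count > 1:
        print(f"CHECK: {script} referenced {count} times on {url}")
    else:
        print(f"OK: {script} referenced {count} time(s) on {url}")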

Here are some other issues that may occur, but haven’t made the list: over-optimization, no transition plan, link spam, blocking your own website when redesigning and updating, Google’s internal issues.

Of course, it’s impossible to predict every problem that may occur – the Internet has become all but boundless, and new issues are reported and tackled by developers every day. We’ve only listed some of the most common ones in this article, but if none of this helped, it may be best to look for professional help.
