Therefore, the purpose of this article is to make everything around it perfectly clear for you: how organic traffic is generated, why it beats paid traffic in both the short and long term, how rankings affect it, and what the exact reasons are when it drops.

Below you can find step-by-step guides to fix errors on your Magento site and eliminate malicious backlinks that might plague your site, among many other things.

Let’s start with the basics and proceed from there.

What is Organic Traffic?


Organic traffic is a metric in online marketing: it refers to the volume of visitors who arrive at your site by clicking on organic (non-paid) search results.

Its opposite is paid traffic, which refers to visitors who land on a website by clicking on paid advertisements (e.g. in Google or on Facebook). Referred visitors (who click on links on other websites and are directed to your site) are not considered organic traffic.

GOOD TO KNOW: If the share of organic traffic on your site is low, it means that in one way or another you are paying more than you should to bring in visitors, potential leads and customers.

Acquiring organic traffic is not free by any means – but it is a more cost-efficient and sustainable way to reach your audience.

The best way to increase the volume of organic traffic is content marketing based on SEO (search engine optimization): researching how and why members of your target audience use different search terms, then creating and publishing in-depth, valuable content that answers their questions.

For this you can use:

  • Skyscraper articles (long-form, 5-10K word content answering a wide range of questions regarding a well-defined niche topic).
  • Expert roundups (asking questions of well-known experts and influencers in your niche to gain visibility and backlinks).
  • Interviews (selecting a well-known expert in the niche who can lend credibility and visibility to your site as well as drive organic traffic and gain backlinks through citations).
  • Product reviews or comparisons (targeting the purchase and information-gathering intentions of your target audience).
  • Original research (based on data you have about your customers, their behavior, market trends that might be interesting for your audience etc.).
  • How-to articles and guides (guiding people to solve problems relevant to your product and niche step-by-step in order to gain positive emotional feedback through solving these problems for them).
  • Guest posts (published on external sites in order to gain backlinks, gain visibility and reach more of your niche while directing referral traffic to your site).

You may have noticed that the list only includes written content. Videos, infographics, podcasts and similar content can also be effective; however, they need additional written content to support them in order to rank well, because written text is what Google’s crawlers can understand.

Pro tip: If your site has no organic traffic, the first step should be making sure the technical aspects are in order – it is indexed, crawlers can understand its content, internal linking is not broken and so on.

Then you should take the time and do keyword research, based on which you can come up with a plan for content creation.

If you take the time to do the research you can expect your content to be found by people searching for answers – and ranking better and better with time on search result pages (SERPs), driving even more organic traffic to your site by reaching more people via Google.

What is the difference between organic and direct traffic?


Direct traffic covers the visitors who show up in Analytics as not having been referred to your site by other websites, search engines, or other sources like email or social.


The simplest way this can happen is when a user manually enters your URL or opens your site using a bookmark. Unfortunately, however, this is not a general rule and analyzing direct traffic can get confusing.

For one, a portion of traffic that is indicated as direct in your analytics is more than likely organic traffic. Groupon tested this theory in 2014 when they de-indexed their entire site for a period of 6 hours. During this time they were not showing up in any searches, but this should not have affected direct traffic. It did.

Note: Direct traffic decreased so much that they concluded that as much as 60% of it could actually be organic search traffic.

Dark traffic makes it even more complicated to decipher the true origin of your direct traffic.

The name is fitting: it is dark because we can’t see where it comes from. We know that a portion of what is showing up as direct is actually not, but analytics simply can’t see its source. There can be a number of reasons for this:

  • Clicks in apps and software. Most apps don’t pass referring information.
  • Shares in closed social groups.
  • Shares via messaging.
  • Email clients not passing referrer information. Certain clients like Outlook or Thunderbird often don’t relay the source of the click.
  • Clicks in password-protected areas.
  • HTTPS to HTTP referrals. The protocol works in a way that links from secure to non-secure sites don’t carry over the referral information.

It can also be misleading if you don’t filter out certain types of traffic – like traffic originating from your employees.

Pro tip: If you want to find out approximately how much of your direct traffic is actually organic or social you could try running tests like Groupon did.

It is important to at least have an idea about the approximate proportions because it can largely influence how you see the effectiveness of your campaigns. You can be attracting way more organic traffic or email click-throughs than you think – but maybe you don’t see it.

Paid or Organic Traffic?


Traffic brings you leads. And because of that, you need traffic that is sustainable and scalable.

Paid traffic comes from advertisements that you run on Google or Facebook, and it can be very efficient. Highly targeted campaigns can bring you hundreds of potential customers per campaign with a relatively low acquisition cost.

However, there are certain problems with paid traffic.

First: it is scalable, but your expenses will increase proportionately. Relying on ads means that you won’t be able to effectively bring down acquisition costs.


Second: it is an unsustainable model. If you stop running your campaigns the traffic simply stops.


The nice thing about properly executing an inbound strategy that focuses on gaining organic traffic is that scaling is not an option – it’s an attribute.

It takes time and money to create, publish and distribute content – but unlike an ad, it keeps driving traffic to you in the long term. A paid campaign requires that you constantly pay to keep it active. A piece of optimized evergreen content can attract leads for years without you spending a dime on it after it is published.

This makes it a sustainable method: even if you only have the budget to create a small library of guides and can’t keep up continuous content creation, that library will still be a valuable asset, passively supporting your online strategy.

That being said, it is very important to at least refresh your content regularly, and it is even better if you can keep publishing new, relevant content – both users and Google like sites that provide fresh knowledge.

SEM vs SEO


I can’t really answer “what is the difference between SEM and SEO”, simply because SEO is a part of SEM, or search engine marketing.

Pro tip: SEO focuses on gaining organic traffic through optimizing your site and creating targeted, valuable content.

SEM, however, includes other methods like running PPC campaigns in Google (which is called paid search advertising).

Since this article does not focus on paid campaigns, we use the term SEO to describe all the methods and tactics we share with you.

Google Algorithm Changes


Algorithm changes are fairly common at Google – they usually happen multiple times a day. Most of these changes have little effect on overall rankings; however, from time to time there are major core algorithm updates that can severely alter rankings for millions of sites.

Note: The last major core update we know about was the so-called “Medic” update on August 1, 2018.

It mostly (but not exclusively) affected sites in the health, wellness and lifestyle markets, hence the name.

The effects were huge for many sites, some seeing their rankings skyrocket, others losing more than half of their organic traffic almost overnight.

These kinds of broad core updates are rare, but even smaller ones can affect your site, for example because of minor optimization or technical issues. So if you see a sudden drop or spike in your organic traffic and rankings, be sure to check what Google is currently up to.

Pro tip: There is no official site where Google announces its updates; however, there are sites like Moz and Search Engine Journal where you can check the latest ones.

Layered navigation: too many indexed pages


Here is something that I encounter daily: indexing every single page on your site – including automatically generated pages for search results and filters.

On platforms like Magento these pages tend to multiply like there is no tomorrow: for every unique search or combination of attributes (like size, price, weight and so on) a new page is generated.

The problem is: If you let Google crawlers access and index these pages, the algorithm will see thousands (possibly hundreds of thousands) of pages with very low-quality and very similar content.

However valuable other content is on your site, it basically won’t matter, as it will be lost among them.

Let me show you an example. Using the database of BuiltWith I chose a random Magento site, Efootwear.eu.

I ran a simple search in Google using the site: operator to find out how many indexed pages they have – and it turns out they have a lot: around 204,000.
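The query itself is nothing more than the operator followed by the domain you want to check:

site:efootwear.eu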

Now, 204K indexed pages is not necessarily a sign of an error: there are sites out there that sell thousands of products in dozens of different countries. If you have ~8,000 different products – auto parts, for example – and ship to 30 different countries, the localized versions alone (8,000 × 30 is 240,000 URLs) can add up to this many pages even without including filters.

That is not the case here, however: Efootwear.eu beautifully demonstrates how forgetting to noindex search and filter result pages can blow up the number of your indexed pages.
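To clarify the terminology: noindexing a page means telling search engines not to include it in their index, most commonly with a robots meta tag placed in the page’s <head>. A minimal example of such a tag looks like this:

<meta name="robots" content="noindex, follow" />

The follow value tells crawlers that they may still follow the links on the page, even though the page itself should stay out of the index.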

There are a large number of products on the site, so naturally, you can use different filters to find exactly what you are looking for.

The problem is that the developers of the site did not noindex result pages, so now the results for every single combination of every single filter are indexed by Google:
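These indexed filter URLs typically carry their parameters in the query string – hypothetical examples of what they tend to look like on a Magento store (the paths and parameter names below are made up for illustration):

/shoes.html?color=black&size=42&dir=asc
/shoes.html?color=black&price=50-100&mode=list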

Again, this is a huge problem for multiple reasons.

  1. Google sees hundreds of thousands of almost identical pages on the site with very little content and 0 organic traffic, making your overall statistics much, much worse.
  2. Google has a limit for its crawlers – you can call it crawl budget. This determines how many pages are crawled daily on each site. If there is valuable content being published on your site but your indexing settings are not right, it might not even get noticed, because the search engine will just crawl thousands of internal search results instead.
  3. If they had outbound links in the footer (luckily they don’t) they would also hurt the rankings of the linked site for the same reason: the site would get hundreds of thousands of identical links from a single site, clearly signaling to Google that they are artificial.

How to fix this?

There are two basic things you have to do to avoid this.

Use canonical tags

Implement canonical tags on all category pages (and any other pages where filters might be used), so Google knows that the generated filter pages are just variants of that page and even passes link value back to the canonical category page.
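A minimal sketch of what such a tag looks like in the <head> of every filtered variant of a category page (the URL here is a made-up example):

<link rel="canonical" href="https://www.example.com/shoes.html" />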

This could be set up in the code of the store fairly easily. But this might not be enough if you already have a store set up with thousands of unnecessarily indexed pages. In this case, you probably have to use the Remove URLs tool in Google Search Console.

Disallow crawling in robots.txt

The second thing you should do is to disallow the crawling of such pages in your robots.txt file.

In most cases this is fairly simple: for example, if the filter parameters appear in the URLs after a “?” character, simply include this line in the file:

Disallow: /*?

Of course, there can be other patterns you have to block as well – for example, if the dir parameter appears in the URLs:
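A pattern like the following could cover that case – this is only a sketch, so verify it against your store’s actual URL structure before deploying it, and remember that Disallow rules only take effect inside a User-agent group (such as User-agent: *):

Disallow: /*dir=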

However, we advise that you at least consult a developer before you start noindexing pages and removing URLs – with a simple mistake you could do more harm than good by hiding more pages from Google than you wanted.

The best solution is to ask an SEO expert what exactly the problem is and then relay that information to your developer so they can sort it out.

Malicious backlinks


Your rankings can see a drop if there are malicious backlinks pointing to your site. This can happen in a number of ways – most often because of artificially generated low-quality backlinks or maybe because a competitor is trying to harm your backlink profile.

This is one of the reasons why it is important to constantly check your backlink profile, so you notice these kinds of malicious backlinks and can get rid of them.

I’ll walk you through the process step by step:

Step 1: Check your backlink profile

There are multiple ways to do this; the simplest – though not the best – is probably doing it with Google’s own tools. This is how you can do it in Search Console, as per Google:

  1. Choose the site you want on the Search Console home page.
  2. On the Dashboard, click Search Traffic, and then click Links to Your Site.
  3. Under Who links the most, click More.
  4. Click Download more sample links. If you click Download latest links, you’ll see dates as well.

You can find more info here.

Important: I strongly recommend that you check your profile using a professional SEO tool.

For this, our favorite is Ahrefs. To show you the process, we are going to use one of our clients, CanadaHair, as an example.

First, open up the Site Explorer tool and click on Referring domains to see which sites are linking to yours.

Step 2: Filter domains that are low-quality or outright suspicious

Usually, these are the ones with a very low Domain Rating, but very low organic traffic or an unusually high number of backlinks pointing to your site can also be red flags.

Step 3: Identify linking domains that look truly unnatural

Here, for example, you can see that the rostyleandlife.info domain:

  • Has a non-existent DR (0/100), which means it is a completely untrustworthy site.
  • Only 13 sites are linking to it, yet it links out to almost 230K sites, which is ridiculously unnatural.
  • It also has a very low organic traffic volume (17 clicks/month).

Basically, everything about this site screams suspicious.

So click on the “links to target” box in the row of the suspicious domain and let’s have a look at the particular page that contains the backlink.

Step 4: Check the site and the linking page

The URL and the title of the page also look unnatural – they contain a search phrase, but also an artificially generated string, as you can see.

If we check out the site itself, we can see that there are only a few words of content that are clearly generated by some algorithm.

This is clearly a backlink you need to get rid of: it is on a site that Google doesn’t trust, placed in an artificially generated, extremely low-quality text block.

Step 5: Ask the webmaster to remove the backlinks

It’s not likely that you’ll get an answer, but it is worth a try: if you do, it is a much easier route than disavowing backlinks.

Step 6: Disavow the malicious backlink

You can find a detailed guide at Google on how to do this, and the process is fairly simple:

  1. Download the file containing the current backlinks as you can see in Step 1.
  2. Remove from the file the backlinks that are natural and valuable for your site, so that only the ones you want to disavow remain (see the example file format after these steps).
  3. Go to the disavow links tool page.
  4. Select your website.
  5. Click Disavow links.
  6. Click Choose file.
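
For reference, the disavow file itself is a plain .txt list with one URL or one domain: entry per line; lines starting with # are treated as comments. Using the suspicious domain from the example above (the single URL below is a made-up placeholder), it could look something like this:

# spammy links found during the monthly backlink audit
domain:rostyleandlife.info
http://spammy-directory.example.com/our-links.html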

Please note that if you check your profile again in the future and find more malicious backlinks, you have to add them to this file.

Note: Every time you upload a new file it replaces the old one, so only the links or domains contained in the most recent one are going to be disavowed.

If you go through this process regularly (I recommend once a month) you can maintain a natural backlink profile, which is an indicator to Google that your site is truly interesting and valuable enough that people link to it – and it also indicates that you are not trying to use grey-hat or black-hat SEO techniques to increase your rankings by manipulating the algorithm.

Summary


If you have reached this point and carefully read every piece of advice, the only task that remains for you is to keep up the work.

Note: As I have mentioned multiple times in the article: SEO is not a one-time job. It requires constant upkeep to gain and maintain organic traffic.

Now you have everything you need to do that: consider this a checklist to go through every month, and I guarantee that you are going to see your organic traffic begin to climb even in the first few weeks.

And if you have any remaining questions, you know what to do: contact us, book a meeting (you can do that free of charge, naturally), and ask them. Our team is always ready and eager to help online merchants and e-commerce businesses who look to become the leaders in their niche!