The purpose of this article is to make everything clear about organic traffic: how it is generated, why it beats paid traffic in both the short and the long term, how rankings affect it, and what the exact reasons are when it drops.
Below you can find step-by-step guides to fix errors on your Magento site and eliminate malicious backlinks that might plague your site, among many other things.
Let’s start with the basics and proceed from there.
What is Organic Traffic
Organic traffic is a metric in online marketing: it refers to the volume of visitors who arrive on your site by clicking on organic (non-paid) search results.
Its opposite is paid traffic, referring to visitors who land on a website by clicking on paid advertisements (e.g. in Google or on Facebook). Referred visitors (who click on links on other websites and are directed to your site) are not considered organic traffic.
Acquiring organic traffic is not free by any means – but it is a more cost-efficient and sustainable model to reach your audience.
The best way to increase the volume of organic traffic is SEO (search engine optimization) based content marketing: researching how and why members of your target audience use different search terms and creating and publishing in-depth, valuable content to answer their questions.
For this you can use:
- Skyscraper articles (long-form, 5-10K word content answering a wide range of questions regarding a well-defined niche topic).
- Expert roundups (asking questions of well-known experts and influencers in your niche to gain visibility and backlinks).
- Interviews (selecting a well-known expert in the niche who can lend credibility and visibility to your site as well as drive organic traffic and gain backlinks through citations).
- Product reviews or comparisons (targeting the purchase and information-gathering intentions of your target audience).
- Original research (based on data you have about your customers, their behavior, market trends that might be interesting for your audience etc.).
- How-to articles and guides (guiding people to solve problems relevant to your product and niche step-by-step in order to gain positive emotional feedback through solving these problems for them).
- Guest posts (published on external sites in order to gain backlinks, gain visibility and reach more of your niche while directing referral traffic to your site).
You may have noticed that the list only includes written content. Videos, infographics, podcasts and similar content can also be effective; however, they need supporting written content in order to rank well, because that is what Google's crawlers can understand.
You should also take the time to do keyword research, based on which you can come up with a content creation plan.
If you take the time to do the research you can expect your content to be found by people searching for answers – and ranking better and better with time on search result pages (SERPs), driving even more organic traffic to your site by reaching more people via Google.
What is the difference between organic and direct traffic?
Direct traffic refers to visitors who show up in Analytics as not being referred to your site by other sites, search engines or other sources like email or social.
The simplest way this can happen is when a user manually enters your URL or opens your site using a bookmark. Unfortunately, however, this is not a general rule and analyzing direct traffic can get confusing.
For one, a portion of the traffic that is indicated as direct in your analytics is more than likely organic. Groupon tested this theory in 2014 when they de-indexed their entire site for a period of 6 hours. During this time they were not showing up in any searches, but this should not have affected direct traffic. It did.
Dark traffic makes it even harder to decipher the true origin of your visits.
The name is fitting: it is dark because we can’t see where it comes from. We know that a portion of what is showing up as direct is actually not, but analytics simply can’t see its source. There can be a number of reasons for this:
- Clicks in apps and software. Most apps don’t pass referring information.
- Shares in closed social groups.
- Shares via messaging.
- Email clients not passing referrer information. Certain clients like Outlook or Thunderbird often don’t relay the source of the click.
- Clicks in password-protected areas.
- HTTPS to HTTP referrals. The protocol works in a way that links from a secure to a non-secure site don’t carry over the referral information.
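One partial mitigation for several of these sources is to tag every link you control (email campaigns, app share buttons, messages) with UTM parameters, so Analytics can attribute the click even when no referrer is passed. A hypothetical tagged URL (the domain and campaign names here are made up for illustration):

```
https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=june-promo
```

Clicks on such a link will show up under the given source/medium/campaign instead of disappearing into direct traffic.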
It can also be misleading if you don’t filter out certain types of traffic – like traffic originating from your employees.
It is important to at least have an idea about the approximate proportions because it can largely influence how you see the effectiveness of your campaigns. You can be attracting way more organic traffic or email click-throughs than you think – but maybe you don’t see it.
Paid or Organic Traffic?
Traffic brings you leads. And because of that, you need traffic that is sustainable and scalable.
Paid traffic comes from advertisements that you run on Google or Facebook, and it can be very efficient. Highly targeted campaigns can bring you hundreds of potential customers per campaign with a relatively low acquisition cost.
However, there are certain problems with paid traffic.
First: it is scalable, but your expenses will increase proportionately. Relying on ads means that you won’t be able to effectively bring down acquisition costs.
Second: it is an unsustainable model. If you stop running your campaigns the traffic simply stops.
The nice thing about properly executing an inbound strategy that focuses on gaining organic traffic is that scaling is not an option – it’s an attribute.
It takes time and money to create, publish and distribute content – but unlike an ad, it drives traffic in the long term. A paid campaign requires that you constantly pay to keep it active; with content, even a single piece of optimized evergreen material can attract leads for years without you spending another dime on it after it is published.
That being said, it is important to at least refresh your content regularly – and even better if you can keep publishing new, relevant content: users and Google both like sites that provide fresh knowledge.
SEM vs SEO
I can’t really answer “what is the difference between SEM and SEO”, simply because SEO is a part of SEM, or search engine marketing.
SEM, however, includes other methods like running PPC campaigns in Google (which is called paid search advertising).
Since this article does not focus on paid campaigns, we use the term SEO to describe all the methods and tactics we share with you. You can learn more about the topic in our in-depth guide, What is SEO And How it Works?
Google Algorithm Changes
Algorithm changes are fairly common at Google – they usually happen multiple times a day. Most of these changes have little effect on overall rankings; however, from time to time there are major core algorithm updates that can severely alter rankings for millions of sites.
A good example is the broad core update of August 2018, commonly referred to as the “Medic” update: it affected sites mostly (but not exclusively) in the health, wellness and lifestyle markets, hence the name.
The effects were huge for many sites, some seeing their rankings skyrocket, others losing more than half of their organic traffic almost overnight.
These kinds of broad core updates are rare, but even smaller ones can affect your site, for example because of minor optimization or technical issues. So if you see a sudden drop or spike in your organic traffic and rankings, be sure to check what Google is currently up to.
Change in the SERP
We are not talking about the exact rankings: Google sometimes changes up the search results pages – they may add new elements, remove or reduce advertisements, etc. For example, if you search for the “what is SEO” phrase, at the time of writing this article you get a Dictionary box, a People also ask box and a definition box with related searches on the right.
What this means is: people may search for a specific question that your site ranks at #1 for, and you may still experience a drop in traffic, simply because Google decided to answer that question (based on the results) right there on the SERP, consequently reducing the number of users who actually click through to the actual results to get an answer.
What you should do is:
- Try optimizing your content, so your pages not only appear as results but also get featured in the boxes, for example as the source of a definition.
- Regularly check not only Google, but also sites focusing on SEO, so if there is some change you will know about it.
- Regularly check what elements are displayed on the SERPs for your most important keywords. You can do this easily in most SEO software. This is how it looks in the Ahrefs Keyword Explorer:
As you can see, for the phrase “what is SEO” there are a ton of extra elements included, like an image pack and a knowledge card (the Dictionary box we mentioned). But you may also notice opportunities here: this SERP does not include a video result, so you might be able to regain some of your lost traffic by creating one aimed at this phrase.
Penalties from Google (and how to fix them)
If all else seems to be in order, but your traffic and rankings are still plummeting, there might be a chance that you have received a penalty from Google. I will assume that you haven’t tried to strengthen your site via black hat methods (check out our comprehensive SEO guide for more info on white, grey and black hat SEO), but there is still a chance that a recent algorithm update began to penalize some practices that were accepted before, or that your site seems to be growing inorganically.
A good example might be a sudden influx of backlinks, including malicious, poor quality ones (we are going to talk about how to identify malicious backlinks and how to get rid of them in detail in this article).
If last week you had ~100 links pointing at your site, but you suddenly acquire 200 more, that can easily be a red flag for Google. If there are spammy sites among those, the algorithm could judge that you tried to improve your rankings by buying backlinks from low-quality sites, submitting your URL to catalogs, etc. In short, it looks like you are trying to trick the algorithm, which can easily lead to a penalty.
The best examples are of course the 2011 Panda and 2012 Penguin updates, which quickly decimated the rankings of content farms, link farms, private blog networks and basically anyone heavily invested in “article marketing” – publishing very low-quality, short “posts” on hundreds of mostly irrelevant sites just to expand their backlink profile. This was previously something Google didn’t really penalize – until these updates, when entire poor-quality networks were thrown back from top rankings and, in many cases, shut out from ranking altogether.
So, how do you check for a Google penalty?
It is fairly easy: in Google Search Console, you can find the manual penalties under Search Traffic / Manual actions. If your site has received a manual penalty, there should be a notification about it, telling you exactly what Webmaster Guideline you seem to have violated and what has been affected.
However, it may not have been a manual action that affected your rankings. You may also want to keep an eye on the following factors:
- Hacked site: no one is ever really safe from hacking; the many security breaches at the largest and most technologically advanced companies in past years have proven that beyond a reasonable doubt. If you are not certain whether your site was hacked, follow the instructions provided by Google on checking it. There is also a set of instructions on how to bring your site back to life.
- Duplicate content: the less original content you have, the less interesting you are. Google prefers sites with regularly updated, high-quality, long-form original content. It might be an honest mistake on your part: forgetting to noindex automatically generated search results pages or landing page variants. We will get to these reasons and how to fix them in detail shortly. (According to Google’s official policy, content duplication does not in itself lead to penalties – but the algorithm will rank you lower nonetheless.)
- Low-quality content: It’s not enough to have original content, it should also be quality content. The algorithms analyze your copy based on dozens of factors – readability, length, editing and so on. So a few dozen badly written words on all your pages will not be beneficial in any way.
- Unnatural backlink profile growth: as I have mentioned, Google prefers if your growth is natural. Now, this is a factor that you have no direct control over: if a spiteful competitor sends your URL in to a bunch of link farms just to do you harm, you can only react to that. Luckily for you, disavowing links like these is fairly easy, and we have a step-by-step guide on how to do it in this article.
- Too many redirects, broken pages: always keep an eye on the internal linking structure of your website. Make sure that if you have redirects, they are 301 redirects pointing to relevant pages; otherwise, Google may decide you are tricking your users by promising them one thing and then forcing them to see something else. Also keep an eye on your broken pages and links: make sure that your internal linking is always up to date.
- Cloaking: if Google detects that you are showing different pages to users than to the crawlers, you can definitely count on being penalized: this is a very basic black hat technique aimed at deceiving both humans and robots. If you have any pages that – either intentionally or accidentally, as legacy perhaps – seem to be using cloaking, get rid of them right away. (You can find this out by comparing your actual site content to the versions you see at Google Search Console / Crawl / Fetch as Google.)
- First click free policy: the cloaking policy also applies to paywalled content. The “first click free” policy may have been officially removed from the guidelines, but if you show Google’s bots all of the content that your users can only access via subscription, log-in or payment, it is still a red flag.
- Keyword stuffing: it is very hard to imagine that if you are reading this article (which means you know at least something about SEO), there are keyword-stuffed pages on your site. But maybe one of your editors or developers made a mistake and published a half-done page. So make sure that none, absolutely none of the pages on your site are stuffed. Keyword density is usually optimal around 0.5-2% – not that the exact figure matters that much, but a much higher value may draw the attention of Google’s bots. So if you find anything like that, obliterate it.
- User-generated spam: at any given time, hundreds of thousands of bots are looking for sites with unprotected comment sections where they can spam their links. If you provide a way for users to interact, be sure to protect it against bots; otherwise, user-generated spam will appear as part of your site’s content, which in turn will harm your rankings or even lead to penalties.
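If you want a quick sanity check against keyword stuffing, a few lines of code are enough to approximate keyword density. This is a minimal sketch (the sample text and phrase are made up; a real page would need its HTML stripped first):

```python
# Rough keyword-density check: occurrences of a phrase vs. total word count.
# Anything far above the ~0.5-2% range mentioned above deserves a closer look.
def keyword_density(text: str, phrase: str) -> float:
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    # Each occurrence accounts for n words of the total.
    return (hits * n / len(words)) * 100 if words else 0.0

sample = "cheap shoes are the best cheap shoes because cheap shoes are cheap"
print(round(keyword_density(sample, "cheap shoes"), 1))  # 50.0 - obviously stuffed
```

A density of 50% for a two-word phrase, as in this exaggerated sample, is exactly the kind of page that should be rewritten or removed.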
If you are not deliberately using black hat techniques, then most penalties you get will be easily fixable and recovery will also likely be fast.
Never risk getting a penalty, because it will always cost you.
For more specific optimization tips, check out our very detailed SEO case study!
Changing your domain
If you move your entire site, you can always expect a drop in rankings, traffic, leads and everything else – simply because, even with proper redirects, Google will need time to crawl your new site.
There is no way to simply move the site, together with everything you have built, to a new domain without some sacrifices. If everything is in order, Google will crawl the new site in a few weeks, and your rankings will probably slowly recover.
You can speed up the process by providing a sitemap and using 301 (permanent) redirects.
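For a permanent domain move, the redirects should be 301s implemented at the server level. A minimal Apache sketch (the domain names are placeholders; on nginx or behind a Magento-specific setup the syntax will differ):

```apache
# .htaccess on the old domain: send every URL to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

Redirecting path-to-path (rather than everything to the new homepage) helps Google map the old pages to their new equivalents.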
In the old Search Console there was an option to use the Change of Address tool, but in the new Search Console this is not available.
Layered navigation: too many indexed pages
Here is something I encounter daily: sites indexing every single page – including automatically generated pages for search results and filters.
On platforms like Magento these pages tend to multiply like there is no tomorrow: for every unique search or combination of attributes (like size, price, weight and so on) a new page is generated.
However valuable other content is on your site, it basically won’t matter, as it will be lost among them.
Let me show you an example. Using the database of BuiltWith I chose a random Magento site, Efootwear.eu.
I ran a simple search in Google using the site: operator to find out how many indexed pages they have. It turns out they have a lot: around 204,000.
Now, 204K indexed pages is not necessarily a sign of an error: there are sites out there that sell thousands of products to dozens of different countries. If you have ~8000 different products, like auto parts and ship to 30 different countries, you may have this many pages in different languages even without including filters.
That is not the case here, however: Efootwear.eu beautifully demonstrates how the number of your indexed pages can blow up if you forget to noindex search results.
There are a large number of products on the site, so naturally, you can use different filters to find exactly what you are looking for.
The problem is that the developers of the site did not noindex result pages, so now the results for every single combination of every single filter are indexed by Google.
Again, this is a huge problem for multiple reasons.
- Google sees hundreds of thousands of almost identical pages on the site with very little content and attracting 0 organic traffic, making your overall statistics much, much worse.
- Google limits how much its crawlers fetch from each site daily – you can call it a crawl budget. If valuable content is being published on your site but your indexing settings are not right, it might not even get noticed, because the search engine will just crawl thousands of internal search results instead.
- If they had outbound links in the footer (luckily they don’t), those would also hurt the rankings of the linked site for the same reason: that site would get hundreds of thousands of identical links from a single domain, clearly signaling to Google that they are artificial.
How to fix this?
There are two basic things you have to do to avoid this.
Use canonical tags
Implement canonical tags on all category pages (and other pages where filters might be used), so Google knows that the generated filter pages are variants of that page; the filter pages will even pass link value to the canonical category page.
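In practice, this means that every filtered variant of a category page points back to the clean category URL in its head section. A sketch (the store URL and filter parameters are made up for illustration):

```html
<!-- On /shoes?color=red&size=42 (and every other filter combination),
     point Google at the clean category page: -->
<link rel="canonical" href="https://www.example-store.com/shoes" />
```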
This could be set up in the code of the store fairly easily. But this might not be enough if you already have a store set up with thousands of unnecessarily indexed pages. In this case, you probably have to use the Remove URLs tool in Google Search Console.
Disallow crawling in Robots.txt
The second thing you should do is disallow the crawling of such pages in your robots.txt.
In most cases this is fairly simple: for example, if filters appear in the URLs with a “?” character, simply include this line in the file:
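Assuming your robots.txt already has a User-agent: * block, the directive would typically look like this (verify against your actual URL structure before deploying):

```
Disallow: /*?
```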
Of course, there could be other patterns you have to disallow, for example if the “dir” parameter appears in the URLs:
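For Magento-style sorting and filter URLs, a pattern like this would block them (the exact parameter names are an assumption – check what your store actually generates):

```
Disallow: /*?dir=
Disallow: /*&dir=
```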
However, we advise that you at least consult a developer before you start disallowing pages and removing URLs – with a simple mistake you could do more harm than good by hiding more pages from Google than you wanted.
The best solution is to ask an SEO expert what exactly the problem is and then relaying that information to your developer so they can sort it out.
The Analytics tracking code
Sometimes when Analytics tells you that your organic traffic suddenly dropped it might actually be an entirely different issue.
We have seen this many times: when the code of the site is updated, new plugins are installed, or a complete redesign takes place, the tracking code can unintentionally be altered or deleted.
So even if there is a major event you can tie the drop to, like a redesign, it might be that your traffic actually stayed the same – you just can’t see it because of this simple bug.
How to check and fix the tracking code
First, in your Analytics account, visit Admin / Tracking Info / Tracking Code.
Check the status at the bottom of the page. Here you can clearly see if Analytics is receiving data or tracking is inactive. If the status is inactive, you should check the code that is embedded on your pages by comparing it to the code displayed here – and replace it with the correct one if necessary.
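For reference, a standard gtag.js tracking snippet looks roughly like this – the property ID (UA-XXXXXXX-Y here is a placeholder) must match the one shown on your Tracking Code screen, and the snippet should appear on every page, right after the opening head tag:

```html
<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-Y"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-Y');
</script>
```

If the ID on your pages differs from the one in your Analytics property, data will flow to the wrong property – or nowhere at all.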
If you are still having problems, check in with your developer, and make sure to check the Analytics tag setup help page.
Mobile-friendliness issues
It is a given that by 2019 more people browse the internet on their smartphones than on their desktops – however, according to the latest statistics, this rapid growth halted around mid-2017.
Another obvious fact is that in the past years (ever since Mobilegeddon), Google has preferred mobile-friendly websites. (Not simply responsive ones, mind you.) But there is a fairly recent development on this front: as of July 1st, 2019, Google changed the default indexing for new (previously not indexed) sites to mobile-first.
In other words, previously not being mobile-friendly only directly affected your rankings in the case of searches run on mobile – and of course indirectly by the users having a bad experience on your site if it does not load properly on their screen, bouncing from it, spending little time on your pages, etc.
(Not being mobile-friendly can also lead to a lot of abandoned carts, so implementing a mobile-first strategy can also lower your abandonment rates.)
Getting to our topic: if your organic traffic decreased, and you notice that it is mainly mobile visitors who previously arrived at your site that are now missing from the stats, you might want to check a few things:
- Has anything happened on the technical side that may have affected your mobile-friendliness?
- Has a competitor of yours switched to a mobile-friendly site that now outranks yours?
- Were there any algorithm changes that might have affected your rankings because mobile-related ranking factors were strengthened?
- Check your log files: Google may have started crawling your site more often with the mobile Googlebot user agent.
- Check the Search Console: your site may have been transferred to mobile-first indexing. In that case, you should have received a notification.
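Checking the log files for the smartphone Googlebot can be as simple as counting user-agent matches. A minimal sketch (the log lines below are fabricated samples; in practice you would read lines from your real access log):

```python
import re

# Fabricated access-log lines; replace with lines read from your real log file.
log_lines = [
    '66.249.66.1 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; '
    'Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (Linux; Android 6.0.1; '
    'Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) '
    'Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"',
]

# The smartphone Googlebot identifies itself with an Android mobile user agent.
mobile_hits = sum(bool(re.search(r"Android.*Googlebot", line)) for line in log_lines)
desktop_hits = sum(
    "Googlebot" in line and not re.search(r"Android.*Googlebot", line)
    for line in log_lines
)
print(mobile_hits, desktop_hits)  # 1 1
```

A rising share of mobile Googlebot hits over time is a strong hint that your site has moved (or is moving) to mobile-first indexing.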
In any case, we suggest regularly checking the mobile-friendliness of your site, at least by feeding your URL to the Mobile-Friendly Test and manually checking it on multiple devices, especially after technical updates to the site.
And if you need just a little more motivation to finally switch: by 2018, still only around half of the websites globally were mobile-friendly. Which means that you might be late, but you will still be ahead of approximately half of your competitors.
Competitors overtaking your rankings
A drop in your organic traffic might be caused by one or more of your competitors overtaking your previously good rankings on the search result pages.
First, you should regularly check how your site is doing in terms of rankings for the most important keywords. With Ahrefs you can do this quite simply: if you analyze a site in the Site Explorer feature, pick the Movements option under Organic Search.
Here you can see all the organic keywords where you lost or gained positions by date, complete with estimates about how this could affect your organic traffic. For example this is what the ranking movements for Ahrefs looked like in the first days of June 2019:
As you can see, they suffered two significant losses (maybe due to the Google core update taking place at the time): they fell back from first to fourth place for the “most searched on google” phrase, and to second place for “keyword search”. With these changes, they lost 2000+ organic visitors per month.
We can look at the backlinks referring to the individual pages: if a large number of backlinks were lost in the days before losing positions, with few gained, this could be a clear indicator – and it is not necessarily a sign of your competitor doing anything. In that case, you rank worse simply because your site was weakened while theirs was not.
So have a look at the New and Lost options under Backlink Profile:
We can see that the ahrefs.com/blog/top-google-searches/ page has gained five backlinks in the days prior, with only two of them carrying some notable value, but even those are relatively weak (with Domain Ratings of 39 and 36).
Slightly more backlinks were lost, but we can also see that these were not particularly valuable – possibly even malicious backlinks (have a look at the chapter we dedicated to the latter). So a change in the backlink profile of the page was likely not the reason for the change in ranking for the given keyword.
So let’s check out the current SERP positions for the keyword itself to see which pages began to rank better. Simply click on the keyword in the Recent Changes list and go to the SERP position history block.
In this case, it immediately becomes clear what happened: two of Google’s own pages began ranking better, which is not really a shocking development.
If this wasn’t the case, and one of your competitors overtook the position, what you have to do is look at the backlink profile of the page that is now ranking better. This way you can see which pages are referring to your competitors – giving you possible targets for your own link building.
You can also check the pages to see if there were any changes recently – updated content, optimized site framework, etc. If you are not closely monitoring the sites and pages of your competitors, I suggest using the Wayback Machine to see what might have changed.
A lack of fresh content
The freshness of your content is an important ranking factor in the Google algorithm, and it is not hard to see why. These days, everything is about providing the best possible results with the highest quality for users of the search engine.
This is not exactly a new development: the Google Freshness update was announced all the way back in November 2011, and it affected a third of all indexed pages – so it was a pretty decisive step.
This doesn’t necessarily mean that fresher content will always rank better; there are hundreds of other factors to consider. But as a rule of thumb, we can safely say that sites with regularly updated content do rank better overall.
If your site is static – if you publish a few pages and articles and just leave it there – users and bots alike will, after a time, wonder if the information is up to date and if your site is still functional.
So, how can you keep your site and content fresh?
- Create and use a content calendar. Planning your content creation for months ahead helps you keep up the pace, organize your ideas, allocate tasks and serve your audience better.
- Decide what kind of content you want to publish – this will greatly affect the rate at which you can create it. An in-depth, 5000+ word article can easily take at least two weeks to produce – including research, copywriting, illustrations and so on. Shorter posts can be done in a few hours.
- Refresh your older articles. Basically any in-depth content can be updated over time with new statistics, new solutions and methods, case studies and so on. Regularly check up on your older content and make the changes so they can bring you traffic as evergreen content.
- Reflect on recent events, news and developments in your niche. It makes sense that if you position yourself as an industry expert, you will have opinions about the events shaping that industry. This can be a good basis for up-to-date content, even newsjacking if you are fast and creative enough.
Your existing content might still be valuable and relevant, and for months or years it may still bring you traffic, even in increasing volume.
But sooner or later, newer, more in-depth articles will overtake it on the result pages – not just because users and bots will not be sure how up to date your content really is, but also because your competitors will be optimizing and consciously trying to outrank you.
Malicious backlinks
Your rankings can also drop if there are malicious backlinks pointing to your site. This can happen in a number of ways – most often because of artificially generated low-quality backlinks, or maybe because a competitor is trying to harm your backlink profile.
This is one of the reasons why it is important to constantly check your backlink profile, so you notice these kinds of malicious backlinks and can get rid of them.
I’ll walk you through the process step by step:
Step 1: Check your backlink profile
There are multiple ways to do this. The simplest – though not the best – is probably doing it in Google Search Console, as per Google:
- Choose the site you want on the Search Console home page.
- On the Dashboard, click Search Traffic, and then click Links to Your Site.
- Under Who links the most, click More.
- Click Download more sample links. If you click Download latest links, you’ll see dates as well.
You can find more info here.
A better option is a dedicated SEO tool – our favorite for this is Ahrefs. To show you the process, we are going to use one of our clients, CanadaHair, as an example.
First, open up the Site Explorer tool and click on Referring domains to see which sites are linking to yours.
Step 2: Filter domains that are low-quality or outright suspicious
Usually, these are the ones with a very low Domain Rating, but very low organic traffic or an unusually high number of backlinks pointing to your site can also be a red flag.
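If you export your referring domains to a CSV (most SEO tools, including Ahrefs, can export such lists), a short script can pre-filter the suspects for manual review. This is a sketch under the assumption that the export has "domain", "dr", "organic_traffic" and "backlinks_to_you" columns – check your tool's actual column names:

```python
import csv
import io

# Fabricated sample export; in practice: open("referring_domains.csv")
sample_csv = """domain,dr,organic_traffic,backlinks_to_you
goodblog.example.com,54,12000,3
rostyleandlife.info,0,17,1403
nichenews.example.org,41,800,2
"""

suspects = []
for row in csv.DictReader(io.StringIO(sample_csv)):
    dr = int(row["dr"])
    traffic = int(row["organic_traffic"])
    links = int(row["backlinks_to_you"])
    # Heuristic red flags: near-zero DR, near-zero organic traffic,
    # or an unusually high number of links pointed at your site.
    if dr < 10 or traffic < 50 or links > 100:
        suspects.append(row["domain"])

print(suspects)  # ['rostyleandlife.info']
```

The thresholds here are arbitrary starting points; the output is a shortlist for manual review, not an automatic disavow list.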
Step 3: Identify linking domains that look truly unnatural
Here, for example, you can see that the rostyleandlife.info domain:
- Has a non-existent DR (0/100), which means it is a completely untrustworthy site.
- Only 13 sites are linking to it, but it links out to almost 230K sites, which is ridiculously unnatural.
- It also has a very low organic traffic volume (17 clicks/month).
Basically, everything about this site screams suspicious.
So click on the “links to target” box in the row of the suspicious domain and let’s have a look at the particular page that contains the backlink.
Step 4: Check the site and the linking page
The URL and the title of the page also look unnatural – it contains a search phrase, but also an artificially generated string as you can see.
If we check out the site itself, we can see that there are only a few words of content that are clearly generated by some algorithm.
This is clearly a backlink you need to get rid of: it is on a site that Google doesn’t trust, placed in an artificially generated, extremely low-quality text block.
Step 5: Ask the webmaster to remove the backlinks
It’s not likely that you will get an answer, but it is worth a try: if you do, it is a much easier route than disavowing backlinks.
Step 6: Disavow the malicious backlink
You can find a detailed guide at Google on how to do this, and the process is fairly simple:
- Download the file containing the current backlinks as you can see in Step 1.
- Remove from the file the backlinks that are natural and valuable for your site.
- Go to the disavow links tool page.
- Select your website.
- Click Disavow links.
- Click Choose file.
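The disavow file itself is a plain text file: one URL or domain per line, with optional # comments. A sketch (the first entry is made up; the second is the suspicious domain from our example above):

```
# Disavowing a single URL:
http://spam-site.example.com/low-quality-page.html
# Disavowing an entire domain:
domain:rostyleandlife.info
```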
Please note that if you check your profile again in the future and find more malicious backlinks, you have to add them to this file.
If you go through this process regularly (I recommend once a month) you can maintain a natural backlink profile, which is an indicator to Google that your site is truly interesting and valuable enough that people link to it – and it also indicates that you are not trying to use grey-hat or black-hat SEO techniques to increase your rankings by manipulating the algorithm.
Identify who is missing (look at the segment)
If you have traffic missing, the cause might not be a general problem. It is possible that you just lost the attention of one or more specific segments for some reason. So it is important to keep an eye on the key audience metrics in Analytics, like demographic data (age, gender), language, location, the platforms they use to access your site and so on.
Different segments tend to have different problems, different rationalizations, needs – and also tend to search differently.
A good example: you migrate your store, and some of the product data gets left off the new product pages in the process, even though you still have the copy for it.
Of course, this will likely be accompanied by very noticeable movements in your ranking keywords. So be sure to cross-reference your Analytics and rankings data.
On how to check your ranking movement, check the Competitors overtaking your ranking section.
Once you know the exact segment you are losing and have a general idea about which ranking changes are causing it, it will be easier to correct – either by improving your product pages, general website copy or content.
If you have reached this point and carefully read every piece of advice, the only task that remains is to keep up the work.
Now you have everything to do that: consider this a checklist that you go through every month, and I guarantee that you are going to see your organic traffic begin to climb even in the first few weeks.
And if you have any remaining questions, you know what to do: contact us, book a meeting (you can do that free of charge, naturally), and ask them. Our team is always ready and eager to help online merchants and e-commerce businesses who look to become the leaders in their niche!