It’s the situation every webmaster or SEO person dreads: you log in to your Google Analytics account only to see that your traffic has nosedived into oblivion. Now is not the time to panic, though! Websites, and how they acquire traffic, are far more complex than they used to be, so finding the source of the problem needs to be tackled methodically. Here’s a short guide to the main areas you should be investigating.
Look Closely At Traffic Source Changes
The first thing to do when you see the traffic dip is to find its exact source. Go to Acquisition > Overview and check all your traffic sources; if it’s an SEO issue, your organic traffic will be down. If you get a lot of referral traffic from other sites or social media platforms and that has dropped, make sure any ads or links you have there are still up and running.
Inspect On Page Elements
Is the title tag still the same, and is all the other meta information correct? For high-traffic, highly competitive terms, changing an optimized title tag can cause a page or site to plummet from page one of the rankings, and all your organic traffic will go with it. Other things to look out for are noindex tags and nofollow attributes on your inner page links. If all the links to your inner pages are nofollowed, Google’s bots cannot follow them and those pages will lose their rankings. H1 and H2 tags are also very important, and something as simple as removing a target keyword from them can cause a page to suffer.
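As a rough sketch of how you might automate this check, here is a small script using only Python’s standard library that scans a saved HTML page for the two problems described above: a robots meta tag containing noindex, and inner links that have been nofollowed. The sample HTML is an assumption for illustration, not a real page.

```python
# Minimal on-page audit sketch: find robots meta directives and
# nofollowed links in an HTML document, using only the stdlib.
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Collects robots meta directives and nofollowed link hrefs."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # <meta name="robots" content="..."> controls indexing of the page
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())
        # rel="nofollow" on a link tells crawlers not to follow it
        if tag == "a" and "nofollow" in attrs.get("rel", "").lower():
            self.nofollow_links.append(attrs.get("href", ""))

# Illustrative page with both problems present:
html = """
<html><head>
  <title>Example</title>
  <meta name="robots" content="noindex, nofollow">
</head><body>
  <a href="/inner-page" rel="nofollow">Inner page</a>
  <a href="/other-page">Other page</a>
</body></html>
"""

auditor = OnPageAuditor()
auditor.feed(html)
print(auditor.robots_directives)  # ['noindex, nofollow']
print(auditor.nofollow_links)     # ['/inner-page']
```

Running something like this against a saved copy of each key page, before and after a site change, makes an accidental noindex or nofollow easy to spot.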
Login to Google Webmaster Tools
Here you need to check your site messages: if you have received a manual Google penalty, there should be a message here. Note, though, that many Google penalties (e.g. Panda) do not come with a warning. Then check the following:
Index status – here you can see how many pages of your site Google has indexed; a sudden drop in this number means there is a problem.
Crawl errors – if your site has experienced any downtime or has too many 404 pages, this will not only have a major effect on how Google sees your site, it can also cause users to leave and not come back!
Robots.txt status – just like on-page elements, the robots.txt file can be very powerful and can cause a huge number of problems if not set up correctly. Use the tester tool to make sure Googlebot is allowed to crawl the correct pages and sections of your site. Here’s a fantastic guide from Yoast on how to correctly configure a robots.txt file.
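You can also sanity-check robots.txt rules offline with Python’s standard `urllib.robotparser` module. The rules below are illustrative assumptions (a common WordPress-style setup), not a recommended configuration; note that Python’s parser applies rules in order, which is why the Allow line comes first here.

```python
# Offline robots.txt check using the stdlib robots.txt parser.
from urllib.robotparser import RobotFileParser

# Illustrative rules: block the admin area but allow admin-ajax.php.
robots_txt = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard (*) group here, since there is
# no Googlebot-specific group in this file.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))       # True
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/users"))  # False
```

A quick script like this lets you confirm, before deploying a robots.txt change, that you haven’t accidentally blocked the pages that drive your organic traffic.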
Check Any Page Redirects
If you had any pages set up to redirect elsewhere, make sure all those 301s are still in place. If any of them have been removed or broken, all the traffic going to those pages is going to see a big, fat 404 page, and too many of these will get your site severely penalized by Google!
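If your site runs on Apache, 301s typically live in an .htaccess file (or the server config), so that is the first place to look for a rule that has been deleted or mangled. A minimal sketch, with assumed paths:

```apache
# Illustrative .htaccess 301 redirects — the paths are assumptions.
# Permanent redirect for a single moved page:
Redirect 301 /old-page /new-page

# Or, with mod_rewrite, redirecting a whole renamed directory.
# R=301 marks the redirect as permanent; L stops further rule processing.
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

Keeping these rules in version control makes it easy to see exactly when and how a redirect disappeared.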
These are just some of the main things to check as your first port of call when investigating a traffic drop, but there are many others, ranging from canonical tags being removed to footer and navigation links breaking, that can cause your site’s traffic to crash. Most, if not all, of these problems are caused by humans making changes to your site, which is why it’s crucial to keep a log of all the changes and edits you or your team make to your website.
Author: David Jones
David is lead digital marketer at Performancing, he also blogs at Bloggingtips.com and Bloggingpro.com.