It’s been nearly a year since I wrote the original version of this post. The one constant is that nothing in SEO is constant. Google updates its algorithms, updates its quality guidelines, and adds and removes features on its SERPs. All of that means we are constantly having to re-evaluate the tools we use to do our jobs.
So without further ado, here is my SEO toolbox for 2018.
Web Crawling
I did not remove last year’s choice of web crawler (Screaming Frog SEO Spider) from my toolbox; I added another one: Deep Crawl. I could not do my job without it. Deep Crawl is cloud-based and can handle a site of any size. One million pages? Easy. Five million pages? Hey, this program is barely breaking a sweat.
Deep Crawl does everything you could want from a crawler. Want to crawl just the top-level domain? Check. How about all of the subdomains? Yup. Compare your crawl data to your sitemaps, your Google Analytics data, and your Google Search Console data? Check, check and check. What about backlinks? Absolutely. You can import backlink data from Google Search Console or Majestic or just about any tool that exports into a .csv format. Do you have 10 sites you need to crawl on the 1st of every month? No problem. Just set up the scheduling and push the Save button. The magic happens without you having to watch anything, and you’ll get an email and a browser alert when it’s done. Oh, and this is my favorite: Deep Crawl does server log files.
The main dashboard (below) is clean and gives you a high-level overview of all the important categories. Click on any item anywhere on the dashboard and you can drill down for more information. The left-hand navigation is organized into big buckets. Click on any one of them to expand the bucket and see more reports.
One of the most informative reports is the Crawl Source Gap Analysis (below). This report shows how many URLs from the web crawl were found or not found in each of the other sources. In the chart below, Deep Crawl shows that there were 402,262 URLs found in the web crawl (far left). That’s great, but there are an additional 115,350 URLs found via other sources that weren’t in the web crawl, meaning they’re “orphaned” URLs. The far right data point shows that 210,285 of those URLs were also in the log file, but 307,327 were not – meaning that they weren’t hit by any of the search engine bots.
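If you want to sanity-check numbers like these yourself, the underlying logic is just set arithmetic. Here’s a minimal Python sketch, assuming you’ve exported plain-text URL lists from your crawler, your sitemaps/analytics, and your log files (the file names are hypothetical):

```python
# Rough sketch of a crawl-source gap analysis done by hand, assuming you have
# three plain-text URL lists exported from your crawler, sitemaps/GA/GSC,
# and your server logs. File names here are hypothetical.

def load_urls(path):
    """Read one URL per line into a set, ignoring blank lines."""
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

crawl_urls = load_urls("web_crawl_urls.txt")      # URLs found by the web crawl
other_urls = load_urls("sitemap_and_ga_urls.txt") # URLs from sitemaps, GA, GSC, etc.
log_urls   = load_urls("googlebot_log_urls.txt")  # URLs the bots requested (from logs)

orphaned = other_urls - crawl_urls        # known to other sources, not linked internally
all_known = crawl_urls | other_urls
not_hit_by_bots = all_known - log_urls    # never requested by the search engine bots

print(f"Web crawl URLs:  {len(crawl_urls):,}")
print(f"Orphaned URLs:   {len(orphaned):,}")
print(f"Not hit by bots: {len(not_hit_by_bots):,}")
```

Deep Crawl does all of this joining for you, of course – the point is just that the report is answering simple set questions about where each URL did and didn’t show up.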
The downside of Deep Crawl is the cost. In order to crawl 1.5 million URLs a month, it’s going to cost you $499 USD if you pay by the month. That works out to $5,988 a year. If you can pay for the year all at one time, that same number of URLs will cost you $5,489 – a savings of about $500. That’s a big number when compared to Screaming Frog SEO Spider’s annual license fee of $203 (at current exchange rates). Still, if you need the horsepower, there is no better dedicated web crawler.
As I mentioned, Screaming Frog SEO Spider – last year’s toolbox choice – is still in my toolbox. Screaming Frog won’t run in the cloud, so you’ll need a really powerful machine with a lot of memory. I run SF on a Dell workstation with two quad-core Xeon processors and 24 GB of RAM, and 600K URLs is about where the system starts to chug. SF requires Java to run, and occasionally the program will be running along and then just crash without warning. To be fair, it’s almost invariably due to me not upgrading to the latest version of Java. While you can run multiple instances of SF at the same time on the same desktop, I don’t recommend it.
But there’s a lot of goodness here. The UI was recently updated, and the team at SF added the ability to pull in data from Moz, Ahrefs and Majestic. And of course you can still export all your data into Excel for additional analysis. Really, really good stuff.
You can download it here.
Log File Analysis
It’s just my opinion, but not enough SEOs do log file analysis. If you’re looking to dive really deep into your site’s performance, you can’t just rely on Google Search Console’s crawl stats reports. They’re not complete, and in some respects, the data is a bit misleading. For example, GSC shows the number of pages crawled per day by Googlebot. But those aren’t unique pages. In GSC, a single page that Googlebot crawls 5 times a day will show up as 5 pages crawled. You can’t see the frequency with which the crawler hits individual pages, and you can’t use the GSC data to figure out which pages Googlebot thinks are most important.
To really see what’s going on, you need to look at the server logs.
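As an illustration of the difference, here’s a minimal sketch that counts Googlebot requests per URL straight from an access log. The access.log path and the combined log format are assumptions, and a proper Googlebot check would use reverse DNS rather than a simple user-agent match:

```python
# Minimal sketch: count Googlebot requests per URL from a combined-format
# access log. Five hits on one page is 5 "pages crawled" in GSC's report,
# but only 1 unique URL here - which is the distinction that matters.
import re
from collections import Counter

line_re = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" \d{3}')

hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        if "Googlebot" not in line:   # naive UA match; verify with reverse DNS in practice
            continue
        m = line_re.search(line)
        if m:
            hits[m.group("url")] += 1

print(f"Total Googlebot requests: {sum(hits.values()):,}")
print(f"Unique URLs requested:    {len(hits):,}")
print("Most-crawled URLs (a rough proxy for what Googlebot thinks matters):")
for url, count in hits.most_common(10):
    print(f"{count:>6}  {url}")
```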
Screaming Frog’s Log Analyzer allows you to upload your log files, identify crawled URLs and analyze search bot data and behavior for invaluable SEO insight. So, what can you do with it? You can:
- Identify crawled and uncrawled URLs
- Find Broken Links
- Get a complete list of 4XX errors
- Discover Crawl Frequency
- Audit Redirects
- Identify Large and Slow Pages (see the sketch after this list)
- Improve Your Crawl Budget
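To give a feel for the “large and slow pages” item, here’s a rough sketch of the same idea done by hand. It assumes an nginx-style log where the response size in bytes and the request time in seconds are the last two fields on each line – adjust the parsing to your own log format. (The Log Analyzer does all of this for you through the UI.)

```python
# Rough sketch: find the largest and slowest URLs from an access log, assuming
# the last two fields per line are bytes sent and request time in seconds.
from collections import defaultdict

size_sum, time_sum, hits = defaultdict(int), defaultdict(float), defaultdict(int)

with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        parts = line.split()
        try:
            url = parts[6]                  # request path in a combined-format line
            bytes_sent = int(parts[-2])     # assumed: response size field
            request_time = float(parts[-1]) # assumed: request time field
        except (IndexError, ValueError):
            continue                        # skip lines that don't match the format
        size_sum[url] += bytes_sent
        time_sum[url] += request_time
        hits[url] += 1

slowest = sorted(hits, key=lambda u: time_sum[u] / hits[u], reverse=True)[:10]
for url in slowest:
    avg_ms = 1000 * time_sum[url] / hits[url]
    avg_kb = size_sum[url] / hits[url] / 1024
    print(f"{avg_ms:8.0f} ms  {avg_kb:8.1f} KB  {url}")
```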
Again, this tool isn’t free. But if you really want to get into the weeds on your SEO, you need it. Download it here.
Rank Tracking
This one is a switch. Last year, I was using Authority Labs to do rank tracking. This year, I’ve switched over to STAT.
Ever since Google removed keyword information from Google Analytics, it’s been harder and harder to see which keywords drive traffic to individual pages and where those pages appear on the SERPs for each of those keywords. STAT goes a long way towards replacing that information.
Unlike Authority Labs, STAT does not connect to GA or GSC. You have to do some legwork and create a keyword list. But once you give STAT that list, it will go out and collect a ton of data from actual SERPs. The basic dashboard (below) shows how many keywords you’re tracking, your average rank, your share of voice across those keywords, daily and 30-day trends, and more.
Beyond that, you can see whether your site appears in carousels, news listings, knowledge panels, answer boxes, people also ask boxes, related questions boxes, maps, images, videos and organic lists at the keyword level. In addition, you can look at where you rank at the keyword level on mobile SERPs vs desktop SERPs. Not just for you, but also for your competition. And you can also schedule reports and alerts anytime keywords change position or visibility.
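To make “average rank” and “share of voice” concrete, here’s a back-of-the-envelope version of both metrics, assuming a rankings.csv export with keyword, rank and search_volume columns. The file, the CTR curve and the formula are my own simplifications – STAT’s actual share-of-voice calculation may differ:

```python
# Back-of-the-envelope average rank and share of voice from a keyword export.
# rankings.csv (hypothetical) has columns: keyword, rank, search_volume.
import csv

# Very rough CTR-by-position curve used to weight each ranking.
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
       6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.018}

rows = list(csv.DictReader(open("rankings.csv", encoding="utf-8")))

ranked = [int(r["rank"]) for r in rows if r["rank"].isdigit()]
avg_rank = sum(ranked) / len(ranked)

total_demand = sum(int(r["search_volume"]) for r in rows)
captured = sum(int(r["search_volume"]) * CTR.get(int(r["rank"]), 0)
               for r in rows if r["rank"].isdigit())
share_of_voice = 100 * captured / total_demand

print(f"Keywords tracked: {len(rows)}")
print(f"Average rank:     {avg_rank:.1f}")
print(f"Share of voice:   {share_of_voice:.1f}%")
```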
All this information comes at a price. The cheapest plan STAT offers is $600 a month. But for that, you get a dedicated account manager, a tech support department that responds within (in my experience) 24 hours, and daily ranking data. Sign up here.
Keyword Research
This hasn’t changed from last year. I still love Market Samurai, and I use it for two specific tasks: keyword research and competitive research.
Keyword research couldn’t be easier with this tool. Just create a project, add your keywords (one per tab) and Market Samurai will tell you all of the variations on that term, as well as the number of searches (daily, weekly or monthly), the number of competitors you’re going to be up against, and the amount of money you can expect to pay for each click in a paid search campaign. Also helpful is the graph showing the 12-month trend for searches on that keyword.
There is a free version of Market Samurai as well as a paid version. You can download it here. Your choice.
Social Media Monitoring
Let me say this upfront: there is no evidence of any direct causal relationship between likes or retweets or pins and organic search position. Having said that, there are correlations between social media mentions and organic search positions. Why? Because the more visibility your company or brand has on social media, the more likely someone is to click on your content when it appears on a SERP. For this reason, I always advise my clients to coordinate their social media and organic search campaigns. And the tool I use when it comes to monitoring social mentions and correlating that information with my organic traffic is Nuvi.
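If you want to put a number on that correlation for your own site, a quick sketch like this is enough. The weekly mention counts and organic session figures below are made up; you’d substitute your own exports from Nuvi and GA:

```python
# Minimal sketch: Pearson correlation between weekly social mentions and
# weekly organic sessions. The numbers below are made up for illustration.
from statistics import correlation  # Python 3.10+

weekly_mentions = [120, 95, 180, 240, 150, 210, 300, 260]
weekly_organic_sessions = [4200, 3900, 4800, 5600, 4500, 5200, 6100, 5900]

r = correlation(weekly_mentions, weekly_organic_sessions)
print(f"Pearson r between mentions and organic sessions: {r:.2f}")
# Correlation, not causation - but a strong r is a good argument for
# coordinating your social and organic campaigns.
```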
There are so many ways to use this tool that it’s scary. OK, maybe not scary. But it’s overwhelming. You can monitor all of your social media platforms in one analytics package. You can track hashtags and see when and where you get traction. You can see who your key influencers are and contact them directly from the app. You can produce fully interactive reports with live data. Here’s a sample screenshot from a couple of years ago when the Broncos were in the Super Bowl:
Nuvi is another paid-access app. No download is required, but you will have to fill out a form and go through their sales pitch before you can see anything live.
Sign up here: https://www.nuvi.com/
Paid Search, Competitive Research and Algorithm Volatility
SEMRush does so many things, and does them so well, that it’s difficult to know where to start. It’s the Swiss Army Knife of SEO.
I use it for organic keyword research, paid search research, domain research, competitive research, traffic analysis, ranking data, brand monitoring and site analysis. I also use it for some backlink research (though my favorite tool there is Majestic).
SEMRush is a great tool to use when you don’t have access to a web site’s actual analytics. What do I like most? I’d have to say it’s the UI. It’s brilliant. Lots of data, displayed intelligently with great visuals. Click on any section, and you can drill down into the details.
One more thing. Anyone who deals with optimizing a web site for Google’s SERPs knows that Google is constantly tweaking its algorithm. Keeping track of what’s going on is a constant challenge. There are lots of tools to do this, but my go-to choice is SEMRush. You can view the data by category or by device type. It’s accurate and, best of all, it’s easy to understand – for you and your clients.
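For anyone curious what a “volatility” number actually measures, here’s a toy illustration: the average absolute day-over-day rank change across a keyword set. This is my own simplification with made-up data, not SEMRush’s actual calculation:

```python
# Toy volatility score: average absolute day-over-day rank change.
# This is a simplification for illustration, not SEMRush's actual formula.

# Rank history per keyword: (yesterday's rank, today's rank), made-up data.
rank_history = {
    "blue widgets":   (4, 9),
    "widget reviews": (2, 2),
    "buy widgets":    (7, 3),
    "widget store":   (12, 18),
}

changes = [abs(today - yesterday) for yesterday, today in rank_history.values()]
volatility = sum(changes) / len(changes)
print(f"Average absolute rank change: {volatility:.1f} positions")
# A sudden spike in this number across many tracked sites is the signal that
# an algorithm update may be rolling out.
```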
You can find SEMRush here.
Backlink Stuff
Whether you’re trying to build links to your site, or you’re managing the link profile for a client and trying to decide which links to disavow, there’s only one tool to consider: Majestic. Period.
Majestic’s tool is the bomb. The link database is the biggest out there (really) and it’s updated constantly. As of this writing, the Fresh Index has crawled 362,358,798,742 URLs and discovered 854,172,538,158 links. The Historic Index contains 6,659,283,985,220 links. And all links (and domains) are analyzed for topicality and authority.
Majestic’s toolset is divided into several areas. The Site Explorer lets you explore a domain or URL in great detail. The Backlink Checker is designed to very quickly check backlink and referring domain counts for the top 10 billion most-linked-to URLs in the database, plus all subdomains and root domains present in the database. The Keyword Checker shows how often keywords or key phrases appear in the indices; you can expand the list by adding multiple phrases, allowing you to compare how much interest and competition surrounds a set of keywords or key phrases. While most keyword tools rely on paid search data, Majestic’s data is based primarily on organic data. And the Neighborhood Checker can check whether all of a site’s links are coming from the same server or server farm.
You can sign up for Majestic here.
Paid Search
I like SpyFu for this. It’s a versatile tool that provides a wealth of information (organic, paid, etc) on what’s going on with your competition. But what I use SpyFu for more than anything else is its paid search marketing information. SpyFu will not only tell me what keywords are being targeted for paid search ads, but it will tell me when the ads ran, what position they ran in, and what percentage of ads were served for every campaign. Why is that valuable? Because I can see at a glance what’s working and what isn’t.
You can get SpyFu here (https://www.spyfu.com/).
There are a lot more tools that I like and work with, but these are the SEO tools I can’t live without.