Welcome to the 3rd annual What’s in my SEO Toolbox post. SEO is always interesting. It’s always changing. But 2019 has been a particularly interesting year. How? Well, besides the typical algorithmic changes by the search engines, Google switched from desktop-first to mobile-first indexing. Then there’s voice search (Alexa, Google Home), which is becoming more and more important to a successful SEO campaign. And there’s ever-tighter integration with social media, along with new ways to leverage it to improve SEO. So once again, we SEOs are constantly having to re-evaluate the tools we use to do our jobs.
So without further ado, here is my SEO toolbox for 2019.
Web Crawling
Last year, I added Deep Crawl to my toolbox in addition to Screaming Frog. Because Deep Crawl is cloud-based and can handle any size web site, I kinda had to have it. But Screaming Frog has moved back into my number one position.
Top Prize: Screaming Frog
As of this writing, the most recent update to Screaming Frog is version 10.4, which came out in November 2018 and addressed some of the bugs present in the big September release. Between the time I wrote my post in January of 2018 and today, Screaming Frog’s engineers added a host of great new features: Database Storage Mode (Scale), In-App Memory Allocation, Store & View HTML & Rendered HTML, Custom HTTP Headers, XML Sitemap Improvements, Granular Search Functionality, Updated SERP Snippet Emulator and Post Crawl API Requests, Scheduling, Command Line Interface, Indexability & Indexability Status, XML Sitemap Crawl Integration, Internal Link Score, Post Crawl Analysis, Visualizations, AMP Crawling & Validation, Canonicals & Pagination Tabs & Filters, and Improved Redirect & Canonical Chain Reports. Yikes! Add to that the ability to pull in data from Google Analytics, Google Search Console, MOZ, Ahrefs and Majestic, the ability to include or exclude specific files, folders, and directories, and the export tool, and you have a really, really powerful tool at a really, really great price.
So what features am I most fond of?
Scheduling for one. Enter a name for the project, name the task, add your options and where you want the files to go when SF is done with them. Then hit OK and get some coffee. Or lunch.
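And if you’d rather drive crawls from cron than from the GUI scheduler, the version 10 command line interface will do the same job. Here’s a minimal sketch; the site and paths are hypothetical placeholders for my own setup, and the flag names are as I read the version 10 CLI docs, so confirm them with `screamingfrogseospider --help` on your install:

```python
import os
import subprocess
from datetime import date

# Hypothetical site and output location – adjust for your own setup.
site = "https://www.example.com"
out_dir = f"/home/seo/crawls/{date.today().isoformat()}"
os.makedirs(out_dir, exist_ok=True)

# Flag names are from the version 10 CLI documentation; verify on your install.
subprocess.run([
    "screamingfrogseospider",
    "--crawl", site,                  # start URL for the crawl
    "--headless",                     # run without the GUI
    "--save-crawl",                   # keep the .seospider crawl file
    "--output-folder", out_dir,       # where exports land
    "--export-tabs", "Internal:All",  # CSV export of the Internal tab
], check=True)
```

Drop that in a cron job (or Windows Task Scheduler) and you’ve rebuilt the Scheduling feature by hand.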
Visualization is another. Screaming Frog’s engineers really outdid themselves with this new addition, and there are so many good things to like about it that I don’t have the room here to go into them. So I’m going to pick two of my favorites. Both of these are courtesy of Screaming Frog.
There’s the Force-Directed Crawl Diagram. This one looks at the highest-value pages color-coded by link score, but you can do so much more with this tool that it’s scary.
Here’s another favorite. This one is a word cloud tool. The first screen capture looks at the inbound anchor text in cloud form.
And this one looks at body text as a word cloud.
So if you want to show your Clients what the focus of a specific web page is, there you go.
Even though you can allocate memory from within the tool, my biggest complaint with it was, and continues to be, that SF takes a lot of memory if you want to crawl a big site (or crawl a smaller site but render all the pages along with the crawl). You need a lot of memory on your system, and if you’re like me and work off a laptop, you’re kind of maxed out at about 16GB. But what if you need more? I found a way around it: I set up an instance on AWS and loaded Screaming Frog onto it. Now I have no problems with memory or hard drive space. The only downside of this approach is that you have to remember to log into AWS and stop the instance when the work is done. If you don’t, plan on getting some additional (and possibly unexpected) bills from AWS for the service.
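If, like me, you’re prone to forgetting, you can script the shutdown instead. A minimal sketch using boto3; the instance ID and region are placeholders for whatever AWS assigned you:

```python
import boto3

# Placeholder instance ID and region – substitute your own.
INSTANCE_ID = "i-0123456789abcdef0"

ec2 = boto3.client("ec2", region_name="us-east-1")

# Stop (not terminate) so the EBS volume – and your Screaming Frog
# install and crawl data – survives to the next session.
ec2.stop_instances(InstanceIds=[INSTANCE_ID])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[INSTANCE_ID])
print(f"{INSTANCE_ID} is stopped; the compute meter is off.")
```

Run that at the end of your crawl session and the billing meter stops with it; you pay a little for storage, but not for compute.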
There is so much here that I’m going to do a separate blog post really soon. In the meantime, you can download Screaming Frog here.
Runner Up: Deep Crawl
Deep Crawl does everything you could want from a crawler. Want to crawl just the top-level domain? Check. How about all of the subdomains? Yup. Compare your crawl data to your sitemaps, your Google Analytics data, and your Google Search Console data? Check, check and check. What about backlinks? Absolutely. You can import backlink data from Google Search Console or Majestic or just about any tool that exports to .csv. Do you have 10 sites you need to crawl on the 1st of every month? No problem. Just set up the scheduling and push the Save button. The magic happens without you having to watch anything, and you’ll get an email and a browser alert when it’s done. Oh, and this is my favorite: Deep Crawl does server log files.
The main dashboard (below) is clean and gives you a high-level overview of all the important categories. Click on any item anywhere on the dashboard and you can drill down for more information. The left-hand navigation is organized into big buckets. Click on any one of them to expand the bucket and see more reports.
One of the most informative reports is the Crawl Source Gap Analysis (below). This report compares the URLs found in the web crawl against the URLs found in each of the other sources. In the chart below, Deep Crawl shows that there were 402,262 URLs found in the web crawl (far left). That’s great, but there are an additional 115,350 URLs found via other sources that weren’t in the web crawl, meaning they’re “orphaned” URLs. The far right data point shows that 210,285 of the known URLs were also in the log file, but 307,327 were not – meaning that they weren’t hit by any of the search engine bots. (The math checks out: 402,262 crawled URLs plus 115,350 orphans is 517,612 known URLs, which is exactly the 210,285 found in the logs plus the 307,327 that weren’t.)
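If you don’t have Deep Crawl, you can rough out the same gap analysis yourself with a few URL lists and some set arithmetic. A minimal sketch, assuming you’ve exported your crawl, sitemap/GA/GSC, and log-file URLs to plain text files (the filenames here are hypothetical):

```python
# Hypothetical exports: one URL per line in each file.
def load_urls(path):
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

crawl = load_urls("web_crawl_urls.txt")       # URLs the crawler found
other = load_urls("sitemap_ga_gsc_urls.txt")  # URLs from sitemaps, GA, GSC
logs  = load_urls("log_file_urls.txt")        # URLs bots actually requested

known = crawl | other
orphaned = other - crawl           # known URLs with no internal links
missed_by_bots = known - logs      # known URLs never hit by a search bot

print(f"{len(crawl):>9,} URLs in the web crawl")
print(f"{len(orphaned):>9,} orphaned URLs")
print(f"{len(known & logs):>9,} known URLs present in the log file")
print(f"{len(missed_by_bots):>9,} known URLs never hit by a bot")
```

Deep Crawl’s version is prettier and automated, but the underlying idea is exactly this.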
The downside of Deep Crawl is the cost. Crawling 200,000 URLs a month will cost you $139 USD if you pay by the month – $1,668 over a year. Pay for the year all at one time and that same number of URLs costs $1,199. Either way, that’s a big number when compared to Screaming Frog SEO Spider’s annual license fee of $190.07 (at current exchange rates). Still, if you need the horsepower, there is no better dedicated web crawler.
Log File Analysis
It’s just my opinion, but not enough SEOs do log file analysis. If you’re looking to dive really deep into your site’s performance, you can’t just rely on Google Search Console’s crawl stats reports. They’re not complete, and in some respects, the data is a bit misleading. For example, GSC shows the number of pages crawled per day by Googlebot. But those aren’t unique pages. In GSC, a single page that Googlebot crawls 5 times a day will show up as 5 pages crawled. You can’t see the frequency with which the crawler hits individual pages, and you can’t use the GSC data to figure out which pages Googlebot thinks are most important.
To really see what’s going on, you need to look at the server logs.
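To make that concrete, here’s a minimal sketch of the kind of counting a log analyzer does, assuming a standard combined-format access log. Note the naive user-agent check: properly verifying real Googlebot traffic means a reverse-DNS lookup on the requesting IPs, which I’ve skipped here.

```python
import re
from collections import Counter

# Matches the request path and status code in combined log format.
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

hits = Counter()
errors_4xx = set()

with open("access.log") as f:  # hypothetical log file path
    for line in f:
        if "Googlebot" not in line:  # naive check; verify via reverse DNS in production
            continue
        m = LINE_RE.search(line)
        if not m:
            continue
        path, status = m.group(1), m.group(2)
        hits[path] += 1
        if status.startswith("4"):
            errors_4xx.add(path)

print(f"{sum(hits.values()):,} total Googlebot requests")  # what GSC-style counting reflects
print(f"{len(hits):,} unique URLs crawled")                # what you actually want to know
print("Most-crawled pages:", hits.most_common(5))
print(f"{len(errors_4xx):,} URLs returning 4XX to Googlebot")
```

A page Googlebot fetched five times shows up once in the unique count but five times in the total – exactly the gap GSC’s report papers over. And the per-URL frequencies hint at which pages Googlebot thinks matter most. That’s a few lines of Python, but a dedicated tool goes much deeper.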
Screaming Frog’s Log Analyzer allows you to upload your log files, identify crawled URLs and analyze search bot data and behavior for invaluable SEO insight. So, what can you do with it? You can:
- Identify crawled and uncrawled URLs
- Find broken links
- Get a complete list of 4XX errors
- Discover crawl frequency
- Audit redirects
- Identify large and slow pages
- Improve your crawl budget
Again, this tool isn’t free. But if you really want to get into the weeds on your SEO, you need it. Download it here.
Rank Tracking
I love STAT. It doesn’t connect to GA or GSC, so you have to do some legwork and create a keyword list yourself. But once you give STAT that list, it will go out and collect a ton of data from actual SERPs. The basic dashboard (below) shows how many keywords you’re tracking, your average rank, your share of voice across those keywords, daily and 30-day trends, and more.
Beyond that, you can see whether your site appears in carousels, news listings, knowledge panels, answer boxes, people also ask boxes, related questions boxes, maps, images, videos and organic listings at the keyword level. In addition, you can look at where you rank on mobile SERPs vs. desktop SERPs, keyword by keyword. Not just for you, but also for your competition. And you can schedule reports and alerts anytime keywords change position or visibility.
All this information comes at a price. The cheapest plan STAT offers is $600 a month, and that’s a lot of money to pay out every month. But you do get a dedicated account manager, a tech support department that responds within 24 hours, and daily ranking data. Sign up here. If you’re looking for something more expansive – more like an Enterprise SEO tool that also does rank tracking – check out Raven Tools.
Keyword Research
Market Samurai is my stealth tool. Not enough people know about it and that’s a shame. It’s a damn good tool. I use it for two specific tasks: Keyword Research and Competitive Research.
Keyword research couldn’t be easier with this tool. Just create a project, add your keywords (one per tab), and Market Samurai will tell you all of the variations on that term, as well as the number of searches (daily, weekly or monthly), the number of competitors you’re going to be up against, and the amount you might expect to pay for each click in a paid search campaign. Also helpful is the graph showing the 12-month trend for searches on that keyword.
There is a free version of Market Samurai as well as a paid version; you can download it here. Your choice.
Paid Search, Competitive Research and Algorithm Volatility
SEMRush does so many things and does them so well that it’s difficult to know where to start. It’s the Swiss Army Knife of SEO.
I use it for organic keyword research, paid search research, domain research, competitive research, traffic analysis, ranking data, brand monitoring and site analysis. I also use it for some backlink research (though my favorite tool there is Majestic).
SEMRush is a great tool to use when you don’t have access to a web site’s actual analytics. What do I like most? I’d have to say it’s the UI. It’s brilliant. Lots of data displayed intelligently with great visuals. Click on any section and you can drill down into the details.
One more thing. Anyone who deals with optimizing a web site for Google’s SERPs knows that Google is constantly tweaking its algorithm. Keeping track of what’s going on is a constant challenge. There are lots of tools to do this, but my go-to choice is SEMRush. You can view the volatility data by category or by device type. It’s accurate and, best of all, it’s easy to understand. For you and your Clients.
You can find SEMRush here.
Backlink Stuff
Whether you’re trying to build links to your site, or you’re managing the link profile for a client and trying to decide which links to disavow, there’s only one tool to consider: Majestic. Period.
Majestic’s tool is the bomb. The link database is the biggest out there (really) and it’s updated constantly. As of this writing, the Fresh Index has crawled 362,358,798,742 URLs and discovered 854,172,538,158 links. The Historic Index contains 6,659,283,985,220 links. And all links (and domains) are analyzed for topicality and authority.
Majestic’s toolset is divided into several areas. The Site Explorer lets you explore a domain/URL in great detail. The Backlink Checker is designed to very quickly check backlink and referring domain counts for the top 10 billion most-linked-to URLs, plus all subdomains and root domains present in the database. The Keyword Checker shows how often keywords or key phrases appear in the indices; you can expand the list by adding multiple phrases, allowing you to compare how much interest and competition surrounds a set of keywords or key phrases. And while most keyword tools rely on paid search data, Majestic’s keyword data is based primarily on organic data. Finally, the Neighborhood Checker can check whether all of a site’s links are coming from the same server or server farm.
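As a side note, you can crudely approximate what the Neighborhood Checker does yourself: resolve each referring domain to an IP and group by subnet. A rough sketch, assuming a plain-text list of referring domains (hypothetical filename) and using a /24 subnet as a stand-in for “same neighborhood”:

```python
import socket
from collections import defaultdict

# Hypothetical export: one referring domain per line.
with open("referring_domains.txt") as f:
    domains = [line.strip() for line in f if line.strip()]

neighborhoods = defaultdict(list)
for domain in domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue  # domain no longer resolves
    subnet = ".".join(ip.split(".")[:3])  # /24 "neighborhood"
    neighborhoods[subnet].append(domain)

# Many referring domains on one subnet can signal a link network.
for subnet, members in sorted(neighborhoods.items(), key=lambda kv: -len(kv[1])):
    if len(members) > 1:
        print(f"{subnet}.0/24: {len(members)} domains -> {members[:5]}")
```

Take the output with a grain of salt: shared hosting and CDNs put plenty of innocent sites on the same subnet. Majestic’s version is far more nuanced, which is part of what you’re paying for.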
You can sign up for Majestic here.
Paid Search
I like SpyFu for this. It’s a versatile tool that provides a wealth of information (organic, paid, etc.) on what’s going on with your competition. But what I use SpyFu for more than anything else is its paid search marketing information. SpyFu will not only tell me which keywords are being targeted for paid search ads, but also when the ads ran, what position they ran in, and what percentage of ads were served for each campaign. Why is that valuable? Because I can see at a glance what’s working and what isn’t.
You can get SpyFu here (https://www.spyfu.com/).
There are a lot more tools that I like and work with, but these are the SEO tools I can’t live without.