Every project starts somewhere. Ours starts with a request that you, as the Client, take a few minutes to fill out our Client Questionnaire. (Well, maybe more than a few.) It’s a detailed document that allows us to better understand your history with respect to SEO and your ultimate goals. It’s delivered to you as a Word document prior to the beginning of the audit process and can usually be completed in about an hour.
Phase One: Onboarding, Site Audit, Baseline Report and Top Level Insights
Determination of Success Metrics
No project like this can be successful if the parties don’t have a complete and thorough understanding of how success is going to be measured. So as soon as the Client Questionnaire is done, the very next thing we’re going to do together is determine how we’re going to measure our collective success.
Technical / Creative Site Analysis
The Technical / Creative Site Analysis report isn’t something we do just for the Client. We need to do this anyway so that we can gain a deeper level of understanding into what is – and isn’t – working on the web site, and what parts of the current strategy might need to be changed. There are over 130 different “signals” that Google’s web search algorithm listens for and some are more important than others. At a minimum, the signals we’ll be looking at for this part of the analysis are:
- Site structure
- Server log files
- Sitemaps (XML and HTML)
- Usability / Responsiveness of the design
- Page load times / time to first meaningful paint
- Page structure (content hierarchy, Heading tags)
- Meta data (Page TITLES, META descriptions)
- Navigation, both global and local navigation, including the anchor text chosen and the structure of the linking URL
- The use of rich data tagging (schema.org, RDFa, microdata, etc.)
- The file/directory naming conventions
- AMP compliance
- Robots.txt file
In addition to the items mentioned above that we’ll critique, we’ll also look at your CMS platform to see how it measures up to SEO Best Practices and how easily it can be adapted (if necessary).
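As a concrete illustration of the rich data tagging mentioned above, a minimal schema.org block in JSON-LD form looks like the following (the organization name and URL are invented placeholders, not a template we apply verbatim):

```html
<!-- Minimal schema.org Organization markup in JSON-LD, embedded in the page <head>.
     Values below are hypothetical examples. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com/",
  "logo": "https://example.com/logo.png"
}
</script>
```

During the audit we check whether tags like these are present, valid, and consistent with the visible page content.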
The Technical / Creative Site Analysis is delivered as a PowerPoint Presentation showing screen shots of the web site, with callouts showing areas where good and bad SEO practices have been employed as well as suggestions for improvement.
Keyword Research
Most companies will rank well for their Brand terms, so the challenge here is to find opportunities with those non-Brand terms that the Company wants to rank well for and then define and implement a strategy.
Keyword research isn’t what it used to be. It’s still all about finding opportunities, but the whole process has changed since Google switched organic keyword data to [not provided] in Google Analytics (GA).
Unlike GA, Google Search Console (GSC) provides a wealth of information about which keywords people used to come to your web site. GSC reports on what happened before visitors arrived at the web site (i.e. how many times your web site appeared on the SERPs for a given keyword, how many times people clicked on your link, the average position your site appeared in on the SERPs, and the click-through rate). GA, by contrast, was all about what people did after they came to the site. Even though GA now hides virtually all of the keyword data, it’s still important to look at both reports when looking for keyword opportunities.
Using some off-the-shelf tools, as well as some custom developed tools, we analyze the keyword data from both sources and present a matrix that shows the areas of greatest opportunity. We will identify the low hanging fruit – those keywords/pages that have substantial search volume and that are already ranking in the top 100 results. In these situations, with some basic optimization efforts it’s relatively easy to move pages that appear on page 8 of the SERPS to page 3 or page 4.
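The bucketing behind that matrix can be sketched in a few lines of Python. The rows and thresholds below are hypothetical stand-ins for a real Google Search Console performance export (query, impressions, clicks, average position), not fixed rules we apply to every site:

```python
# Sketch of the keyword-opportunity bucketing described above.
# Rows mimic a GSC performance export: (query, impressions, clicks, avg position).
# MIN_IMPRESSIONS and the position cutoffs are illustrative assumptions.

rows = [
    ("blue widgets",       5400, 12, 74.2),  # page 8 -> low-hanging fruit
    ("widget repair",      2100, 90,  8.5),  # page 1 -> competitive
    ("buy widgets online", 3300, 40, 14.1),  # page 2 -> competitive
    ("widget history",       90,  1, 55.0),  # too little volume to chase
]

MIN_IMPRESSIONS = 500  # enough search volume to matter (assumption)

def bucket(impressions, position):
    """Classify a query into an opportunity bucket."""
    if impressions < MIN_IMPRESSIONS:
        return "ignore"
    if position <= 20:
        return "competitive"   # pages 1-2: hard, high-value moves
    if position <= 100:
        return "low-hanging"   # in the top 100 but deep: basic optimization helps
    return "ignore"

matrix = {}
for query, impressions, clicks, position in rows:
    ctr = clicks / impressions  # click-through rate, as GSC reports it
    matrix.setdefault(bucket(impressions, position), []).append((query, ctr))

for name in ("low-hanging", "competitive"):
    print(name, [q for q, _ in matrix.get(name, [])])
```

In practice the real matrix also folds in data from the third-party tools, but the core idea is exactly this triage.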
We’ll also identify those keywords where more effort is required to move the needle in terms of ranking. These are typically keywords/pages that appear on the first or second page of the SERPs. These positions are highly sought after and moving up from page 2 to page 1, or from the bottom of page 1 to the top of page one, is substantially more difficult. We will present the matrix and strategies for both categories of opportunities.
Competitive Analysis
With most businesses, it’s easy to see who your competitors are: If you’re a Marketer, you look at the size, market cap or distribution network for companies who produce or sell what it is that you produce or sell. If you’re a Publisher, you look at other publishers who are targeting your vertical and/or demo. But on a search engine, your competitive set is going to be different. Web sites get “visibility” on search engine results pages based on how “authoritative” the search engine thinks the site is and how “relevant” the search engine thinks a page is to a search query. Often, the pages you’re competing against for visibility on the SERP are from people or companies who weren’t even on your radar screen.
Therefore, the competitive analysis begins with research to determine who the web site is really competing against.
- Competition by keyword – Looks at the top 10 web sites returned on the SERPs for each mission-critical keyword. This includes a summary list of competitive web sites by keyword and a count of how many times each competitive site appears at a given position on a search engine result page.
- Competition by LOB (Line of Business) or Subject Matter – A top-line review of how the subject web site compares against its defined (supplied by Client or Agency) competitive set. This will include a quick 30,000 foot site analysis / overview of the competitive web sites.
Content Analysis
With all major search engines focusing heavily on the quality of the content a website provides, doing a content analysis is an absolute imperative. There are two major aspects to a content analysis: making sure the content is authoritative and focused, and making sure the content aligns with a User’s search intent.
Authority, as defined by Google, is not just about your domain’s reputation. It’s also about the reputation of the person or persons who wrote the content: who they are, the sources they cite, and what the community at large thinks of them all contribute to authority. So does the authority of any websites linking to your content. Reputation matters.
Something else that we look at as part of the Content Analysis is Search Intent. Does the content on your site align with what a User desires? Users looking to buy a product or service won’t be engaged with your content if it’s purely informational, and content that is written to facilitate a transaction won’t get a lot of visibility on a search engine if the User is looking for information.
Traffic Analysis
The traffic analysis is a review of where your traffic is coming from, and what visitors do after they arrive at your site. The review begins with looking at the data in your Google Analytics, Google Search Console and Bing Webmaster Tools account(s). We’ll also look at the data available from other sources like Majestic, SEMRush, AuthorityLabs, SpyFu, MOZ and others.
By looking at navigation patterns, bounce rates, number of visits vs. pages viewed per visit, and time on the site, the traffic analysis (along with the keyword research) helps us understand how the web site’s users are finding their experience. It’s also useful in planning what types of content should be deployed to the site in the future to secure more traffic or improve rankings.
Note that not all projects will require a traffic analysis. Clients launching a new web site won’t have traffic to analyze, and in those situations, we do a deeper dive into some of the other parts of the audit.
Crawlability Analysis
Your website cannot be completely indexed if it cannot be crawled, so part of our audit examines the crawlability of the website. To do this, we rely on tools like Google Search Console, Screaming Frog and JetOctopus. We compare the crawl data to information gathered from other sources (like the server log files, Majestic, and Google Analytics) to understand if there are gaps and where they occur.
For example, suppose the sitemap.xml file shows that there are 600 HTML files on a web site, but the crawler, starting at the home page, is only able to crawl 500 pages. That’s a gap of 100 pages. Why were those pages not reachable? Were they intentionally not linked to other pages, or are they linked but a technical issue is preventing the crawler from discovering them?
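That gap check is a simple set comparison once you have both URL lists. A minimal sketch, assuming we’ve already parsed the sitemap and exported the crawler’s results (all URLs below are hypothetical):

```python
# Compare the URLs declared in sitemap.xml with the URLs a crawler
# actually reached, and report the gap. Both sets are stand-ins for
# real exports (e.g. from Screaming Frog); the URLs are hypothetical.

sitemap_urls = {
    "https://example.com/",
    "https://example.com/products/",
    "https://example.com/products/widget-a",
    "https://example.com/products/widget-b",
    "https://example.com/about/",
}

crawled_urls = {
    "https://example.com/",
    "https://example.com/products/",
    "https://example.com/products/widget-a",
    "https://example.com/about/",
}

orphaned = sitemap_urls - crawled_urls    # in the sitemap, never reached by the crawl
undeclared = crawled_urls - sitemap_urls  # crawlable but missing from the sitemap

print(f"{len(orphaned)} page(s) in the sitemap were not reachable by crawling:")
for url in sorted(orphaned):
    print("  ", url)
```

Each URL in the `orphaned` set then gets investigated by hand: intentional, or a broken link path?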
Inbound Links Analysis
A web site that has followed SEO Best Practices for on-site natural search optimization may still not be rewarded with a higher Google PageRank® or a better position on the search engine result pages if the site hasn’t paid attention to its inbound link strategy. The rationale for this is simple: search engines want to present only the most relevant results to a user’s search query, and if other web sites don’t think your site is relevant enough to link to, why should the search engines think the site is relevant?
Despite the changes in weight associated with some of Google’s algorithmic updates, inbound links to a site still carry a lot of weight with Google. It’s for this reason that a detailed analysis of the inbound links is critical to the success of any web site. Our analysis covers the source of the link, the trust and citation flow of the linking web site, the page the link is coming from, the anchor text of the inbound link and the page(s) the external web site is/are linking to.
Baseline Report
Every web site optimization effort needs a starting point, and our projects start with a baseline report designed to answer three questions: What did you do right? What did you do wrong? And where are the opportunities for improvement?
Using some specialized SEO reporting software, the baseline report provides a snapshot in time showing the current web site rankings across the two major search engines (Google, Bing) for each keyword being tracked and the page(s) that rank for that keyword. It also establishes the benchmark data for traffic from all sources and the number and source of all inbound links, against which future monthly reports will be measured. (Note: We don’t track ranking results for Yahoo unless specifically asked by the Client, since their search results are powered by Bing.)
Timing: Approximately 2 to 4 weeks from receipt of SEO Client Questionnaire
Phase Two: Strategy, Detailed Recommendations, Execution and QA
The execution phase is where the top-level insights made during the Phase One Site Audit are fleshed out into full-blown strategies and implemented.
At a minimum, what you’ll get:
- TITLE tags, META Description Strategy
- Design Recommendations
- Content Development Strategy
- Navigation & Internal Site Link Strategy
- Image Optimization Strategy
- Canonical Tags Strategy
- Site Architecture Recommendations
- URL Rewrite / Redirect Recommendations
- XML Site Map Review and Submission
- Other tbd
Not every client wants to have someone else mucking around in their content management systems, so we tend to be pretty flexible here about roles and responsibilities. Sometimes we simply advise the Client; other times we jump in and make the changes ourselves. It’s up to you.
Timing: Depends on the extent of the changes and approvals required, but typically less than 4 weeks.
Phase Three: Link Development
Up until recently, the number of links a web site had was like the number of votes received in an election: all other things being equal, the web site that had the most links (votes) would be considered more authoritative and would rank higher on the SERPs (win the election).
That changed when Google released Penguin. Penguin looks at not just the number of links (and the anchor text of those links) but also the quality and relevancy of the sites the links came from. But that left webmasters with a problem: since not all web sites have the same value, what do you do with low-quality links coming into the web site that you can’t control? Google gave webmasters the answer when they released their “Link Disavow” tool. The Disavow tool allows webmasters to tell Google which inbound links should be ignored when determining the site’s authority.
So, where Link Development used to be all about acquiring links, now it’s about building a balanced link profile. To do that, we focus on three key areas:
Link Building is about gaining new links. It is a time-consuming job, but in the end it’s worth the effort because nothing says “authority” to Google like links coming in to your web site from other relevant and authoritative web sites. Since Google’s Penguin updates, getting a lot of links all at one time – unless they occur organically (e.g. because you did a press release) – is a sure signal to Google that the links didn’t come as a result of good content but rather were bought or bartered. If Google detects that your web site employs this tactic, they can and will penalize the entire site, just as they famously did to JCPenney.
Our approach to Link Building is two-fold. First, we do some upfront research to find web sites that are relevant to your market and are themselves considered authoritative. Second, once we’ve identified those sites, we approach them about a link. Since you never know what’s going to resonate with someone, we try several different approaches. Sometimes, all you have to do is ask. This works best in a situation where someone already has content that mentions the Brand. Sometimes, we have to offer something in return for the link, like an original piece of content or an infographic, or a sneak peek at an upcoming product release that they can review. This last one has the added benefit of bolstering your social media efforts if they tweet about it.
The main thing is not to try to do too much too fast.
Link Shedding is about getting rid of the links you don’t want Google to use when developing your relevancy score. All links pass “link juice” from the site where they originate to your site, but links that originate from low quality sites will actually pass negative link juice that takes authority away from your web site.
To figure out which links to disavow, we go through all of the inbound links. Links that contain the “nofollow” modifier don’t count, since that tells Google not to consider them anyway. What we look for are links that come from web sites with a low Domain Authority score, a low mozRank and a lot of outbound links. Once we’ve identified the links we want to remove, we make a list and upload it into Google’s Link Disavow tool or provide it to your webmaster so he/she can upload it.
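Once the links are chosen, the disavow list itself is just a plain UTF-8 text file in the format Google’s tool expects: one full URL or one `domain:` entry per line, with `#` marking comment lines. The domains below are invented examples:

```text
# Disavow file for upload to Google's Link Disavow tool.
# Lines beginning with "#" are comments and are ignored.

# Disavow a single spammy page that links to us:
http://spam-directory.example/links/page7.html

# Disavow every link from an entire low-quality domain:
domain:low-quality-links.example
domain:paid-link-farm.example
```

Using the `domain:` form is usually safer for true link farms, since it covers any page on that domain that links to you now or later.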
Link Maintenance is about keeping all the good stuff. Our reporting software alerts us when an active, high relevancy, high authority link changes to “nofollow” or becomes inactive. This gives our staff the opportunity to contact the webmaster to fix the problem. The software also alerts us to changes in anchor text.
Another part of Link Maintenance is looking for opportunities to improve existing links. Our software shows a thumbnail of the page the inbound link is coming from as well as the page it’s linking to. It also shows whether the link uses an image or has anchor text. If the link uses an image, we can contact the webmaster at the site to see about getting a text link. And if the link uses anchor text, we can often contact the webmaster to see about getting the existing anchor text replaced with other text that may be more beneficial.
Phase Four: Monthly Services
Monthly Marketing Reports
Once the baseline is established, additional ranking reports are run on a monthly basis. Monthly reports show movement – up and down – for each keyword in the report. They also show when keywords (and the pages that ranked for them) drop below the top 100 search engine results, and when new keywords and the pages that rank for them enter the results. This provides a useful measure of success and helps to guide ongoing refinements to the SEO strategy.
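The movement logic those reports encode is straightforward. A minimal sketch, using hypothetical keywords and positions (absence from a month’s data means the keyword wasn’t in the top 100 that month):

```python
# Month-over-month ranking movement, as described above.
# Positions and keywords are hypothetical sample data.

last_month = {"blue widgets": 74, "widget repair": 9, "old widget": 98}
this_month = {"blue widgets": 41, "widget repair": 11, "new widget": 66}

for kw in sorted(set(last_month) | set(this_month)):
    before, after = last_month.get(kw), this_month.get(kw)
    if before is None:
        print(f"{kw}: entered the top 100 at #{after}")
    elif after is None:
        print(f"{kw}: dropped out of the top 100 (was #{before})")
    else:
        delta = before - after  # positive = moved up the rankings
        print(f"{kw}: #{before} -> #{after} ({delta:+d})")
```

The real reports add the ranking page URLs and search-engine breakdowns, but the up/down/entered/dropped classification is the core of it.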
Analysis and Recommendations
All those monthly marketing reports won’t be any good without some analysis, so as part of the ongoing deliverables, we’ll be reviewing the reports and making suggestions / recommendations about adjustments to the keyword strategy and/or on page copy as may be appropriate.