
SEO Audit

The purpose of an SEO Audit is to paint an overall picture of what you’re doing right, what needs to be improved, and what issues may be hurting your rankings. If you’re familiar with SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats), that’s exactly how you should approach the SEO Audit.

Start by signing up for a Webmaster Account with each of the major Search Engines. These are excellent tools to test your SEO weaknesses.

Google Webmaster Tools (free)

  1. Go to Google Webmaster Tools
  2. Create a Google account if you don’t already have one
  3. Verify your site
  4. Set your correct geographical target
  5. Set your preferred domain
  6. Enable enhanced image search (it may drive some traffic, though it’s unlikely to be significant)
  7. Create and upload XML sitemaps (a minimal example follows this list)
  8. Identify issues
  9. Look at your external links
  10. Look at the GoogleBot crawl rate
  11. Look at the search phrases
  12. Be alerted to duplicate titles
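
For step 7, here’s a minimal sketch of an XML sitemap containing a single URL; the URL and values below are placeholders, not recommendations:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2009-06-01</lastmod>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
  </urlset>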

Bing Webmaster Tools (free)

Sign up and authenticate your site. You’ll need to sign up for a Windows Live ID if you don’t already have one.

Yahoo Site Explorer (free)

Similar: sign up and authenticate your site. You’ll need to sign up for a Yahoo! account if you don’t already have one.

Usability Review

1. Evaluate the visual design. Navigate your site through the eyes of a visitor. Is the site aesthetically appealing? What type of user experience does your site offer? Is there a logical hierarchy between pages and subpages? Is there a comfortable flow between content areas? Does your site look like it was made in 1999 or 2009?

If the visual design is driving visitors away, then no amount of SEO efforts will help increase traffic and conversions. If your web site is in rough shape or built entirely in Flash, now is the best time to redesign it.

2. Check browser compatibility. When designing and optimizing a site, it’s important to see how your site renders in operating systems (e.g. Windows 7 or Mac OS X) and browsers (e.g. Firefox, Safari, or IE7) other than your own. Browser compatibility has a huge impact on usability. I suggest using Browsershots and/or NetRenderer.

3. Custom 404 page. Generally if a user reaches a 404 page (“page not found”) they’ll bounce off your site. Improve your visitor retention by customizing your 404 page. At the very least, it should include an apology, a prominent search box, a link to your sitemap, a link to your homepage, and your standard navigation bar. Add in a funny picture and some self-deprecating humor and you may just keep that visitor.
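
If you’re on an Apache server (an assumption; other servers have equivalents), routing “not found” errors to your custom page takes one line in .htaccess. The file path here is hypothetical:

  # Serve the custom error page for any 404
  ErrorDocument 404 /404.html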

Accessibility / Spider-ability Review

1. Look through the eyes of a SE spider. Run your site through the YellowPipe Lynx Viewer to see what your site looks like from a search engine spider’s perspective.

2. Turn off Flash, Javascript, and cookies on your browser and see if you can still navigate your site.

3. Look at the Robots.txt file. If you’re using a Robots.txt file, make sure you aren’t excluding important sections of content. A misconfigured Robots.txt file can prevent whole sections of your site from being indexed. Use the Robots.txt checker in Google’s Webmaster Tools.
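
As a hypothetical illustration of how easy this is to get wrong, compare these two Robots.txt variants (the paths are made up):

  # Bad: blocks the entire site from all spiders
  User-agent: *
  Disallow: /

  # Good: blocks only the sections you actually want hidden
  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /cart/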

4. Navigation location. Your primary navigation should be located at the top or top-left of your site and appear consistently at the same place on every page. Make sure your navigation links to all major content areas of your site and includes a link back to your homepage.

5. Javascript in navigation = Bad! When Javascript is embedded inside of navigation elements (e.g. to create a drop down menu) it renders the links in the menu invisible to search engine spiders. This is also a cross-browser compatibility issue, and your fancy drop down menu will likely break on some browsers. Don’t use any Javascript or Flash navigation, but if for some reason you need to (really?) then be sure to have the same menu elements appear in an HTML-only navigation bar in your footer.
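
A minimal sketch of such an HTML-only footer fallback, with hypothetical pages:

  <!-- Plain links that spiders can follow even when the scripted menu is invisible to them -->
  <ul id="footer-nav">
    <li><a href="/">Home</a></li>
    <li><a href="/products/">Products</a></li>
    <li><a href="/about/">About Us</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>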

6. Breadcrumbs. Use ‘em. Not only do breadcrumbs show visitors where in the site hierarchy the current page is located and provide a shortcut to other pages in the hierarchy, but optimizing your Breadcrumbs with keywords will help your SEO efforts.
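
A keyword-optimized breadcrumb trail might be marked up like this (the pages and keywords are hypothetical):

  <!-- Each level links back up the site hierarchy -->
  <p class="breadcrumb">
    <a href="/">Home</a> &gt; <a href="/bedding/">Organic Bedding</a> &gt; Organic Cotton Sheets
  </p>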

7. Frames. Don’t use them.

8. Splash page. Don’t use one.

9. Flash. Flash is ok in small doses, but if the site is built entirely in Flash then just give up now; don’t even try to optimize the site. You won’t be able to rank competitively until you do a top-down redesign.

10. Clicks-from-homepage. Every page on the site should be accessible from the homepage within 3 or 4 clicks, or it becomes harder for visitors to find and risks being ignored by Spiders.

11. Restricted access. Are there any pages on the site that (1) require a login or are accessible only via (2) a search box or (3) a select form + submit button? These pages will not be indexed by Spiders. That’s not a problem unless there’s restricted content that you want the spiders to index.

12. Broken Links. Use a tool such as the W3C Link Checker to check for broken links. Pages reachable only through broken links may be inaccessible to SE spiders, and search engines may penalize sites with an abundance of broken links.

Google Health Check

1. Site: command search. Do a Google search for “site:www.example.com” to see how many of your pages have been indexed by Google. This data is notoriously inaccurate, so don’t expect it to match up perfectly, but consider it a red flag if you know you have 2,000 pages of content and Google says you have 27. Record the number of indexed pages.

2. Do a brand search. Google your company / website name. Even if you aren’t ranking well for keywords yet, you should rank #1 for your domain / brand.

URL Review

1. URLs should be succinct, descriptive, contain relevant keywords, and leave out non-essential words. For example, Matt Cutts (the head of Google’s WebSpam team) used the URL slug “best-iphone-application” for a post titled “What are the best iPhone applications?”

2. Drop session IDs from URLs. They’re ugly, they create duplicate content issues, and they pose security risks.

3. No more than 2 or 3 query parameters (e.g. “?” or “=” or “&”).

Bad = http://www.mysite.com/brands.p...
Good = http://www.mysite.com/brands.p...

4. URL depth. While the number of slashes in your URLs shouldn’t have a negative effect on your SEO ranking, best practice is to keep URLs as shallow as possible.

Bad = http://www.mysite.com/people/p...
Good = http://www.mysite.com/people/n...

5. Use hyphens in page names, not underscores.

Bad = www.thedailyanchor.com/this_is...
Good = www.thedailyanchor.com/this-is-my-post/

Redirect Review

1. 302 Redirects. Do a server header check using the W3C Link Checker again or the Live HTTP Headers add-on for Firefox and look for any 302 redirects. If you find any, replace them with 301 redirects. 302 redirects won’t transfer link juice (PageRank) to the destination URL.
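
Assuming an Apache server with mod_alias (other servers have equivalents), swapping a 302 for a 301 can be a one-line .htaccess change; the paths here are hypothetical:

  # Permanent (301) redirect instead of a temporary (302) one
  Redirect 301 /old-page.html http://www.site.com/new-page.html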

2. Canonical Homepage URL issues. The URL of your homepage might be displayed in any number of variations, which is fine for visitors but not for SE spiders; they’ll count each variation as a separate page. Part of the search engine ranking equation is calculating how many external links a given page has. If you have even two variations of your homepage (e.g. http://site.com and http://www.site.com) then Google will count your homepage as two separate pages. Thus, if you have 1,000 inbound links to http://site.com and 2,000 links to http://www.site.com, Google will count those separately (effectively watering down your link juice) when you really want them to count as 3,000 links to one page. What to do? Consolidate all variations of the homepage into one URL.

a.) Decide whether you want a www or non-www URL and then redirect the loser to your preferred choice. Thus, if you choose “http://www.site.com/” then create a 301 redirect to that URL from “http://site.com/”.
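
On an Apache server with mod_rewrite enabled (an assumption), the non-www-to-www redirect is typically a rule along these lines:

  # 301 any non-www request to the www version of the same URL
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
  RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]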

b.) Remove default file names at the end of your URLs. Below are some examples. You should create a 301 redirect to your preferred URL for any variation (a sample rewrite rule follows this list).

  • http://www.site.com/index.html
  • http://www.site.com/index.htm
  • http://www.site.com/index.php
  • http://www.site.com/default.html
  • http://www.site.com/default.php
  • http://www.site.com/home
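
Continuing the Apache/mod_rewrite assumption, a sketch of a rule that 301s default index filenames back to the bare directory URL:

  # Strip index.html / index.htm / index.php from requested URLs
  RewriteEngine On
  RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(.*)index\.(html?|php)\ HTTP
  RewriteRule ^(.*)index\.(html?|php)$ http://www.site.com/$1 [R=301,L]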

c.) Consistently use your preferred URL in every internal link on your site. If you choose the www version, then all links should build off of http://www.site.com/

d.) Go to Google Webmaster Tools > Site Configuration > Settings and set your preferred domain.

Title Tag Review

1. DMOZ hijacking. If you’re listed in DMOZ, be sure to use the NOODP meta tag (<meta name="robots" content="noodp">) to prevent search engines from displaying DMOZ titles and descriptions for your pages.

2. Yahoo! hijacking. If you’re listed in the Yahoo! Directory, use the NOYDIR meta tag (<meta name="robots" content="noydir">) to prevent Yahoo! from displaying directory titles and descriptions for your pages.

3. Look for missing/duplicate/long/short Title Tags by going to Google Webmaster Tools > Diagnostics > HTML Suggestions.

4. Run your site through the Yahoo! Site Explorer and do a Google Site: command search (search for “site:www.example.com”) and make sure Title Tags are:

a) Unique. Does every page have a unique Title Tag? Make sure Title Tags aren’t missing or duplicated; 3.5 million web pages indexed by Google have no title, and 80,000 have a title that says “insert title here.”

b) Descriptive. Title Tags should be descriptive and incorporate relevant keywords & the brand name (e.g. “Organic Cotton Sheets | Example Brand” or “keyword > category | brand”)

c) Relevant. Title Tags should reflect the page content and the uppermost page heading (H1).

d) Succinct. Title Tags should be less than 65 characters long.
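
Putting (a) through (d) together, a hypothetical product page might use:

  <!-- In the HEAD: the title mirrors the page's H1 and leads with the keyword -->
  <title>Organic Cotton Sheets | Example Brand</title>

  <!-- In the BODY: -->
  <h1>Organic Cotton Sheets</h1>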

Meta Tag Review

1. Make sure the meta robots tag isn’t blocking the whole site. JaneAndRobot offers a great guide to managing robots’ access to your website.

2. Go to Google Webmaster Tools > Diagnostics > HTML Suggestions and look for duplicate/long/short Meta Descriptions.

3. Run your site through the Yahoo! Site Explorer and do a Google Site: command search (search for “site:www.example.com”) and make sure Meta Tags are:

a) Unique. Every page should have a unique Meta Description.

b) Descriptive. The Meta Description for each page should describe the page content succinctly and accurately.

c) Actionable. The Meta Description should be written as an “advertisement” for each page, encouraging users to click on the link in search results. Write with sizzle, but be honest, accurate, and descriptive. Don’t bait and switch!

d) Optimized. The Meta Description and Meta Keywords should incorporate targeted keywords that indicate the page’s content to visitors.

e) Succinct. Meta Descriptions should be less than 155 characters with spaces and Meta Keywords should be less than 250 characters with spaces.
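
Putting (a) through (e) together, hypothetical meta tags for the same product page might read:

  <!-- A short, honest "advertisement" for the page, under 155 characters -->
  <meta name="description" content="Soft, durable organic cotton sheets in twin, queen, and king sizes. Free shipping on orders over $50.">
  <meta name="keywords" content="organic cotton sheets, organic bedding">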

Content Review

1. Quality content. Does the website contain high-quality, unique content? Unless your site’s content is better than your competitors’, you’re always going to be fighting an uphill battle. As I said in the SEO introduction article, you should only ever rank as high as you deserve to rank.

2. Human-focused. Content should be written for humans and not for search engines.

3. Heading Tags. Every page should contain an H1 tag that is optimized for relevant keywords.

4. Keyword optimized. Each page should be optimized for a single keyword, with that keyword used 4-5 times per 500 words.

5. Internal links. Pages should include links to other pages on the site using keyword-optimized anchor text.

6. Enough Content. Every page should contain 200+ words of HTML text.

Duplicate Content Review

If you have thousands or even hundreds of products on your site, you probably have at least some repetitive copy. Flag it.

1. Search Google for a string of content (a full sentence) in quotes… the search should return just one page on your site, not multiple.

2. Check the amount of content on each page. Pages with images but little content are seen as low-quality by the search engines, and may even be seen as duplicates of one another.

3. Use a tool like Copyscape to see if other websites are stealing your content.

4. Clean up duplicate URLs with the Canonical Link Element. Google, Yahoo, and Microsoft support a link element to clean up duplicate URLs.

Let’s say you have a page about Organic Cotton Sheets that is accessible by multiple URLs (e.g. http://www.example.com/product... and http://www.example.com/products/b335/?cm_src=rel). You can specify the canonical URL in the HEAD part of the document with a link element like the following (the href shown is illustrative):

  <link rel="canonical" href="http://example.com/organic-cotton-sheets" />

That tells search engines that the preferred location of this URL (the “canonical” location) is http://example.com/organic-cot... instead of http://www.example.com/product... or http://www.example.com/product...

Image Review

1. Alt attributes should be used on all images.

2. Image filenames should be short but descriptive and use targeted keywords. Don’t use long filenames or practice keyword stuffing.

Good = “organic-cotton-throw.jpg” or “spiked-eggnog-recipes.jpg”
Bad = “image27.jpg” or “eggnog.jpg”
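
Combining both points, a hypothetical image tag:

  <!-- Descriptive keyword filename plus a concise alt attribute -->
  <img src="/images/organic-cotton-throw.jpg" alt="Organic cotton throw blanket" width="300" height="200">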

3. Image files should be stored in one directory (e.g. “http://www.example.com/images/...”)

Link Review

1. # of Links. The general rule of thumb is that there should be fewer than 100 links on a page, mainly to 1) keep from souring the user experience and 2) keep PageRank from being divided among so many links that each one carries only a minuscule amount of link juice. That said, if you have a high-PageRank page, Google may spider up to 200 or 300 links.

2. Nofollow. Use the nofollow attribute ONLY when linking to sites you don’t necessarily trust (e.g. links in user comments) or for non-essential pages that wouldn’t be helpful to include in search results (e.g. “add-to-cart” links and the shopping cart page). That said, Matt Cutts recently said, “given the way that Google works since this [PageRank sculpting] change, I would let PageRank flow even to your privacy and terms-of-service type pages. Even those sorts of pages can be useful for more searches than you would expect.”
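
The attribute itself is just rel="nofollow" on the anchor; the add-to-cart URL here is hypothetical:

  <!-- Tells spiders not to flow PageRank through this link -->
  <a href="/cart/add?sku=b335" rel="nofollow">Add to cart</a>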

3. Paid Links. Don’t buy or sell paid links that flow PageRank in an attempt to game Google’s search results. Google is okay with some paid links (see the quote below), but be sure to use the nofollow attribute and clearly identify those links as paid/sponsored.

If you want to sell a link, you should at least provide machine-readable disclosure for paid links by making your link in a way that doesn’t affect search engines… For example, you could make a paid link go through a redirect where the redirect url is robot’ed out using robots.txt. You could also use the rel=nofollow attribute… The other best practice I’d advise is to provide human readable disclosure that a link/review/article is paid. You could put a badge on your site to disclose that some links, posts, or reviews are paid, but including the disclosure on a per-post level would be better. Even something as simple as “This is a paid review” fulfills the human-readable aspect of disclosing a paid article.

– Matt Cutts, the head of Google’s WebSpam team.

4. Anchor text. All internal and external links should make good use of anchor text (the clickable text in a link, placed within the anchor tag). Anchor text helps users and search engines understand what the page you’re linking to is about, and should be short, descriptive, and on-topic.

Bad = “click here” or “read more”
Good = “The previous article in this series gave an introduction to SEO.”

5. Link formatting. All links in body copy should be underlined and a different color than the body text to help users clearly identify links. Best practice is to use blue underlined text for hyperlinks that turns purple once visited.
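
A minimal CSS sketch of that convention, using the traditional browser default colors:

  /* Blue underlined links that turn purple once visited */
  a:link    { color: #0000EE; text-decoration: underline; }
  a:visited { color: #551A8B; }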

Inbound Link Review

1. Measure the number of inbound links to the site using Yahoo! Site Explorer, the SEO for Firefox add-on, or Backlink Watch.

2. How many links are from .Edu sites?

3. How many links are from .Gov sites?

4. Directories. Has the site been submitted to directories such as DMOZ (free), Yahoo! Directory ($300/yr), or BOTW ($100/yr) to gain high-quality backlinks?

5. Inbound anchor text. Just as links on your site should use optimized anchor text, links to your site should use optimized anchor text. If the majority of links to your site use inadequate anchor text (e.g. “click here”) then flag this for later. During the link building stage, as part of your relationship-building efforts, you should reach out to webmasters of sites that link to you and ask them to use certain anchor text for those links.

6. Is there a natural distribution of links across the site, or do 50% of the links point to one page?

7. Is the site receiving any links from Free For All (FFA) sites or link farms? End the practice now; your site could (and should) be banned for this. Also be careful about link swapping…

8. Avoid excessive reciprocal links.

Geo-Location Review

1. If you want to appear in Local Search results or Google Maps, then make sure a local address and phone number appear on every page.

2. The top-level domain (TLD) should reflect the primary country you’re targeting (e.g. “.com” for the US, “.co.uk” for the UK).

3. The IP address of your web server should also reflect the geolocation you’re targeting.

Semantic HTML Review

1. Page file size should be less than 150kb before images, CSS, and other attachments.

2. Page load time should be less than 10 seconds on DSL. While you won’t be penalized for a load time of 11-16 seconds, long page load times provide for poor user experience and may prevent spiders from crawling your site. You can test page load time using URI Valet.

3. Check for poorly formatted code. You don’t have to have 100% valid code, but eliminate as many errors as possible to maximize user experience and crawlability.

a) Check the validation of your xHTML markup using the W3C Markup Validation Service.

b) Check your CSS using the W3C CSS Validator.

Back-End Review

1. Determine when the site underwent its last major redesign. Significant changes to navigation, content, and URLs can have an impact on SEO.

2. Evaluate the Content Management System. Is it up-to-date? Easy-to-use? Is it going to give your in-house team or SEO consultant a mental breakdown 4 weeks into the project?

3. Evaluate your host and server. Are you on a shared server, VPS, or dedicated server? If you’re going to increase the traffic to your site, can your server handle it? Slow servers not only drive visitors away, but also impact indexing of your content by search engine spiders. If your servers are slow, the spiders will throttle their crawling to keep from overloading your servers.

4. Domain age. When was the domain first registered?

5. Domain expiration date. Check with your registrar to see how long until your domain registration expires. Best practice is to keep your domain registered 5+ years out. If your domain registration expires within the next 3 years, renew it now.

Establish Benchmarks

Create an SEO Spreadsheet and include the following information:

1. Inbound links according to each major search engine

2. Indexed pages on each major search engine

3. Visitor information compiled from your Web Analytics software

4. The PageRank for your homepage and most popular pages. I recommend using Firefox and installing the following plugins: SearchStatus, SEO for Firefox, and RankChecker.

5. If you publish a blog, burn your RSS feed through Feedburner and keep track of your subscribers and reach.

6. How your site is currently performing in the SERPs.

  • Google Local
  • Google Main
  • Google Webmaster Tools
  • Bing Maps
  • Bing Local
  • Bing Webmaster Central
  • Yahoo Local
  • Yahoo Main
  • Yahoo Site Explorer

Sign up for Web Analytics Tools

1. Sign up for Google Analytics (it’s free). Let it run for 2 weeks before executing any SEO. You’ll need to establish a baseline of your current traffic in order to accurately measure your success later on.

Note: While Google Analytics is a great tool for user analysis, it has some serious deficiencies as an SEO tool: 1) it uses page-tagging technology that’s only capable of recording visits by browsers using Javascript, and thus doesn’t record visits by search engine spiders, and 2) GA doesn’t allow you to view or upload log files, which include much more granular information. Solution? Invest in a log file analyzer such as Sawmill ($99+) or WebLog Expert ($75+).

2. You should also consider using other free analytics programs; it’s best not to rely on a lone data source. You can read this article on some of the best free web analytics tools, but looking at Piwik or Woopra would be a good place to start.

3. Now is also a good time to decide whether or not you have the budget to pay for a subscription-based web analytics or SEO service, such as Omniture SiteCatalyst, Mint, VisiStat ($30/mo or $270/yr), or WebTrends Analytics 9. If you know you’re going to make a purchase, then do it now so that you can establish at least 2 weeks of benchmark data.

4. Also, if you have the budget to do so, then I highly recommend you sign up for an SEOmoz Pro Account ($80-230/mo) to gain access to some incredible content and SEO tools.

5. Once your analytics software has been running for 2 weeks, answer the following questions:

  • How many visitors are you getting on a daily/weekly/monthly basis?
  • Which search engines drive the most traffic?
  • What other sites refer the most traffic?
  • What keywords are you already ranking well for?
  • What pages are the most popular?
  • Least popular?
  • What’s your bounce rate?
  • What’s the average time on page?
  • What are the average pageviews?
  • Set up Goals on Google Analytics. What’s your conversion rate?
  • What’s your average transaction amount?

Establish Objectives

1. Evaluate how increased traffic could affect your business. If you have an e-commerce site, your SEO efforts should prioritize keywords that will bring you traffic for 1) products with the highest profit margin, 2) new products, and 3) underperforming products. We’ll dig deeper into this in Step 3: Keyword Research + Selection, but it’s important to start thinking about it now. If you have a traffic/destination site, how many additional visitors would it take for you to generate more revenue on advertising?

2. Set short-term and long-term goals. It’s not enough to execute SEO; you need to establish goals that are Specific, Measurable, Attainable, Realistic, and Timely (S.M.A.R.T.). Do you want to increase traffic? Sell more of a certain product? Be more popular? Increase newsletter subscriptions? Increase your RSS feed subscribers? Generate more revenue on advertising? A good S.M.A.R.T. goal for SEO would be to “generate __% more visitors in the next __ months in order to generate __ more leads per week and increase sales by $__.”

3. Establish a budget. Are you going to perform SEO in-house or hire an SEO consultant? (Shameless self-promotion: email me.) If you’re doing it in-house, then you’ll want to consider purchasing subscriptions to web analytics tools and keyword research and discovery tools. There are some free services available, but like all things in life, you get what you pay for. You don’t have to sign up now, but it’s important to establish a budget so you know what you have to work with when the time comes.
