By John Thyfault, Vice President of Search & Social Strategy, Beasley Direct Marketing


Technical SEO Audit Overview

A technical SEO audit reviews three main aspects of a site: architecture, technology and source code. When all three are optimized for search, the content on your site can stand out from your competition. This post discusses how to do a technical SEO audit of your site; we'll look at how to do a content review in our next post.

With the burgeoning importance of Internet marketing, the Marketing Department and IT find themselves increasingly thrown together to achieve their goals. This is particularly so in search engine optimization (SEO). Optimizing sites for search engine discovery requires that Marketing understand some of the boiler-room workings of Internet sites. Merely understanding how to use keywords isn't enough anymore. Performing a technical site review requires the participation of both Marketing and IT in a true partnership. Marketers need to understand the elements and tools used to make websites sing. They need to know what can make a site hard to crawl, resulting in a loss of search engine traffic.

The review should examine the site as others see it. Keep two key audiences in mind: first, search engines and directories ("spiders"); second, people (prospects, customers, vendors, partners, shareholders, etc.). The purpose is to uncover anything that might block search engines and cause Google to reduce its evaluation of the site's value, or that prevents site users from doing what you want them to do on your site.

Architecture

The site needs a consistent layout and structure that emphasizes the content where you want visitors to focus. Where are the logical "buckets" of content? Sometimes it appears logical to have content in more than one place on the site. For example, if you are selling motorcycles and motorcycle accessories, it may make sense to put the information about Harley-Davidson T-shirts in the Harley-Davidson section of your site, then repeat it in your apparel section. However, Google doesn't want to see the same content repeated in different areas of your site and will ding you for it.

Once the content is organized, you need to look at directory names. Every directory name can be built from keywords, such as Motorcycle Helmets > Full-Face Helmets > Bell Helmets. Don't let these descriptors get too long; you need them to communicate crisply. Longer descriptors may confuse or mislead, and usability is the guiding principle. The longer people spend on your site, the greater the value assigned by Google.
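To keep directory names consistent and keyword-rich, many sites generate them from category names with a simple "slugify" routine. Here is a minimal Python sketch; the hyphenation rules are illustrative, not a standard:

```python
import re

def slugify(name: str) -> str:
    """Lower-case a category name and join words with hyphens,
    dropping any character that is not a letter, digit, space or hyphen."""
    name = re.sub(r"[^a-z0-9\s-]", "", name.lower())
    return re.sub(r"[\s-]+", "-", name).strip("-")

def directory_path(*categories: str) -> str:
    """Build a keyword-rich URL path from a category trail."""
    return "/" + "/".join(slugify(c) for c in categories) + "/"

path = directory_path("Motorcycle Helmets", "Full-Face Helmets", "Bell Helmets")
print(path)  # /motorcycle-helmets/full-face-helmets/bell-helmets/
```

The result is a path whose every segment carries the keywords a visitor (and a spider) would expect to see.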

There are tools you can use to crawl the site and look at it the way Google sees it. One excellent tool is Screaming Frog, which has free downloadable versions for Windows, macOS and Ubuntu. The Screaming Frog SEO Spider is a small desktop program you can install on your PC or Mac. It crawls websites' links, images, CSS, scripts and apps from an SEO perspective. Screaming Frog will tell you where it has crawled and the pages it has found, and it reports 404s (missing pages), redirects, broken links, duplicate pages and more. Moz.com is another great tool. These programs will give you a good top-level understanding of architectural issues, including how pages relate to one another (or don't).
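Under the hood, crawlers like Screaming Frog do something like the following: fetch a page, collect its links, queue them, and flag any that lead nowhere. This Python sketch crawls an in-memory toy "site" instead of live URLs, but the logic is the same:

```python
from html.parser import HTMLParser

# A toy "site": page path -> HTML body. Stands in for the live pages
# a real crawler would fetch over HTTP.
SITE = {
    "/": '<a href="/helmets">Helmets</a> <a href="/apparel">Apparel</a>',
    "/helmets": '<a href="/">Home</a> <a href="/gloves">Gloves</a>',
    "/apparel": '<a href="/">Home</a>',
}

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(site, start="/"):
    """Breadth-first crawl; return (pages reached, broken links found)."""
    seen, queue, broken = set(), [start], []
    while queue:
        path = queue.pop(0)
        if path in seen:
            continue
        seen.add(path)
        if path not in site:  # a real crawler would record a 404 here
            broken.append(path)
            continue
        parser = LinkCollector()
        parser.feed(site[path])
        queue.extend(parser.links)
    return seen, broken

pages, broken = crawl(SITE)
print(sorted(broken))  # ['/gloves'] -- the link to /gloves is dead
```

The tools add much more (redirect chains, duplicate detection, exports), but this is the core loop they run.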

Technology

There are hundreds of content management systems available, from WordPress to Joomla! Most of these systems incorporate SEO best practices, but each has a default set of built-in options that can help or hurt SEO, so these need to be carefully examined before implementing on your site. You particularly want to be able to view and modify page titles and meta descriptions. For example, the generic installation of WordPress, one of the most popular content management and publishing systems, does not allow this. But a plug-in from Yoast lets you give each WordPress page a unique title and meta description.

META Data

Title tags are good drivers of SEO rank if they are composed well. Title tags get picked up immediately by crawlers and help the search engine rank your page against others. In addition, a well-written meta description that is concise and contains a benefit statement will help increase the click-through rate when your site comes up in the search results. A standard but not very useful meta description might say, "Full-face motorcycle helmet." An effective one might say, "Full-face motorcycle helmet meets highest safety standard." Do research around your content and make sure the metadata relates to it closely.
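A quick way to audit titles and meta descriptions at scale is a small script. The sketch below uses Python's standard html.parser; the 60- and 160-character limits are commonly cited display guidelines, not rules published by the search engines:

```python
from html.parser import HTMLParser

# Rough, commonly cited display limits; treat these as editorial
# guidelines, not official search-engine rules.
TITLE_MAX = 60
DESCRIPTION_MAX = 160

class MetaAudit(HTMLParser):
    """Pull the <title> text and meta description out of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of title/description problems for one page."""
    p = MetaAudit()
    p.feed(html)
    problems = []
    if not p.title:
        problems.append("missing <title>")
    elif len(p.title) > TITLE_MAX:
        problems.append("title too long")
    if not p.description:
        problems.append("missing meta description")
    elif len(p.description) > DESCRIPTION_MAX:
        problems.append("description too long")
    return problems

page = ('<head><title>Full-Face Motorcycle Helmets | Example Store</title>'
        '<meta name="description" content="Full-face motorcycle helmet '
        'meets highest safety standard."></head>')
print(audit(page))  # [] -- both tags present and within limits
```

Run it over every page your crawl found and you have a to-do list for the content team.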

Flash vs. HTML5

Some of the technologies used in building websites can prevent the site from being effectively crawled. Flash is fine for movies, but sites built on Flash cannot be crawled thoroughly. Spiders can read only the links, and search results are poor as a consequence. Also, some Flash pages can have one URL but three or four content pages associated with it. You need to be aware of this and associate one page of content with one URL in Flash. A best practice is not to rely on Flash to build your entire site. Use it only on a page-by-page basis to enhance sections of the site that will benefit from its interactivity. Additionally, many of Flash's features can be offered in a much friendlier HTML5 format, and HTML5 doesn't raise barriers to search engines as Flash does.

AJAX

AJAX is an acronym for asynchronous JavaScript and XML: a group of interrelated web development techniques used on the client side to create asynchronous web applications. These techniques allow great interactivity, such as mousing over a photo for more detail, and let information load in the background. This is great for user experience. But unless you code the page correctly, it can confuse and block search engines from fully indexing the content. Search-engine-friendly AJAX is well understood and can be implemented by most competent web developers. However, it often must be listed as a specific requirement when building a site.

The lesson here is to use these technologies as an adjunct to your main content, and only where they achieve a useful purpose, such as usability.

Source Code

The average site has a code:content ratio of 80:20. This means that only one in five "words" on each page is user-friendly content. The rest is "spider junk food," and it doesn't help SEO. Offload code segments into cascading style sheet (CSS) and JavaScript include files. Aim for a code:content ratio of 55:45 or better.
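You can estimate a page's code:content ratio yourself by comparing visible text to total page size. A rough Python sketch follows; real audit tools weigh markup differently, so treat this as an approximation:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> bodies."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.text.append(data)

def content_ratio(html):
    """Return visible-text characters as a fraction of total page size."""
    p = TextExtractor()
    p.feed(html)
    visible = "".join(p.text)
    return len(visible) / len(html)

page = "<html><body><script>var x=1;</script><p>Hello riders</p></body></html>"
ratio = content_ratio(page)
print(f"{ratio:.0%}")  # 17% -- well below the 45% content target
```

Moving the script into an external include file would immediately improve this page's ratio, which is exactly the point of offloading code.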

Incorporate key-phrase-rich names for files and directories that mirror the content of the pages. This will increase page ranking by increasing relevance. (Remember, anything that makes Google's search results more accurate will help your ranking.) Keywords and key phrases should be used throughout the page, including headlines, subheads and body copy. They should be repeated about five times per page (taking care not to sacrifice the quality of your content). Make sure they are used in page titles, file titles, link text, ALT tags and metadata within the media files. (Content creation programs have a means of embedding keywords and phrases within the files they produce.) This will assure that when users perform a Google search, the right words and phrases will be included in the brief description search engines provide for each site.
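Checking keyword repetition per page is easy to automate. The sketch below counts whole-word, case-insensitive occurrences of each phrase in a page's copy; the sample copy and phrase list are made up for illustration:

```python
import re

def keyword_counts(text, phrases):
    """Count case-insensitive, whole-word occurrences of each phrase."""
    counts = {}
    for phrase in phrases:
        pattern = r"\b" + re.escape(phrase) + r"\b"
        counts[phrase] = len(re.findall(pattern, text, re.IGNORECASE))
    return counts

copy = ("Our full-face motorcycle helmets meet the highest safety standards. "
        "Every motorcycle helmet we stock is DOT certified, and full-face "
        "motorcycle helmets remain our best sellers.")
print(keyword_counts(copy, ["motorcycle helmet", "full-face"]))
```

Note that whole-word matching deliberately skips plurals ("motorcycle helmets" does not count toward "motorcycle helmet"), so adjust the pattern if you want stem matching.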

Be aware that search engines, Google in particular, take into account where your site is hosted. If your site is hosted on a server where your virtual neighbors are undertaking dubious activities like phishing, that bad neighborhood will taint your site's ranking. Your hosting server should be above reproach.

More Source Code Ranking Factors

Search engines also take into account how fast your site loads and how quickly it can be browsed. Consider whether your host server is optimized for your site. For example, if your server is in Connecticut and you are selling surfboards to Californians, images and media files may take longer to load for those users. One way to test this is to use Google's PageSpeed Insights. Enter any URL and Google will give you a score for it and offer suggestions to make it load faster.
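PageSpeed Insights also exposes its scores through an API (the v5 runPagespeed endpoint). The Python sketch below only builds the request URL; actually fetching it, and supplying a real API key, are left to you:

```python
from urllib.parse import urlencode

# The runPagespeed endpoint is Google's documented PageSpeed Insights
# API; any API key you pass is a placeholder here.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Build the GET URL you would fetch to score a page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urlencode(params)

print(psi_request_url("https://www.example.com/", strategy="desktop"))
```

Scripting the API lets you score every important page on a schedule instead of pasting URLs into the web form one at a time.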

Make sure all images are optimized for compression. Have scripts load content first and interactivity second. If you have audio or video files, host them remotely so the site loads faster without them. As they are large files, they can seriously slow down a site. Use a content delivery network to serve these files quickly.

In conclusion, most websites tend to grow organically. Things are added over time. Website managers come and go, and sometimes the "rules" you started with get lost in the shuffle. I recommend doing a technical SEO site review at least once a year, or whenever you are undertaking major revisions to your site's structure or content. This will assure that broken links, 404s, misdirected links, orphan pages, old content and other detritus get cleaned up regularly. It also assures that, like your car, your site gets a regular tune-up to continue to perform at optimum speed.

* * * *

About the Author

This post was authored by John Thyfault, Vice President of Search & Social Strategy, of Beasley Direct Marketing. Contact John at jthyfault@beasleydirect.com.

John has more than 18 years of marketing, sales and product development experience, and he brings a proven track record of successful campaign, program and product development expertise. His knowledge of search engine optimization and marketing, combined with an in-depth understanding of customer identification, market analysis and segmentation, allows him to deliver high returns on our clients' marketing investment for both business-to-consumer and business-to-business markets.

Prior to working with Beasley Direct, John was Senior Client Services Project Director at ThirdAge.com, a first wave baby boomer lifestyle and community website. At ThirdAge he successfully led major client sponsorships for Fortune 100 companies in healthcare (Tylenol), financial services (American Century), technology (Intel & IBM) and consumer products areas (Revlon & Viactive). He was responsible for strategic and tactical goal setting, project management, new product creation and web site production. John previously worked in Channel Marketing and National Account Sales for IDG Books Worldwide, the publishers of the immensely popular “…For Dummies” book series. Additionally, he managed the wholesale distributor sales channel for Tor/Forge Books, an imprint of St. Martin’s Press.

John is active in local marketing associations, including the Direct Marketing Association, the Business Marketing Association and is currently serving on the board of directors for the Silicon Valley American Marketing Association.

John has taught search engine marketing fundamentals extensively. He has led workshops for the Silicon Valley American Marketing Association, Northern California Direct Marketing Association (DMAnc.org) and the Business Marketing Association. He also teaches Search Engine Marketing at (UCSC Extension in Silicon Valley).