Post-Panda Step-by-Step SEO Site Audit

After the Panda update, putting your site through an SEO audit is important. Here are the steps I go through. You will notice that a lot of the steps are not "pure SEO" steps; they are more "webmaster" steps, because the SEO itself is easy. Seriously, it is.
By: Simon Birch
FULL ARTICLE HERE: http://www.seomarketingforums.com/

Before you start doing any SEO, you have to know which keywords convert for you. Once you figure out which keywords convert in PPC, you optimize for them with SEO. Before you get to that step, however, you have to make sure your site is ready. Let's use a site I reviewed a few years ago as an example. They aren't a client or a member, nor am I an affiliate of theirs, so it should be a clean domain to use: patioshoppers.com. Here we go...

Step One: Go to the site. If the site design is so bad that no one would ever buy, doing an SEO analysis is pointless. If this is for a prospect and they are unwilling to fix their design, you don't want their SEO business. Period.

Commentary: Their site design, while not great, isn't bad. For an eCommerce site it is functional enough, in my opinion, to do well.

Step Two: Check for non-www protection. While you may be sick of hearing me rant about this, it can do WONDERS for your site in terms of internal duplicate content. More info.

Commentary: The example site has the rewrite in place.

Step Three: Do the "site:" command to see the total number of pages in Google's index for the domain.

Commentary: Here is an example of the search to do in Google: site:patioshoppers.com. It returns 2,370 pages. However, if I scroll to the bottom (I have my default set to 100 results per page, and you should too), I see that there are only nine pages of results (instead of 24). Clicking on "9", it shows 875 pages. Why? Google states there is duplicate content. The amount? 1,495 pages (over 60% of the indexed pages). That's not good.

Note: If you can't change the default from 10 to 100 results per page, you will need to turn off Google Instant in your Search Preferences.

Possible fixes: 1) Since the non-www protection is in place, the complete protection needs to be in place too. This will handle the known Google bug that has been around for years.
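To make concrete what the www/non-www protection should accomplish, here is a minimal sketch in Python. The `canonical_url` helper is purely illustrative (the real fix is a server-side 301 rewrite, not application code), and the domain is just the example site from above:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, host="www.patioshoppers.com"):
    """Map any variant of the domain onto one canonical host.

    A server-side 301 rewrite should do the same job, so that
    http://patioshoppers.com/ and http://www.patioshoppers.com/
    never both end up in Google's index as duplicate pages.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    # Force the canonical host, default the path, drop any fragment.
    return urlunsplit((scheme or "http", host, path or "/", query, ""))

# Both host variants collapse to a single indexable URL:
print(canonical_url("http://patioshoppers.com/patio-umbrellas"))
# → http://www.patioshoppers.com/patio-umbrellas
print(canonical_url("http://www.patioshoppers.com/patio-umbrellas"))
# → http://www.patioshoppers.com/patio-umbrellas
```

When every variant resolves to one URL like this, the duplicate-page count seen in the site: command should start dropping as Google recrawls.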
Simply go to the Webmaster Tutorial and go to the section "Google Bug" for the code for the fix. 2) Edit the Titles and Descriptions which duplicate or near-duplicate each other so Google sees each page as unique. 3) Edit the body content so it is up to date.

Step Four: Check the domain in DomainTools.com. Here I am given a lot of solid information: a) Title & Description (verify they are compelling and relevant). b) When was the domain registered, and are there red flags? c) Is the domain on a dedicated IP address or a shared one?

Commentary: a) The Title and Description are okay from an SEO standpoint, but if you read them out loud, they sound really bad. I would suggest making them stronger on the compelling side. They are too "SEO'd" in my view, with three pipe symbols and four keyphrases, and the description doesn't have any real "punch" to it. One Title has a great call to action, "Buy with confidence now!", but the rest of the Titles just fail in my view. My rewrite would be:

Title: Specialty Patio Furniture Shop with FREE Shipping – Always!
Description:

While you may argue that the Title and Description aren't SEO'd, I would say, "So what?" You can use incoming anchor text to get the top rankings, but making the Title and Description compelling means you can get more click-throughs than your competitors even though they outrank you.

b) Many of the large graphics on the site are missing ALT text. Not all the images have height and width dimensions, which can cause problems in load time. Many internal links have query strings in the URLs, which can cause indexing problems with Google. This is a classic case of the need for a rewrite solution. There is a rewrite in place; however, it goes to the singular version of the domain.

c) The Keyword meta tag is out of focus and contains keyword phrases such as "umbrellas".

d) While it states the domain was registered in '05, checking the Registrar History we see there was one "drop" and the domain was actually first registered back in '02.
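Finding every image that is missing ALT text or dimensions by hand is tedious on a large eCommerce site, so it is worth scripting. A minimal sketch using Python's standard `html.parser`; the sample markup is invented for illustration:

```python
from html.parser import HTMLParser

class ImgAuditor(HTMLParser):
    """Collect <img> tags missing alt, width, or height attributes."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        present = {name for name, _ in attrs}
        missing = {"alt", "width", "height"} - present
        if missing:
            src = dict(attrs).get("src", "?")
            self.flagged.append((src, sorted(missing)))

# Invented sample markup for illustration:
sample = (
    '<img src="hero.jpg">'
    '<img src="umbrella.jpg" alt="Patio umbrella" width="300" height="200">'
)
auditor = ImgAuditor()
auditor.feed(sample)
print(auditor.flagged)
# → [('hero.jpg', ['alt', 'height', 'width'])]
```

Feed it the homepage HTML and you get a punch list of images to fix, which covers both the ALT-text issue and the missing-dimension load-time issue in one pass.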
You can check Archive.org to see what the old site looked like.

e) The domain is on a dedicated IP address.

Step Five: Check YSlow. a) Check the server type and version at the bottom of the browser. b) Size of the page. c) Number of HTTP load requests.

Commentary: a) Their server is Apache, which means mod_rewrite is the tool for their URL strings, but the version is 1.3, an outdated version. Regardless of what their host says, there are known vulnerabilities which are corrected in version 2 of Apache. The upgrade can occur without causing too much disruption (it usually takes about two hours, with at least one server restart).

b) 379K is pretty big for a page, especially one that serves its images dynamically so they aren't counted in the amount reported by YSlow, which means the actual load is even heavier. This site could use having its images optimized; doing that alone could reduce the load by 35%. But the big issue is the JavaScript: it is nearly a third of a megabyte, which is a heavy load and could cause browser load issues.

c) 112 HTTP requests is excessive, and it is a problem. Most of the problem lies in the 54 JavaScript files. Yes, you read that right: 54.

Step Six: Verify a Google Webmaster Tools account. Look for the verification meta tag. If the prospect does not have a Google Webmaster Tools account, explain the importance of it. This account is vital if you want to ensure your presence in Google and fix issues before they become problems. Don't be paranoid about giving Google information on your sites. I have GWT accounts on all of my domains, including the black-hat ones, and nothing has ever happened. Honestly, I really don't think Google cares. They have bigger fish to go after.

Step Seven: Verify targeted keyword phrases. Don't let your client or prospect tell you which keywords they are going to target; let their website and GWT tell you.
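Pulling the keywords a site already ranks for out of raw access logs can be scripted rather than eyeballed. A minimal sketch, assuming Apache combined-log lines and the old Google `q=` referrer parameter (both are assumptions, and the sample log lines are invented):

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qs

def referrer_keywords(log_lines):
    """Count search phrases found in the q= parameter of Google referrers."""
    counts = Counter()
    for line in log_lines:
        # In combined log format, the referrer is the second quoted field.
        parts = line.split('"')
        if len(parts) < 6:
            continue
        referrer = parts[3]
        if "google." not in referrer:
            continue
        query = parse_qs(urlsplit(referrer).query)
        for phrase in query.get("q", []):
            counts[phrase.lower()] += 1
    return counts

# Invented sample lines for illustration:
logs = [
    '1.2.3.4 - - [01/Jan/2012:00:00:00 +0000] "GET / HTTP/1.1" 200 1234 '
    '"http://www.google.com/search?q=patio+umbrellas" "Mozilla/5.0"',
    '1.2.3.5 - - [01/Jan/2012:00:00:01 +0000] "GET / HTTP/1.1" 200 1234 '
    '"http://example.com/" "Mozilla/5.0"',
]
print(referrer_keywords(logs).most_common(1))
# → [('patio umbrellas', 1)]
```

The phrases that bubble to the top of this count are the ones to cross-check against GWT impressions and PPC conversion data.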
Check their log files to see the keywords they are currently getting traffic for, and check GWT for the keywords they are getting "impressions" for.

Commentary: You want to target the keywords that are getting impressions and that also showed good signs of convertibility in the PPC campaign that was run. Often you can add 3-5 keyword phrases to your link-building campaigns and, in a matter of days or weeks, significantly increase your rankings, getting more traffic and more sales. Focusing on the right tools and data is key.

Step Eight: Link saturation. Run Open Site Explorer and look at the unique domains and total links for the top keywords being targeted. You can also look at the overall authority of the site.

Please read the rest of the article here: http://www.seomarketingforums.com/

Regards