Google Panda/Algo Shift Fix

Okay, so this shift was a tough one to figure out, as some sites were hit while other sites with the same exact footprint didn't get hit. I have never seen a shift like this, as there is always some type of consistency.
By: simon birch
Full Article here: http://www.seomarketingforums.com/

The only consistency early on in the Panda update was the use of AdSense. If you used AdSense to monetize your site, your chances of getting nailed increased. However, in the algo shift that happened about two weeks ago, we had sites without AdSense get nailed. How did that happen? Manual review. Yes, manual reviews are back, and according to the log files I have analyzed over the last few weeks, they are getting more and more frequent. Has Google rehired their manual reviewers? Maybe. Is Google finally going through the spam reports? Possibly. The bottom line is, manual reviews are happening again.

Here is what my team and I did to fix the situation:

1) First things first, as I have discussed numerous times: your site has to look solid. Make sure you have a great designer on staff or on contract. This is key. Without a solid design, you run a higher risk of getting a manual penalty than with one.

2) Manual reviewers are looking for scraped/hijacked content. Google still cannot detect duplicate content across domains, but they can detect it within a domain, so protect yourself and use CopyScape to find out if you have duplicate content issues outside your domain. Why do this if Google can't detect it? Easy: while Google's algo still can't detect cross-domain duplicate content, a manual reviewer can. If you run CopyScape and find other sites whose content you've copied, or that have hijacked your content, you need to change the content enough so it doesn't come up as a match any longer. Yes, I understand this can be a royal pain in the ass, but it is something you have to do. And while it may tick you off that you have to change YOUR content when someone else hijacked it, you just have to do it. The point here is fixing the problem and getting your revenue back as quickly as possible. (A rough do-it-yourself check is sketched after this list.)

3) Stop running Scrapebox, XRumer, Scrapeboard, etc. on your money sites. While this is not universal, we have had a couple of them hit and so have a few members, so to be on the safe side, run these programs on feeders for now.

4) Fix all errors in Google Webmaster Tools for the site in question.

5) Stop manually or automatically pinging the pages/posts of the site in question.

6) Remove AdSense, at least temporarily, if it is on the site.

7) Ensure no duplicate titles or descriptions are present on the site. (A quick audit script is sketched after this list.)

8) Ensure you have the non-www protection on your site. See the Webmastering Tutorial in the Membership area. Also ensure you have covered the area in the "Google Bug" section too. (A quick way to verify the redirect is sketched after this list.)

9) Directory submissions to Best of the Web and Yahoo!. Both of these are paid directories and they are a must. Don't skimp; this is key.
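For step 2, CopyScape is the real tool; but as a rough self-check, here is a minimal Python sketch that compares the text of two pages with the standard-library difflib. Both URLs, the tag-stripping, and the 0.6 threshold are my own assumptions for illustration, not anything CopyScape actually does:

```python
import difflib
import re
import urllib.request

def page_text(url):
    """Fetch a URL and crudely strip tags to get comparable text."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop scripts/styles
    text = re.sub(r"<[^>]+>", " ", text)                      # drop remaining tags
    return " ".join(text.split())                             # collapse whitespace

def similarity(url_a, url_b):
    """Ratio from 0.0 (no overlap) to 1.0 (identical text)."""
    return difflib.SequenceMatcher(None, page_text(url_a), page_text(url_b)).ratio()

# Both URLs are placeholders -- substitute your page and the suspected copy.
score = similarity("http://example.com/my-page.html",
                   "http://some-scraper.com/stolen-page.html")
print("similarity: %.2f" % score)
if score > 0.6:  # threshold is a guess; tune to taste
    print("High overlap -- rewrite until this drops well below the threshold.")
```

If a page comes back with a high score, rewrite it until the score drops, the same way you would react to a CopyScape match.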
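For step 7, you can audit titles and descriptions yourself instead of eyeballing every page. A minimal sketch using only the Python standard library; the URL list is a placeholder (in practice, feed it from your sitemap):

```python
import urllib.request
from collections import defaultdict
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Pulls out the <title> text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Placeholder list -- in practice, pull every URL from your sitemap.
urls = ["http://example.com/", "http://example.com/about.html"]

titles, descriptions = defaultdict(list), defaultdict(list)
for url in urls:
    parser = HeadParser()
    parser.feed(urllib.request.urlopen(url, timeout=10)
                .read().decode("utf-8", "ignore"))
    titles[parser.title.strip()].append(url)
    descriptions[parser.description.strip()].append(url)

for label, seen in (("title", titles), ("description", descriptions)):
    for text, pages in seen.items():
        if len(pages) > 1:
            print("Duplicate %s %r on: %s" % (label, text, ", ".join(pages)))
```

Anything it prints is a page pair that needs a unique title or description.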
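For step 8, the actual fix is in the Membership tutorial; here is a quick sketch for verifying it took. It assumes your canonical version is non-www (flip the hostnames if yours is www), and example.com is a placeholder for your domain:

```python
import http.client

def check_redirect(host, path="/"):
    """Report the status and Location header for a host, without following it."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    print("%s -> %s %s" % (host, resp.status, resp.getheader("Location") or ""))
    conn.close()

# Exactly one of these should answer 200; the other should answer 301
# with a Location header pointing at the canonical hostname.
check_redirect("example.com")
check_redirect("www.example.com")
```

If both hostnames answer 200, you don't have the protection in place and Google is seeing two copies of every page.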
Now, that is how you can fix most sites' problems from this recent change; however, some sites got nailed real hard. Let me give you an example of what happened to us. We held the number one position for more than a year for a highly competitive term. Then this algo shift hit and the site went from #1 to #59. Other keyword phrases it was ranking for tanked too, so it wasn't just one keyword, it was multiple. In fact, the site didn't rank for anything anymore. It was hit, and hit hard. After waiting a week for it to "come back" on its own, as rankings sometimes do, it got worse. Now, this domain was a 2010-branded domain, and since we are in 2011, it seemed like the perfect time to do some redirects to the 2011 version.

Instead of doing a 301 on the entire domain, we redirected only the pages that had ranked in the past to the 2011 version of the site and left all the other pages alone (see the sketch below). The result? Three days later the 2011 site went from ranking #38 to ranking #1. This was clearly due to the power of the 2010 site. So why is a penalized site able to pass its power on to another site? It just does, and always has. Keep this in mind when you have feeders that drop: you can reallocate their power to other sites quickly.
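To make the selective approach concrete, here is a small Python sketch that turns a list of previously ranking paths into per-page 301 rules. It assumes the old site runs on Apache with mod_alias; the domain and the paths are placeholders for your own rank-tracker or analytics exports:

```python
import re

# Paths that had rankings on the 2010 domain. Placeholders -- in practice,
# pull the list from your rank tracker or analytics exports.
ranked_paths = [
    "/",
    "/widgets.html",
    "/blue-widgets.html",
]

NEW_DOMAIN = "http://www.example-2011.com"  # placeholder for the 2011 site

# One exact-match rule per ranked page (Apache mod_alias RedirectMatch),
# rather than a blanket redirect, so the unranked pages stay where they are.
for path in ranked_paths:
    print("RedirectMatch 301 ^%s$ %s%s" % (re.escape(path), NEW_DOMAIN, path))
```

Paste the output into the old site's .htaccess. RedirectMatch with anchored patterns is used instead of a plain Redirect because Redirect matches by prefix, and a prefix rule on "/" would drag the whole domain along, which is exactly what we were avoiding.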
Will this always be the case? Probably not, as Google has been proactive about closing loopholes in their system as of late. Rest assured, if it does happen, my team will be testing to find the solutions to keep you one step ahead.