Have you ever had the sinking feeling of checking your favorite search term one day to see if your site has climbed a spot or two, only to find a complete disaster has occurred? I think the reasons sites suddenly plummet within organic search results fit into three classifications—the “Oops,” the “Duh,” and the “Dang, Got Caught.”
Most mistakes can be avoided entirely by paying closer attention to two things: SEO technical best practices and the search engine “rules.” The search engines do a good job of providing most of the tools needed to resubmit an adversely affected site, sub-domain, or directory.
1. Typically, the quickest fixes come from “Oops” mistakes.
2. “Duh” mistakes can be a little more severe and take additional time to fix, both internally and in terms of regaining trust within the algorithms.
3. “Dang, Got Caught” mistakes can take a long time to recover from — to the point where the domain should perhaps be written off as a business loss.
Oops And Duh
Accidents can happen during a site redesign, and they may hurt search engine results. For example, redirects may be improperly mapped, creating a poor experience for users and search engines alike, including a large increase in 404 error responses. Many developers like to employ a temporary redirect to an error page, serving a 302 that resolves to a 200. In plain English, this tells a search engine that the page has only moved temporarily, and then serves an error page that claims everything is OK. I have seen Sitelinks (the extra links below a branded search result) lead to 404 pages for weeks or longer in some engines.
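To make the anti-pattern concrete, here is a minimal sketch of how you might flag it when auditing redirects. The function name and messages are my own invention, not part of any search engine toolset; it simply classifies the chain of HTTP status codes observed while following a URL:

```python
def classify_redirect_chain(status_codes):
    """Classify a list of HTTP status codes seen while following a URL,
    e.g. [301, 200] (good permanent redirect) or [302, 200] (risky)."""
    if not status_codes:
        return "no response"
    first, last = status_codes[0], status_codes[-1]
    if first == 301 and last == 200:
        return "ok: permanent redirect to a live page"
    if first == 302 and last == 200:
        # The "temporarily moved, but everything is fine" signal
        # described above: engines may keep the old URL indexed.
        return "warning: 302 resolving to 200"
    if first in (404, 410):
        return "gone: page removed"
    return "unclassified"

print(classify_redirect_chain([302, 200]))
```

A real audit would gather the status codes with an HTTP client that does not silently follow redirects, then run each chain through a check like this.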
Other “Oops” and “Duh” mistakes include (among dozens) improper use of the robots.txt file and long server downtime without properly informing the search engines. A friend once told me that the robots.txt file is like a ninja sword and must be handled very carefully. Extended downtime can cause a site to be removed from the index. If you have accidentally blocked your entire site, or even some major directories, from being indexed, the fix can be relatively painless (measured in days to weeks) via resubmission within search-engine-provided toolsets such as Google Webmaster Tools. If you are not set up and verified in these toolsets at Google and Bing, you put your ability to mitigate such problems at serious risk.
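To illustrate how little separates a safe robots.txt from a disastrous one, compare these two hypothetical directives (the `/staging/` path is invented for the example):

```
# The "Oops": one slash too few blocks the entire site from crawling.
User-agent: *
Disallow: /

# What was probably intended: block only a staging directory.
User-agent: *
Disallow: /staging/
```

The two versions differ by a handful of characters, which is exactly why the file deserves the ninja-sword level of care mentioned above.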
Dang, Got Caught
Unscrupulous SEO practitioners and high-risk-taking marketers are the most likely to fall victim to the sometimes severe backhand of the algorithms. There is not much to say here beyond the usual clichés about playing with fire. If you engage in tactics such as hiding text by matching its color to the background, or buying thousands of links at a time from unrelated domains, you will get caught. Large brands may be able to win their way back relatively quickly, but for others it can take months or years of reinclusion requests. Search engine toolsets can help with this, too. Rules, like laws, are open to interpretation. One last thing to always keep in mind: search engine crawlers can’t see a rendered page; they rely on what they can understand from the code presented to them. If automated text readers can clearly understand your content, and it doesn’t differ from what is shown on the page, you are probably safe.
There are many free toolsets available on the Internet to see how search engines see your site, and the redirects and response codes they are given. Being prepared and knowing your site’s performance is the only way to keep yourself truly safe from disaster.
What are your experiences with SEO mistakes, and how have you overcome them? Share your thoughts in the comments.
Chris Boggs of Rosetta is a specialist in search engine optimization and paid search advertising. Chris joined Brulant in 2007 as the Manager of the SEO team, and Rosetta acquired Brulant in 2008. He is a frequent contributor to Search Marketing Standard magazine.