SVP, Media Strategy

Top 3 Site Design Errors That Kill Organic Traffic

Websites are not meant to be static properties. Updates and redesigns that add useful new content and improve the user experience are vital to the health of any web property. Visitors are motivated to return to your site often to get new information and see what's changed, and search engines recognize the value of a well-maintained site and factor it into their all-important ranking algorithms.

But as helpful as a site redesign can be, without careful planning and execution it can also do major damage to a site's organic rankings. Check out these top 3 mistakes to avoid when undertaking a site redesign.


Improperly Configured Robots.txt


A website's robots.txt file gives special instructions to the myriad of automated web crawlers out there on how to interact with your site. robots.txt can be used to direct certain bots not to crawl your site at all, or to identify specific folders and files that you would not like bots to crawl. Directing bots away from less noteworthy content, pages behind a login that bots cannot access, or sensitive folders in your directory helps you shape what search engines index and display from your site. However, an improperly configured robots.txt file can be disastrous for your site's organic rankings.
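For illustration, a minimal robots.txt that steers bots away from a couple of folders might look something like this (example.com and the directory names are placeholders):

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/thank-you/

    Sitemap: https://www.example.com/sitemap.xml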

When a brand is developing a major site redesign, it will usually build the changes in a development or staging environment separate from the production instance. The developers will set up a robots.txt file to block these staging sites from being crawled by search engine bots, intending to remove it when the new site goes live. However, if that pivotal removal step is forgotten, the robots.txt file barring crawling will remain active and apply to the live version of the site. Without the ability to index the site, Google and other major search engines will begin harshly penalizing the domain. We've seen organic traffic drop as much as 35% and Top 10 keyword rankings drop nearly 30% in as little as three weeks with an improper robots.txt file present.
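The culprit is usually a staging robots.txt as blunt as the two lines below, left in place on the production server; "Disallow: /" tells every crawler to stay away from the entire site:

    User-agent: *
    Disallow: /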

So if you find yourself with a sudden, severe drop in organic traffic following site changes, what do you do? Use these steps from Google to test your robots.txt file and see whether your site can be indexed properly by search engine crawlers. If your site is unindexable, reach out to your webmaster to make the necessary changes to the file, and submit your website to the major search engines to be re-indexed. Learn more about robots.txt.
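If you want a quick sanity check of your own before involving your webmaster, Python's standard library can read a live robots.txt and report whether a given crawler is allowed to fetch a page; this sketch assumes www.example.com stands in for your domain:

    import urllib.robotparser

    # Load the live robots.txt file from the site
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    # False here means the homepage is blocked from crawling
    print(parser.can_fetch("Googlebot", "https://www.example.com/"))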


Sitewide Boilerplate Meta Tags


Similar to the robots.txt example, there are meta tags that can be placed in the head section of your page's HTML to instruct bots not to index or follow that particular page. NoIndex and NoFollow are useful when deployed properly on pages with non-essential information that should not be indexed, like registration confirmations or "thank you" pages. However, with the rise of template-based content management for websites, the head section of a site's HTML is often controlled in one central file and deployed across many pages. A simple copy-and-paste error could result in NoIndex or NoFollow tags appearing across all the pages of your website. We've seen this issue impact a heavily trafficked site in as little as a day, dropping organic visits over 70% and eventually falling all the way to zero organic visits before the problem was corrected. Once the problem was addressed, organic traffic and impressions slowly began to climb back to their previous levels.
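The tag in question is a single line in the page's head section, which is exactly why a shared template can spread it across every page at once:

    <!-- Tells crawlers not to index this page or follow its links -->
    <meta name="robots" content="noindex, nofollow">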

Sounds intimidating, right? But careful push-live procedures that scan for NoFollow/NoIndex tags during major site updates can help you avoid running into this problem. In addition, keeping a tight watch on your site analytics and monitoring organic traffic often can help you identify a problem quickly and mitigate the damage caused.
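As one hedged example of what such a check could look like, the short script below fetches a handful of key pages and flags any robots meta tag containing noindex or nofollow; the URLs are placeholders, and a simple pattern match like this is a spot check rather than a full HTML parse:

    import re
    import urllib.request

    # Key pages to spot-check after a release (placeholder URLs)
    URLS = [
        "https://www.example.com/",
        "https://www.example.com/products/",
    ]

    # Matches a robots meta tag whose content includes noindex or nofollow
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*(noindex|nofollow)',
        re.IGNORECASE,
    )

    for url in URLS:
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        if pattern.search(html):
            print(f"WARNING: robots meta tag found on {url}")
        else:
            print(f"OK: {url}")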


Failing to Implement 301 Redirects


For major site redesigns involving a new CMS or code management system, page URLs sometimes change, or retired pages have their content consolidated onto a new page. Unfortunately, there is no way for search engine bots to understand these changes without explicit instructions. If the new pages launch without any connection to the location where that content formerly lived, all of the accumulated ranking power of the previous URLs will be lost. Consequently, organic traffic to those new pages will dip sharply compared to the old pages. In fact, we've observed this phenomenon firsthand: a site with a large catalogue of product pages lost 80% of its page views within two months of deploying its site redesign.

The simplest way to avoid this sharp drop in traffic is to carefully organize and implement 301 permanent redirects for all affected URLs. It can be a time-intensive undertaking for a large-scale site redesign, but creating continuity for that ranking power as you transition to your new site is well worth the labor involved.
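What that looks like in practice depends on your server. As one common example, on an Apache server the mapping can live in an .htaccess file; the paths below are hypothetical:

    # Permanently redirect a retired URL to its new home
    Redirect 301 /old-product-page https://www.example.com/new-product-page

    # Or map an entire retired directory onto its replacement
    RedirectMatch 301 ^/catalog/(.*)$ https://www.example.com/products/$1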

It would be impossible to list all the ways a site's organic rankings could conceivably be affected by a site redesign, but being on the lookout for these 3 major errors will give you a leg up. Of course, the most important way to care for your site's SEO is careful monitoring of your site's traffic trends by an experienced SEO team. To learn more about how Rise can help you avoid more errors like these and develop a strong SEO strategy, contact our Digital Strategy team.

02/09/2015 at 11:00