Make your Web site click with algorithm search engines … and customers.
You may be under the impression that getting your Web site ranked on “spidering” or algorithm search engines such as Google involves the dark arts, or at least a secret handshake. If so, you can’t be blamed.
Some search-engine optimization (SEO) companies make it sound as if high rankings are more hocus-pocus than strategy—as if the algorithms (part computer program, part math equation) that determine page rankings were designed to measure anything other than how relevant a site will be to an Internet searcher.
While there are a number of complicated and technical aspects of SEO—also known as organic optimization—most can be boiled down to a few simple, key principles:
• Your site should be informative and relevant.
• Your site should be easy to navigate.
• Your site should be easy to use.
Content Is King
When it comes to search-engine optimization, no single element of your Web site is more important than content. Content is what the search engine is actually searching.
Part of what search-engine algorithms measure is what is called keyword density—essentially, how often a search term appears on a Web page as a percentage of the page’s total words. If a keyword appears too few times, the page is deemed less relevant; if it appears too many times, the spider deems it spam. Why? Conventional wisdom suggests that the people designing these algorithms determined through research that copy genuinely relevant to a topic tends to reference that topic a predictable number of times relative to the total copy.
There’s no hard-and-fast rule for keyword density; search engines do not make their algorithms public. But a rule of thumb I’ve culled from an informal straw poll of the industry experts interviewed for this story is that the 5-percent to 10-percent range is a good ballpark.
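To make the rule of thumb concrete, here is a minimal sketch of how keyword density can be measured—occurrences of a keyword divided by total words, expressed as a percentage. This is an illustration of the metric as described above, not the actual formula any search engine uses.

```python
import string

def keyword_density(text, keyword):
    """Return the percentage of words in `text` matching `keyword`.

    Case-insensitive; surrounding punctuation is stripped before comparing.
    Illustrative only -- real search-engine algorithms are not public.
    """
    words = text.lower().split()
    if not words:
        return 0.0
    target = keyword.lower()
    matches = sum(1 for w in words if w.strip(string.punctuation) == target)
    return 100.0 * matches / len(words)

# Hypothetical page copy for a coffee retailer:
copy = ("Our organic beans are roasted daily. Organic farming protects "
        "the soil, and every bag we sell is certified organic. Taste "
        "the difference fresh roasting makes in every single cup.")
print(f"Density of 'organic': {keyword_density(copy, 'organic'):.1f}%")
```

A page whose density for its main keyword falls well outside the 5-to-10-percent ballpark is a candidate for rewriting—either adding natural references to the topic or trimming repetition that a spider might read as spam.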