Tuesday, September 18, 2007

Latent Semantic Indexing Explained

There's a new game in town when it comes to how websites gain search engine attention. The old ways of stuffing keywords, optimizing for a single phrase and leaning heavily on incoming links are fading as a newer, more holistic system for judging content comes into its own. That system is latent semantic indexing, and it has webmasters scrambling to keep up.

Latent semantic indexing is usually discussed in connection with Google, although the technique itself grew out of information retrieval research in the late 1980s rather than out of Google's own labs. The name roughly means indexing by hidden meaning: instead of simply counting keywords, the system scans pages for their overall themes. It boils down to a more sophisticated way for a search engine to measure how relevant a site is to an individual search. Adapting to it doesn't require a complete redo of a website, but it does mean that webmasters who focus on high quality content are more likely to get ahead.
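For the technically curious, here is a minimal sketch of the idea behind latent semantic indexing. The vocabulary, pages and rank below are invented purely for illustration, and Google has never published the details of its ranking pipeline; this is just the textbook technique of factoring a term-document matrix with a singular value decomposition and keeping only the strongest "concept" dimensions.

import numpy as np

# Toy term-document matrix: rows are words, columns are four tiny pages.
# Pages 0 and 1 are both about cars but share only the word "engine";
# pages 2 and 3 are both about gardening.
terms = ["car", "auto", "engine", "flower", "petal", "bloom"]
A = np.array([
    [2, 0, 0, 0],   # car
    [0, 2, 0, 0],   # auto
    [1, 1, 0, 0],   # engine
    [0, 0, 2, 1],   # flower
    [0, 0, 1, 0],   # petal
    [0, 0, 0, 2],   # bloom
], dtype=float)

# Factor the matrix with a singular value decomposition: A = U * S * Vt.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k strongest latent "concepts" and map each page into that space.
k = 2
doc_vectors = (np.diag(S[:k]) @ Vt[:k]).T   # one row per page

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two car pages share a single word, yet point in essentially the same
# direction in concept space, while a car page and a gardening page stay apart.
print("car page vs car page:      ", round(cosine(doc_vectors[0], doc_vectors[1]), 2))
print("car page vs gardening page:", round(cosine(doc_vectors[0], doc_vectors[2]), 2))

In plain terms, the decomposition notices that "car," "auto" and "engine" tend to travel together, so two pages about cars get grouped by theme even when their exact wording differs.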

For Google, latent semantic indexing is a step toward its stated mission of making search results more relevant. The company has always said it wants to deliver high quality, relevant results, and latent semantic indexing is meant to help make that happen. Google's original system scanned pages for keyword relevancy but also leaned very heavily on incoming links. It tended to pass over sites that were new or that added a great deal of content too quickly, and in trying to weed out sites loaded with keyword-stuffed, nonsensical content, it overlooked good ones, too.

Google wanted a better way, and it found one. Latent semantic indexing is meant to read the overall theme of a site, so that sites with fresh, relevant, well-written content aren't penalized simply because they happen to pop up overnight.

Under latent semantic indexing, sites that want to gain ranking need content that's fresh, updated, keyword-rich and relevant. The system is meant to give searchers pages that better represent what they were actually looking for, rather than pages that simply happen to have piles of incoming links. All in all, it's a fairer way of measuring the Internet for relevancy and quality, and it fits Google's mission better.
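To see how that plays out for an actual search, the sketch below folds a one-word query into the same toy concept space used earlier. Again, the data is invented and the formula is the standard textbook LSI query fold-in, not anything Google has confirmed. The point is that a page can rank well for a query word it never uses, as long as it clearly belongs to that query's theme, while literal keyword matching would give it a score of zero.

import numpy as np

# Same toy index as in the earlier sketch: four tiny pages, six words.
terms = ["car", "auto", "engine", "flower", "petal", "bloom"]
A = np.array([
    [2, 0, 0, 0],   # car
    [0, 2, 0, 0],   # auto
    [1, 1, 0, 0],   # engine
    [0, 0, 2, 1],   # flower
    [0, 0, 1, 0],   # petal
    [0, 0, 0, 2],   # bloom
], dtype=float)

U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, Sk, Vk = U[:, :k], np.diag(S[:k]), Vt[:k].T   # rows of Vk = pages in concept space

# The query is the single word "auto". Page 0 never uses that word at all.
q = np.array([0, 1, 0, 0, 0, 0], dtype=float)
q_hat = q @ Uk @ np.linalg.inv(Sk)                 # textbook LSI query fold-in

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

literal  = [round(cosine(q, A[:, j]), 2) for j in range(A.shape[1])]
semantic = [round(cosine(q_hat, Vk[j]), 2) for j in range(Vk.shape[0])]

# Literal keyword matching scores page 0 at zero (no shared word), but in
# concept space it scores on par with page 1, because both share the car theme.
print("literal keyword scores:", literal)
print("concept-space scores:  ", semantic)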

The old days of Google reportedly putting something like 80 percent of its emphasis on incoming links and only 20 percent on the site itself are coming to an end. Incoming links will always matter, especially for breaking "ties" between otherwise similar results, but they may not carry the same weight as before. That makes it a bit easier for those who work on their sites with an emphasis on quality to see real results.

For web publishers, all of this means that those who have done their jobs correctly, and continue to do so, will have a better chance of shining under latent semantic indexing. Those who stuff keywords, churn out nonsensical content and spend a lot of time on link farms likely will not.

The key to getting ahead in the new age of Google search is quality. Sites that provide useful, relevant information are likely to do better in searches. Those that cut corners could find themselves at the bottom of the search totem pole.


About the author:

Jeff Alderson specializes in boosting traffic and profits. He is also the inventor of numerous PPC and SEO tools. Jeff suggests using a keyword analyzer like Ad Word Analyzer to find keywords for your site.