Online Marketing

Online Marketing – how to get visitors to your website and convert them into customers.

SEO Keyword Density And Latent Semantic Indexing


Using appropriate keywords is a crucial part of any search engine optimization strategy.  Most website owners know that using relevant keywords is the only way to be found by Google or any other search engine.  Did you know though that using keywords too many times may in fact be detrimental to your SEO strategy?

For years, web site owners were using sneaky tactics like putting keywords in text that was invisible to the viewer, using backdoor pages to keyword stuff, and using irrelevant keywords to falsely rank high for terms that had nothing to do with their site.

While these types of tricks used to work, Google, Bing, and Yahoo have caught on to these tactics and are now deindexing sites that appear to be misusing keywords, whether the misuse is intentional or not.  Even if your site is perfectly legitimate in every other way, appearing to keyword stuff by repeating your keyword too many times will be detrimental to your SEO.  The key to succeeding with keywords lies in achieving the right keyword density.

Ideally, your website should appear highly relevant to the keyword you are targeting, without appearing to be spam in any way.  Most SEO experts claim that the ideal keyword density lies somewhere around two to four percent.  Keyword density is the number of times your keyword or keyword phrase appears in a web page, article, or blog post, divided by the total number of words in that text.

In other words, if your article has five hundred words and you include your keyword phrase ten times, that is a keyword density of two percent.  An article with that keyword density would appear to Google as relevant to the desired keyword, without looking like spam in any way.  On the other hand, if your article mentioned the keyword thirty times, that would be a density of six percent, which could get the article flagged as spam by the search engines.
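That two-percent figure is easy to check for yourself. Here is a minimal Python sketch of the calculation; the `keyword_density` helper and the sample article are illustrative assumptions, not part of any real SEO tool:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Keyword density as a percentage: occurrences of the
    phrase divided by the total word count, times 100."""
    words = re.findall(r"[a-z']+", text.lower())
    occurrences = text.lower().count(phrase.lower())
    return 100 * occurrences / len(words) if words else 0.0

# A 500-word article that mentions "blue cars" ten times
# ("blue cars" is two words, so 10 * 2 + 480 = 500 words total).
article = " ".join(["blue cars"] * 10 + ["filler"] * 480)
print(keyword_density(article, "blue cars"))  # 2.0
```

Note that a two-word phrase counts as one occurrence but two words, which is why the example pads the article to exactly five hundred words.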

There are several ways to prevent your web page from appearing to be spam to the search engines.  One way is to consider several keywords your site might rank for.  For example, if you have a web page about different colour cars you may try to rank for blue cars, red cars, beige cars, and so on.  Ranking for several different keywords also gives your website the advantage of being able to rank for a number of search engine queries.  Clearly, it is more advantageous to rank highly in the search engine results for multiple queries rather than just one.

Another way to avoid appearing spammy is to consider LSI or latent semantic indexing when writing.  Search engines, in particular Google, have become quite sophisticated in their algorithms and can now detect the content of a page better than ever before.

In the not so distant past, if you typed in the phrase “automobile colours” rather than “car colours”, the website in our example above would not show up on the search engine results page.  Now Google is beginning to recognise that someone typing in the phrase “automobile colours” would probably be interested in a site about car colours as well.  In fact, Google has now got to the point where it expects that any given web page will naturally have LSI integrated into it.  Not including the phrase “automobile colours” on a site about car colours may actually have a negative effect on your SEO. Why?

Because Google wants to provide the most relevant results possible to your search.  The ultimate goal of a search engine is to pick the result a human would select if they were personally trying to find the most relevant website to a given search query.

Humans think with LSI in mind.  People understand that automobile is a synonym for car. They naturally use synonyms when talking about a given topic.  Including an exact phrase too many times appears artificial to Google and therefore less relevant.  Using a lexical database to come up with alternative words people may use to find your website will ultimately prove to be a powerful SEO strategy as the three major search engines become more intelligent and sensitive to keyword density.
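The synonym-swapping idea above can be sketched in a few lines of Python. This uses a tiny hand-made synonym lexicon for illustration; a real workflow would draw on a proper lexical database such as WordNet, and both the `SYNONYMS` table and the `expand_phrase` helper are assumptions of this sketch, not any tool's actual API:

```python
# Tiny illustrative synonym lexicon (a stand-in for a real
# lexical database such as WordNet).
SYNONYMS = {
    "car": ["automobile", "vehicle", "motor"],
    "colour": ["color", "shade", "hue"],
}

def expand_phrase(phrase: str) -> list[str]:
    """Generate alternative phrasings by swapping each known
    word in the phrase for each of its listed synonyms."""
    variants = [phrase]
    for word, alternatives in SYNONYMS.items():
        if word in phrase:
            for alt in alternatives:
                variants.append(phrase.replace(word, alt))
    return variants

print(expand_phrase("car colours"))
# Includes "automobile colours" among the variants.
```

Sprinkling a few of these variants through an article keeps the exact-match density low while still signalling relevance for the whole family of related queries.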

What every website owner, article writer, or blogger should take away from this article is the fact that one should write naturally in order to achieve the best SEO.  Do not overly focus on keywords when writing.  Chances are if you are writing about a topic relevant to your keywords, you will naturally achieve the appropriate balance of keywords and LSI phrases.

If you found this post on SEO keyword density and LSI useful, please share it. I would also love to read your comments, so please add them to the comments section. Thank you very much for your time.

11 Comments to "SEO Keyword Density And Latent Semantic Indexing"

  1. Irina Benoit

    April 16, 2010

    Thank you, Great info!

  2. Goalranks

    April 19, 2010

There are a lot of SEO strategies; it depends on which one you choose, but good SEO is preferable.

  3. eagle09

    May 16, 2010

Thanks Harvey Raybould for your article on SEO Keyword Density And Latent Semantic Indexing. It is really nice.

  4. According to my experience, keyword density still plays a great role in SEO.

  5. Daphne Pomponio

    May 26, 2010

    I don’t endorse software very often but this new service is outstanding. It’s a keyword tool which has a database of millions of keyword phrases showing the adwords traffic count monthly together with the google competition count as well as other data. At a click of a button you will discover phrases with traffic but no competition and I have used it already to get pages and web-sites to the top of the various search engines, even with no backlinks. You can see a video of it in use here –

  6. Tony Hulls

    June 28, 2010

The real problem is that the optimal keyword density varies a lot, depending on the keyword. While 4% may be good for one niche, it may be overkill and get penalized in another. So, before settling on the keyword density that will probably work, it’s essential to analyze competitors who rank well. Nothing is worse than overdoing SEO and then trying to fix it.

    • Harvey Raybould

      June 28, 2010

      Hi Tony,

      You make a very valid point, competitor research is vitally important in any SEO work. I would always recommend a more cautious approach rather than any black hat techniques, as it is easier to ramp things up if needed, rather than trying to get a site un-penalised.



  7. csp

    July 28, 2010

    Thank-you very much

  8. Fraser Scotcher

    September 23, 2010

    Thank you for the information – Keyword Density is certainly a minefield – however when perfected the results can be astonishing.

    • Harvey Raybould

      September 23, 2010

      Hi Fraser,

No problem at all. Yes, I believe choosing the best keywords and optimising your site for those keywords is critical for SEO success.

      All the best



  9. Used Cars

    February 17, 2011

The actual problem is that the optimal keyword density is not constant; it always varies a lot, depending on the respective keyword. Sometimes a density that works for one keyword gets penalised for another!
