
The Reciprocal Consulting Blog

You are Browsing the March 2013 Archive:

More and more, the search engines are using a process called Latent Semantic Indexing for categorizing search results. So what does that mean?

In a nutshell, Latent Semantic Indexing (or LSI) involves analyzing a web page to look for related words and phrases that can be substitutes for each other or help the search engine identify what that page is about. For instance, “car” and “automobile” are two words often used for the same object. If you write a web page about your blue 4-wheel drive automobile, based on the principles of LSI, that page could also rank for search terms that include the word “car” even if “car” doesn’t appear anywhere on the page.
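The classic version of this technique from the information-retrieval literature factors a term-document matrix with a truncated SVD, so that synonyms which co-occur with the same context words land near each other in a reduced “concept” space. Google’s actual implementation is unpublished, so treat this as a sketch of the idea on made-up data, not their algorithm:

```python
import numpy as np

# Toy term-document matrix, illustration only: rows are terms,
# columns are three tiny "pages".
terms = ["car", "automobile", "engine", "plane", "wing"]
A = np.array([
    [1, 0, 0],  # "car" appears only on page 0
    [0, 1, 0],  # "automobile" appears only on page 1
    [1, 1, 0],  # "engine" appears on both car pages
    [0, 0, 1],  # "plane" appears on page 2
    [0, 0, 1],  # "wing" appears on page 2
], dtype=float)

# Truncated SVD projects terms and pages into a small "concept" space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2  # keep the top-2 concepts
fold_in = np.diag(1.0 / s[:k]) @ U[:, :k].T  # maps a query into concept space

def similarity(query_vec, page):
    """Cosine similarity between a query and a page in concept space."""
    q = fold_in @ query_vec
    d = Vt[:k, page]
    return float(q @ d / (np.linalg.norm(q) * np.linalg.norm(d)))

# A search for just "car": page 1 never uses the word, yet it scores
# high because "car" and "automobile" co-occur with "engine".
car_query = np.array([1.0, 0, 0, 0, 0])
```

Page 1 comes back as a near-perfect match for “car” even though the word never appears on it, while the airplane page scores near zero. That is exactly the blue-4-wheel-drive-automobile scenario described above.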

This is an important concept to understand for content writers because it means you can play up these related keywords in your content without harping on them.

In the old days, you counted your keywords and tried to write your web pages with a certain keyword density in mind: the number of times your keyword appeared relative to the total word count of the page. You wanted “automobile” to make up 1% to 5% of the words on the page (i.e. to appear 1-5 times for every 100 words). That’s no longer the case.
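For the record, here’s the arithmetic the old advice was based on. A quick sketch of the metric (the function name and tokenizing regex are my own, purely illustrative):

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` that are `keyword`:
    the old-school metric described above."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)
```

Under the old 1%-5% rule, a 500-word page would aim to use “automobile” somewhere between 5 and 25 times.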

Instead of counting keyword densities, with LSI you can spread your keyword usage around to all related keywords. You might use “automobile,” “car,” and “vehicle” interchangeably throughout your content, which is more like natural writing anyway.

Latent Semantic Indexing is the future of SEO. It means that writers can get back to being writers again instead of keyword managers.

There was a time when content writers got a little concerned that their keyword density was too high. In fact, many early SEOs taught people to ensure their keyword density was somewhere between 1% and 5%. It seems a little silly now, but a lot of us bought into that line.

Here’s the truth: Don’t worry, be happy.

Your keyword density is not the issue. There is no magic percentage that is too high or too low. The idea is to write great content. If you write awesome content that speaks to your audience, is well written, and provides value, then a high keyword density won’t hurt you. You’ll still get your page to rank.

The most important thing to know about web page content is that it must be good content: quality content that adds value to the Internet.

The problem with content that doesn’t meet the quality standards of the search engines is that it is usually deemed as spam. That could mean too many keywords on the page, but if it does, it likely means that the keyword is used in a spammy way that doesn’t provide value to your audience. In that case, the problem isn’t a keyword density that is too high. Rather, the problem is that your content quality is too low.

If you’re not a writer, you’d be better off hiring a content writer. Hire a ghostwriter who understands search engine optimization and will write great content, not one who uses spammy techniques and voodoo SEO.

In light of the Penguin update, you’ve probably been hearing a lot about quality content. In fact, since the first Panda update, every SEO in the world has come out in favor of quality content. It makes you wonder if they were in favor of quality before they got beat down. They certainly weren’t talking about it then.

So why are they talking about it now?

SEOs have always been interested in whatever is going to make their websites rank higher in the search engines. At one time that meant counting keywords and focusing on keyword density. Even after it was evident that keyword densities didn’t work, many SEOs kept advising their clients to count keywords anyway.

Then there was link counting. And anchor text manipulation. Link building became a spam game between SEOs to see who could acquire the most and the best links. Many of them won. Then along came Panda.

Getting boinked isn’t fun. Especially if it costs you money. But if you focus on producing quality content, then you don’t have to worry about getting boinked. And this hasn’t changed. Quality today means the same thing it meant in 1998. The only thing that has changed is that now every SEO on the planet wants to focus on it.

Quality content means writing content that your readers want to read. It means providing useful and valuable information on a topic that is important to your audience. If you can do that, you’ll rank for the right key terms.

A recent algorithm update by Google slapped popular article directory EzineArticles. The site lost traffic and began cleaning up its directory to rid itself of problem articles. Then they introduced some quality guidelines of their own. One of those changes I find rather interesting:

Monitor Keyword Repetition – The use of any one keyword needs to be limited to no more than 2% of the total amount of words in your article. Take the total number of words in your article, and multiply it by 2% (ex. 450 words x .02 = 9 times). This will give you the maximum number of keywords permissible in your article.
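Their arithmetic is easy to reproduce. A tiny helper that applies the stated 2% cap (the function name and the rounding-down choice are mine; EzineArticles only gives the formula):

```python
import math

def max_keyword_uses(word_count, cap=0.02):
    """Maximum times any one keyword may appear under a density cap,
    per the EzineArticles guideline quoted above."""
    return math.floor(word_count * cap)
```

So a 450-word article tops out at nine uses of any one keyword, matching their worked example.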

In other words, keyword density is back.

Hold on a second, Lone Ranger. That’s EzineArticles, not Google. Don’t mistake this quality guideline for a search engine guideline. It is far from being that.

Search engine optimizers gave up on keyword density about five years ago – some of them before that. This guideline is likely EzineArticles’ attempt to control its own spam. They are trying to stay within Google’s good graces. And because of that, you might get the mistaken idea that Google looks at keyword density patterns. Don’t count on it.

I think it’s safe to say that Google does care about excessive keyword usage, but I doubt that they are counting your keyword density. “Too much” is likely a moving target and changes from article to article.

Just thought I’d give you something to think about.

Veteran SEO Stephan Spencer wrote a blog post for Search Engine Land that has sparked a bit of controversy. In this blog post he wrote:

Ok, no one says “da bomb” anymore, but you get the drift. Monitoring keyword density values is pure folly.

A commenter took issue and wrote:

Folly? Hardly. If you’re trying to rank for a keyword, you want to make sure you use it a few times on a page. That’s just common sense. Of course, you don’t want to overuse a keyword, or it might come across as spammy. Any smart SEO pays attention to KW density.

The logic here is a bit spurious. There are two true statements followed by a non-sequitur. Yes, you must use your keyword enough times on a web page for it to matter. And, yes, if you overuse it then you might be tagged as a spammer and your web page de-listed, or diminished in rankings. But that doesn’t mean that keyword density is something you should be counting.

Rand Fishkin of SEOmoz says:

The formula for keyword density – a percentage of the total number of words on the page that are the target phrase – is indeed folly. IR scientists discredited this methodology for relevance decades ago. Early search engines and information retrieval systems already leveraged TF*IDF as a far more accurate and valuable methodology.

The Wikipedia link was added by me.
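For readers wondering what TF*IDF actually computes: a term’s weight is its frequency within a document, discounted by how common the term is across the whole corpus, so distinctive words count for more than ubiquitous ones. A minimal sketch of one common variant on made-up data (real retrieval systems use smoothed and normalized weightings):

```python
import math
from collections import Counter

def tf_idf(term, doc, corpus):
    """Raw term frequency times inverse document frequency."""
    tf = Counter(doc)[term] / len(doc)               # how often in this doc
    df = sum(1 for d in corpus if term in d)         # how many docs contain it
    idf = math.log(len(corpus) / df) if df else 0.0  # rarer term -> larger idf
    return tf * idf

# Toy corpus: "car" is common across documents, "engine" is distinctive.
docs = [
    ["car", "engine", "car"],
    ["plane", "wing"],
    ["car", "road"],
]
```

In the first document, “car” appears twice and “engine” only once, yet “engine” gets the higher weight because it shows up in just one document. That is the relevance insight that flat keyword counting misses.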

Back to keyword density. It’s not important. I’d say there are three keyword factors that are much more important than density:

  • Keyword Placement
  • Semantic Language Relevance
  • Anchor Text

This list is not necessarily in order of importance.

What I mean by placement is where your keywords appear within the page. The Title tag is the most important place for your keyword. The first and last paragraphs are also important. H tags are disputed, but I’d say they are somewhat important. I’ll stop there.

Semantic language relevance is a reference to the use of synonyms within a web page document. If you are writing about fighter planes and you mention Tomcats, Messerschmitts and Skytrains then those words will do more to rank your web page for the term “fighter planes” than using the phrase “fighter planes” with a density of 5% throughout your web page document. Don’t buy the keyword density hype.

Finally, anchor text is undisputed as a major search ranking factor. Use your keyword in your internal anchor text. It’s much more important than keyword density.

I’ll have to agree with Stephan Spencer on this one. Search engine optimization is denigrated by talk of keyword densities.