Tuesday, March 4, 2014

There are a few easily fixable problems that people run into when optimizing their metadata. For example, sometimes metadata is set up to disallow search engines from indexing a site. There are a few legitimate reasons webmasters block search engines, but anyone building a website for SEO purposes should make sure search engines are NOT blocked.

If a website isn't being indexed by search engines, generally the first thing a webmaster should do is make sure the robots meta tag is not blocking them. By default, if a site does NOT have a robots tag, search engines will index it. Many webmasters write a robots tag that essentially says "allow search engines to index this," but this is unnecessary, as search engines will only avoid indexing a site if a robots tag disallows them from doing so. (Or if it's been blacklisted, but that's a whole different issue.)
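Here's a quick sketch of what those tags look like in a page's <head> section (the noindex and index values are standard HTML; the two tags shown are just examples):

    <!-- This tag actively blocks indexing; remove it if you want the page in search results -->
    <meta name="robots" content="noindex, nofollow">

    <!-- This tag is harmless but redundant, since indexing is already the default -->
    <meta name="robots" content="index, follow">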

Another problem that people run into is giving search engines too much information. Sites are allowed a maximum of 10 keywords in their metadata. Here's an example:

Lisa's site has three meta keywords: "Georgia peaches, peaches from Georgia, peach orchards." Gina, who is Lisa's competitor, has 12 keywords: "peaches from Georgia, Georgia peaches, peach orchards, peach orchard, peaches, Georgia, Georgia peaches, Gina's peach stand, juicy peaches, peaches from the south, ripe peaches, fresh peaches."
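In HTML, those keyword tags would look something like this (a sketch using the example values above):

    <!-- Lisa: three focused keywords -->
    <meta name="keywords" content="Georgia peaches, peaches from Georgia, peach orchards">

    <!-- Gina: 12 keywords, over the 10-keyword limit (and including a duplicate) -->
    <meta name="keywords" content="peaches from Georgia, Georgia peaches, peach orchards, peach orchard, peaches, Georgia, Georgia peaches, Gina's peach stand, juicy peaches, peaches from the south, ripe peaches, fresh peaches">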

Because Gina has more than 10 keywords and Lisa has only three, Lisa stands a better chance of doing well in the search engines. The same rule applies to meta descriptions, except web developers are allowed a maximum of 150 characters for a meta description. Remember that it's 150 characters, not 150 words. While this isn't one of the biggest SEO mistakes a web developer can make, it can still negatively affect search engine rankings.
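For comparison, a meta description that stays under the limit might look like this (an illustrative sketch; the wording is made up for Lisa's example site):

    <!-- 83 characters, well under the 150-character limit -->
    <meta name="description" content="Fresh Georgia peaches picked daily at Lisa's family-run peach orchard near Atlanta.">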