SEO Myths: Things that won’t help your site rank higher

Search engine optimization (SEO) is a constantly changing game that requires a great amount of time and effort. It also requires keeping up with Google’s algorithms, which are always in a state of flux and mostly kept secret. Sometimes all of the information floating around out there can be a bit overwhelming, especially when a lot of it consists of outdated or shady methods. Luckily, Google does occasionally speak out about what helps sites rank better in their search listings. And, just as importantly, they’ve also made it clear what won’t affect a site’s ranking – despite what years of myths may say.

Keywords meta tag

At the top of a web page, a series of tags in the head of the document stores different types of information. Back in the day, one of the primary ways to get a website listed at the top of the search results was to use the keywords meta tag to specify a group of words and phrases you wanted the site to rank for. That is no longer the case with Google's algorithm. Years of abuse by spammers have lessened the tag's value, which led Matt Cutts, a Google software engineer, to go on record in 2009 and confirm that Google now ignores the keywords meta tag completely.
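For reference, the tag in question sits in the document's head and looks like this (the keywords shown are just placeholders):

```html
<head>
  <!-- Google's ranking algorithm has ignored this tag since at least 2009 -->
  <meta name="keywords" content="blue widgets, cheap widgets, widget store">
</head>
```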

Description meta tag

Unlike keywords, Google does still pay attention to the description meta tag. It sometimes uses the text there as a site's snippet in the search results. But Google has also said the description meta tag is not factored into rankings. So it's still a good idea to include a description; just don't expect it to play a role in how high the site is positioned in Google's results.
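A minimal example of the tag, with placeholder copy; remember that this text can show up as the search snippet but won't move the page up the results:

```html
<head>
  <!-- May be shown as the snippet in search results; not a ranking factor -->
  <meta name="description" content="Hand-made widgets, shipped anywhere in the world.">
</head>
```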

Robots meta tag

The robots meta tag supports a number of different values that search engines take into consideration when crawling the web and indexing a site's pages. For example, you can instruct Google's crawler not to index a certain page, or tell it not to follow any links on the page. But the robots meta tag is only necessary when you're explicitly telling Google NOT to do something. Proactively telling it to index or follow a page won't have any effect on ranking, because that's already the default behavior.
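The two cases side by side, as a sketch: the first tag actually does something, while the second merely restates the defaults and can be left out entirely:

```html
<!-- Useful: tells crawlers to skip this page and ignore its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Redundant: "index, follow" is already the default behavior -->
<meta name="robots" content="index, follow">
```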

Submitting a sitemap

Submitting an XML sitemap to Google is absolutely a good idea, especially if certain pages are buried deep within a site's structure and hard to find. A sitemap also lets Google know when a page was last modified and how important each one is relative to the others. It's important to note, however, that none of the information in a sitemap will affect the actual ranking of a site's pages. It merely lets Google know the pages exist, which might help them get indexed. Where they are positioned in the index is another story.
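A bare-bones sitemap entry, using the standard sitemaps.org format (the URL and date here are hypothetical); note that lastmod and priority are hints for crawling, not ranking signals:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Helps Google discover a deeply buried page; does not boost its ranking -->
    <loc>https://www.example.com/deeply/buried/page.html</loc>
    <lastmod>2014-06-01</lastmod>
    <priority>0.5</priority>
  </url>
</urlset>
```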

Focusing only on PageRank

PageRank is a proprietary link-analysis system that was a big part of Google in its early days. Since it was exposed as a publicly available number, it offered a measurable form of success that could be tracked and charted over time. The problem with this approach is that Google no longer relies on PageRank nearly as much, if at all, when determining its search results. Listings are based primarily on relevance, which is not a quantifiable value. Plus, a site's public PageRank is almost always out of date and inaccurate, because Google only updates it a few times a year. It's a much better idea to focus on offering great content to your site's visitors rather than chasing down metrics that don't carry much weight.