3 Things You Didn't Know Are Affecting Your SEO Efforts

by Roberto Mejia on February 27, 2013 in Visibility

Even if your site looks like it has been optimized for search engines--with the proper content, keywords, and links--there may be invisible factors hurting your SEO efforts without your knowledge. Three such hidden factors deserve a closer look: URL canonicalization, robot instructions, and page load time.



URL Canonicalization
When a web page has more than one possible URL, a search engine needs to be able to tell which URL it should use in search results. "Canonicalization" refers to the way one version of the URL is designated as "canon," or the standard URL to use.

Each page on your site may have multiple URLs without you realizing it. For instance, "www.example.com" and "example.com" lead to the same page, but are technically two different addresses.

Besides the "www" issue, pages can end up with more than one URL when your site uses relative internal links instead of absolute links. An absolute link includes the full URL, such as "http://www.example.com/contact.html"; a relative link includes only the path, such as "/contact.html." Relative links can create duplicates if:

  • They point to the home page. By linking to "/home.html," the URL "example.com/home.html" is used, in addition to the direct "example.com" home page.
  • There are multiple variations of your domain name. If you use "myexample.com" in addition to "example.com," every page on the site will have two addresses.
Having duplicate URLs is a problem because you are splitting your search relevance; people can link to either version of the address, so the links that should all strengthen one page are divided between two URLs.

You can fix this problem by setting up one version of the URL as "canon" and using a 301 redirect to send every other version to it.
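As a rough sketch, assuming an Apache server with mod_rewrite enabled (the article does not specify a server, so treat this purely as illustration), a few lines in the site's .htaccess file can 301-redirect the bare domain to the "www" version:

    # Hypothetical example: send example.com traffic to www.example.com
    # Requires Apache with mod_rewrite enabled
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The same idea works in reverse if you prefer the non-"www" address; the important thing is to pick one version and redirect everything else to it.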

Robot Instructions
Search engines find and index pages using automated programs known as "spiders" or "robots." There are two hidden files that can be used to give instructions to these robots:
  • "Robots.txt" is used to tell search engines which pages they should not index, generally because the pages are not meant for public viewing. If the robots.txt file is set up incorrectly, though, it can block all of your pages from appearing in search results.
  • "Sitemap.xml" gives instructions that help robots find important pages and index them more efficiently. Though sitemap.xml is helpful to search engines, many sites do not include the file.

Page Load Time
The amount of time it takes to load a web page affects human viewers more than it does search robots. If your site takes a long time to load, many customers may lose patience and quickly leave the site. This high bounce rate is obviously bad for business, but it can also be bad for SEO: it tells search engines that your site was not engaging or relevant to the user, and therefore should not rank as high in search results.

If your load times are slow, you may need to simplify things by cutting down on file sizes or on the number of scripts that must run on each page.
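A quick way to get a baseline number (this uses the standard curl command-line tool, which the article does not mention, so consider it just one option) is to time a fetch of your home page:

    # Prints how long the HTML document itself took to download
    curl -o /dev/null -s -w "Total time: %{time_total}s\n" http://www.example.com/

Keep in mind this only measures the HTML document; images, stylesheets, and scripts add to what visitors actually experience, so a full-page testing tool will give a more complete picture.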

You can find these or other hidden problems by performing an SEO audit on your site. Find and fix these invisible problems, and you can get SEO benefits you can see.

* Image courtesy of FreeDigitalPhotos.net

Roberto Mejia

While specializing in web development and inbound marketing, Roberto Mejia prides himself on always learning and improving as much as possible.