Search engine optimization should begin the day you start thinking about creating a web site. The structure of your site should cater to both your users and the search engines. That being said, URLs are the most fundamental part of your site (content is king, but you have nothing if you have no URLs). So, to begin working on URLs, do your keyword research first. This will come in handy not only for the URLs but will also save you time later, since you will need it for your metadata and content. You will also need to decide how your site structure and URL structure will work, but first I recommend you finish reading this post.
Those setting up sites should be conscious of the effect their URLs can have on search engine rank. It is far easier to do this the right way from the get-go than to go back and change the URLs later in the game. However, if you have a pre-existing site, read on; you can still change your URLs to optimize your site, it's just a little more work. It's important to keep your URL structure consistent throughout the entire site, so you will need to change your whole site, not just a few pages.
Without further ado, let’s start with the basics of URL optimization:
Keep it Short and Sweet
Google's crawlers put less weight on terms to the far right of the domain. In other words, the closer a keyword is to the domain, the more value it is given. It's recommended that keywords appear no further than the third to fifth term to the right of the domain. Also, at least one study has found that shorter URLs are clicked more often than their longer counterparts.
All Keywords Are Not Created Equal
Sure, having keywords in the URL is more beneficial than no keywords at all, and keywords in position five or earlier are better than keywords after position five. Also, keywords in the filename position are better than keywords in a subdirectory name.
Your Personality Should be Dynamic, Your URLs Should Not
Search engines prefer static URLs over dynamic URLs, meaning your URLs should not contain ampersands, equal signs, or question marks. If you must have dynamic URLs, cut the parameters down to what you really need. This goes back to the first point: keep it short and sweet.
Keywords should be separated by hyphens. This makes the URL easier for people to read and helps search engines understand your URL structure, including where each keyword ends and the next begins. Hyphens are best practice over underscores, although that may change in the future. Try, if possible, not to use hyphens in the domain itself.
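To make the idea concrete, here's a minimal Python sketch of turning a page title into a hyphen-separated slug. The title and function name are illustrative, not tied to any particular CMS:

```python
import re

def make_slug(title: str) -> str:
    """Turn a page title into a short, hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(make_slug("URL Optimization: Best Practices!"))  # url-optimization-best-practices
```

Every word boundary becomes a single hyphen, so both people and crawlers can see where one keyword ends and the next begins.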
Sometimes Less is More
As we all know, search engines do not like to be fooled, and there are serious consequences for keyword stuffing. Although you most likely won't be penalized too hard for keyword stuffing your URLs, it's still bad practice. Pick the best 2-5 keywords and use those in the URL as well as throughout the content and metadata. Any more than that starts to look spammy.
Change is Bad
Your URLs should remain unchanged so that search engines can find the pages they have previously indexed. If you need to change the structure, redirect each old URL to its new URL. This ensures that Google and other search engines can transfer your link juice and PageRank without hurting your site.
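As a rough illustration, a redirect layer can be as simple as a lookup table from old paths to new ones. The paths below are hypothetical examples, not anyone's real site:

```python
# Hypothetical old-to-new URL map (example paths only).
REDIRECTS = {
    "/old-products.php?id=7": "/products/blue-widget",
    "/about_us.html": "/about",
}

def resolve(path: str):
    """Return (status, location): 301 if the path was remapped, 200 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to the new URL
    return 200, path

print(resolve("/about_us.html"))  # (301, '/about')
```

The important part is the 301 (permanent) status: that's what tells search engines to pass the old page's value along to the new URL rather than treating it as a brand-new page.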
Don’t Save The Best For Last
Some search engines discount pages with a large directory depth. Directory depth works like this:
www.Domain.com has a directory depth of 0.
www.Domain.com/directory has a directory depth of 1.
www.Domain.com/directory/subdirectory has a directory depth of 2. (Notice there are two slashes in the path.)
There aren't usually indexing issues because of directory depth, but weight can be discounted for pages that are multiple levels down in the directory structure, because some search engines consider the pages nearest the domain to be the most important pages on the site. Generally, keep your content no more than two or three subfolders deep and separate multiple keywords with hyphens.
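Directory depth is easy to compute: it's just the number of path segments after the domain. A quick Python sketch, using example URLs:

```python
from urllib.parse import urlparse

def directory_depth(url: str) -> int:
    """Count the path segments below the domain."""
    path = urlparse(url).path
    return len([seg for seg in path.split("/") if seg])  # ignore empty segments

print(directory_depth("http://www.domain.com/"))                        # 0
print(directory_depth("http://www.domain.com/directory"))               # 1
print(directory_depth("http://www.domain.com/directory/subdirectory"))  # 2
```

A check like this can be run over a sitemap to flag pages buried deeper than the two-to-three-level guideline above.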
On that same note, directory names should be created for the search engines and for users. As previously discussed, directory names should contain keywords so that search engines and humans can identify what the page is about.
To ‘www’ Or Not to ‘www’? That is The Question.
It's common practice for people to create multiple URLs for one web page, e.g. both domain.com and www.domain.com, to make sure you are catching all traffic. But beware: search engines treat each URL as a unique site and, thus, view this as duplicate content.
So you pick one URL; then what happens to the rest of the traffic? The best practice to combat this issue is to create a permanent 301 redirect that forwards your home page URL to the preferred domain. You can also use Google Webmaster Tools to set a preference for the www or non-www version.
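Host canonicalization can be sketched in a few lines of Python. The preferred host below is an assumed example; in practice the redirect itself would live in your server or CMS configuration:

```python
from urllib.parse import urlparse, urlunparse

PREFERRED_HOST = "www.domain.com"  # assumed preferred version for this example

def canonical(url: str):
    """Return (needs_redirect, canonical_url), normalizing the host to the preferred one."""
    parts = urlparse(url)
    if parts.netloc.lower() != PREFERRED_HOST:
        # A 301 redirect would send visitors (and crawlers) here.
        return True, urlunparse(parts._replace(netloc=PREFERRED_HOST))
    return False, url

print(canonical("http://domain.com/page"))  # (True, 'http://www.domain.com/page')
```

Either version (www or non-www) is fine; what matters is picking one and redirecting everything else to it so the search engines see a single page, not duplicates.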
Ditch the I.D.
Each time the spider comes to a URL with a session I.D., it finds a different URL. As we just discussed, spiders view each URL as a unique page, so a URL with a unique session I.D. appears to be a duplicate page, and Google assigns it zero importance. Then, when the Googlebot tries to crawl the original page, it receives an error. Plus, everyone linking to your page will be linking to a different URL, which means your inbound links will be scattered across different URLs and will not improve your PageRank.
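One common fix is to strip session-ID parameters when generating or canonicalizing links. The parameter names below are just common examples, not an exhaustive list:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Parameter names often used for session IDs (illustrative, not exhaustive).
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def strip_session_id(url: str) -> str:
    """Drop session-ID query parameters so every visitor sees one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_session_id("http://www.domain.com/page?sid=ab12&color=red"))
# http://www.domain.com/page?color=red
```

With the session ID gone, every crawl and every inbound link points at the same URL, so the page's value isn't split across dozens of duplicates.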
If you follow all of these tips, you should have a well-optimized foundation for your site. Next, I'd suggest creating metadata and writing content for each page, remembering to use the keywords you chose for your URLs. With the keyword research out of the way, you've paved the way for a speedy optimization process!
Amy is an Account Executive at Hanapin Marketing, a search engine marketing firm focused on generating results through PPC and SEO.