SEO & ASP.NET: Put keywords in the URL

See the first post in this series | Previous Post

Tip #3 – Put keywords in the URL, the sooner the better.

There are three documented areas where Google looks for keywords: the URL, the title, and finally the body of your content. So if you are not embedding keywords into your URL, you are missing an opportunity to increase the odds of your content getting pushed higher in natural search results.

How does Google find keywords in the URL?

Take the following 4 URLs (I’m assuming you have read the previous article about how to construct your links):


There are four keywords in total: ‘seo’, ‘for’, ‘aspnet’, and ‘developers’.

Which format is better? Or does it even matter? It should come as no surprise that the first two examples are virtually identical: in neither case, as far as my research has shown, does Google use any casing information to pull out keywords. So the first two examples are poor choices for formatting your URL if you care about SEO. In the third example underscores are used to break up the keywords, and in the fourth example dashes are used.

Google has stated that the preferred way to break up keywords in the URL is to use dashes, and most modern content management systems and blog engines have adopted this as their default.
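As a rough illustration of the dash-separated format, here is a minimal slug builder. This is a Python sketch of the general idea, not the code Community Server or Graffiti actually uses, and the exact normalization rules (what happens to dots, underscores, etc.) are my own assumptions:

```python
import re

def slugify(title):
    """Build a lowercase, dash-separated URL slug from a post title.

    Hypothetical rules: lowercase everything, collapse any run of
    non-alphanumeric characters (spaces, dots, underscores) into a
    single dash, and trim leading/trailing dashes.
    """
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("SEO for ASP.NET Developers"))  # seo-for-asp-net-developers
```

Note that underscores and camel casing both disappear under these rules, which is the point: the published URL ends up in the dash-separated form Google prefers.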

The ordering of keywords matters

Furthermore, my research has shown that the order of your keywords matters, and that the domain name is considered for keywords too. In Community Server and Graffiti we automatically build the title and URL of posts from the subject of the post entry. Community Server goes one step further and lets you control the URL independently of the subject; we’ll add this functionality to Graffiti soon as well, because controlling the ordering of keywords in the URL matters too.

URL Rewriting

If you care about how your links are built, want to ensure there is only one way to reach your content, and also care about the ordering of keywords, then you likely care a great deal about URL rewriting: the ability to publish a URL that may not be the same URL the application requires internally.

URL rewriting allows you to take a URL like:


and publish it as:


There are several different techniques for URL rewriting for ASP.NET and this blog post is certainly not going to attempt to address them all.
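As a sketch of the core idea (not how any particular ASP.NET engine implements it), a rewriter is essentially a rule table mapping published URL shapes onto the internal URLs the application understands. The URL patterns and query-string names below are assumptions for illustration only:

```python
import re

# Hypothetical rewrite table: published pattern -> internal URL template.
REWRITE_RULES = [
    (re.compile(r"^/(?P<category>[a-z0-9-]+)/(?P<slug>[a-z0-9-]+)/$"),
     "/post.aspx?category={category}&slug={slug}"),
]

def rewrite(published_path):
    """Map a published (friendly) URL onto the internal URL the app expects."""
    for pattern, template in REWRITE_RULES:
        match = pattern.match(published_path)
        if match:
            return template.format(**match.groupdict())
    return published_path  # no rule matched; pass the request through unchanged

print(rewrite("/seo/aspnet-seo-optimization-with-url-rewriting/"))
# /post.aspx?category=seo&slug=aspnet-seo-optimization-with-url-rewriting
```

The visitor (and the crawler) only ever sees the clean published URL; the ugly query-string version exists only inside the application.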

Simple URL Rewriting

The first technique for URL rewriting is very simple and essentially tries to take advantage of (game) the way a crawler parses paths. It uses a controller through which all requests are sent, and it works best for hosted scenarios where you do not have the ability to run an ISAPI filter to rewrite URLs (or access to IIS 7):

— or —

The latter technique is how Community Server, as well as other .NET blogging engines, constructs URLs. The first technique is something new that we’ve been experimenting with; we tried it first with the Telligent Wiki prototype (and a few other sites).

In the first example there is an HttpHandler that looks for all requests that use the .axd extension (any extension will work). The handler parses the path of the request but only cares about the identifier (in this example the key is 33), which allows it to pull the content from the database.

The second case again uses a virtual handler, but loads the content based on the name of the post.
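The parsing step in both techniques can be sketched as follows. This is plain Python standing in for the ASP.NET handlers, and the path shapes are assumed (the original example URLs are not shown above); the handler only needs to recover either the numeric key or the post name:

```python
def parse_id_path(path):
    """First technique: the handler only cares about the numeric key.

    Assumed path shape: /<id>/<anything>.axd, e.g. /33/seo-for-aspnet-developers.axd
    """
    segments = [s for s in path.split("/") if s]
    for segment in segments:
        if segment.isdigit():
            return int(segment)  # e.g. 33, used to fetch the content from the database
    return None  # no key found in the path

def parse_name_path(path):
    """Second technique: load the content by post name instead of a numeric key."""
    last = [s for s in path.split("/") if s][-1]
    return last[:-len(".axd")] if last.endswith(".axd") else last

print(parse_id_path("/33/seo-for-aspnet-developers.axd"))      # 33
print(parse_name_path("/blog/seo-for-aspnet-developers.axd"))  # seo-for-aspnet-developers
```

Either way, everything after the key (or name) is ignored by the handler, which is what lets the rest of the path carry keywords purely for the crawler’s benefit.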

The difference in benefit between the two techniques is unclear. However, I suspect the first example, where slashes are used for paths, will likely work better for SEO purposes. But that is conjecture.

Advanced URL Rewriting

If you have control over the server or have a more progressive host there are some other options to consider for more advanced URL rewriting.

In Graffiti we actually create files and directories to give users full control over the path, versus the virtual URLs used in Community Server. This has both benefits and pitfalls: the benefit is very clean paths with no extensions in them; the pitfall is that it requires permission to write to the disk.

For example, a post titled “ASP.NET SEO Optimization with URL Rewriting” in a category called “SEO” would create:

  • [path to Graffiti application]\seo\aspnet-seo-optimization-with-url-rewriting\default.aspx

The URL would then be published as:


This obviously works very, very well. The default.aspx page internally can store all the details, such as the post id, to quickly look up the post in the database.
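The file-based approach boils down to the sketch below. The slug rules and the root path are hypothetical (Graffiti’s actual normalization may differ), but it shows how one category plus one title yields both the on-disk location and the extension-free published URL:

```python
import os
import re

def graffiti_paths(app_root, category, title):
    """Sketch of the file-based approach: category + slugified title become a
    real directory containing default.aspx, while the published URL drops
    both the file name and the extension.

    Slug rules are assumptions: lowercase, non-alphanumerics collapsed to dashes.
    """
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    disk_path = os.path.join(app_root, category.lower(), slug, "default.aspx")
    published_url = "/%s/%s/" % (category.lower(), slug)
    return disk_path, published_url

disk, url = graffiti_paths("/var/www/graffiti", "SEO",
                           "ASP.NET SEO Optimization with URL Rewriting")
print(url)  # /seo/asp-net-seo-optimization-with-url-rewriting/
```

Because default.aspx really exists on disk, the web server resolves the clean directory URL with no rewriting at all; the page itself stores the post id and looks the post up in the database.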

Another option is to use a URL rewriting library like ISAPI Rewrite, which works very similarly to Apache’s mod_rewrite. It is an ISAPI filter ($99 well spent) that allows you to fully control all URLs for your application.

The future: IIS7

Unlike previous versions of Microsoft’s Internet Information Server, IIS7 will allow for ASP.NET HttpModules to perform exactly the same tasks as ISAPI Filters. This means that you could write an HttpModule for handling all your application’s URLs (similar to ISAPI Rewrite) all with .NET code!

Furthermore, this also means you can do things like use ASP.NET cookie authentication to control access to any resource (images, HTML pages, etc.), something that isn’t easily accomplished today. Taken one step further, it also means you could have .NET code authenticate requests served from a PHP application running in IIS!

If you want to read more about URL rewriting for ASP.NET, check out Scott Guthrie’s article.

Next Tip: Titles & MetaTags (not yet written)


  1. Rob, you deserve applause for these great posts on SEO! I’m very happy that Telligent puts a lot of effort into making Community Server SEO-friendly.

    Do you have plans to improve SEO in the Forum application of Community Server too?

  2. Is this still an issue in Community Server’s SEO?

  3. How much difference do friendly URLs *really* make? One or two places in your SERPs at best?

    I’m not going to try and deny the value of keywords in the URL, but I do find that some of the more obscure technical mechanics can sometimes overshadow the more important meat and drink of good content for SEO!

  4. Rob, to bump an old thread: I would be happy to talk with you about some of the intricacies in CS 2008 that could be improved in regard to url structure and seo implications. Content is king, but the right foundation plays a role. Google’s recent adoption of the canonical url meta tag gives some serious ability to make improvements with marginal modification/effort. I worry about doing too much hacking on an install and what that does to future upgrades.

  5. Regarding the comment about the extent to which keywords affect ranking – even if the introduction of keywords doesn’t change rank, a visitor will click on the link he feels is most relevant, even if it’s not ranked at the top. Adding the keyword to a URL can improve the perception of a page’s relevance over a similar page that has a big, ugly querystring. We have observed this to be true over the last three years on a 15K page site.

  6. yes absolutely right we have to do some URL rewriting to make website more search engine friendly

  7. Hi Rob,

    This is great reading. Just a shame we didn’t see any of this in release 5.0! Or did we?