I just delivered a small .NET 3.5 application to a customer in compiled form and told him to install it on IIS. The company is one I've dealt with on a few occasions: they're a .NET-aware shop where I deal with developers and a .NET-familiar IT department. But it wasn't long after I sent my email that I got a call back from the customer, who - slightly embarrassed - mentioned that he couldn't figure out how to 'turn on' .NET 3.5 on his IIS 6 Web Server.

This is not the first time this has happened: it's slightly confusing given the funky version numbering the .NET framework has gone through with versions 3.0 and 3.5.

The not-so-obvious 'problem' is that if you fire up a machine that has .NET 3.5 installed, you might be surprised to find that the ASP.NET tab in the IIS management console doesn't show an option to select a 3.5 runtime.

Instead you get the .NET runtime dropdown that looks like the one shown in the figure:

[Figure: IIS6NoNet35 - the IIS 6 ASP.NET runtime version dropdown, with no 3.5 option]

Where's my .NET 3.5, dude?

IIS 6 (shown above) and prior versions tie the .NET runtime to a specific Virtual Directory or 'Application', which is problematic if you end up with more than one version of the runtime in a given Application Pool. Because IIS 6 configures the .NET runtime at the virtual directory level, it's possible for two virtuals in the same Application Pool to use different runtime versions - and if they do, the one loading last will fail.
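If you're scripting the server setup rather than clicking through the MMC dropdown, the same per-virtual assignment can be done with aspnet_regiis.exe from the framework folder of the runtime you want. A sketch - the W3SVC metabase path and 'MyApp' virtual name are placeholders for your own site:

```
REM Assign the 2.0 runtime's scriptmaps to a single virtual/application
REM (run the copy of aspnet_regiis from the version you want to assign)
cd %windir%\Microsoft.NET\Framework\v2.0.50727
aspnet_regiis.exe -s W3SVC/1/ROOT/MyApp

REM List the installed ASP.NET versions and their status
aspnet_regiis.exe -lv
```

Notice there's no aspnet_regiis.exe under a v3.0 or v3.5 framework folder to run - another hint that 2.0 is the only actual runtime in play.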

On IIS 7 the runtime configuration is tied to an IIS Application Pool rather than the Virtual/Web Application:

[Figure: II7Versions - the IIS 7 Application Pool runtime version selection]

which avoids the above problem of multiple runtimes hosted in the same Application Pool, since the pool pre-loads a single runtime version at startup. But on IIS 7, too, you won't see a .NET 3.5 runtime selection.
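On IIS 7 the equivalent knob is the Application Pool's managedRuntimeVersion setting, which you can also drive from the command line with appcmd. A sketch, assuming a pool named 'MyAppPool' - note that the valid values are core runtime versions like v1.1 and v2.0, not v3.5:

```
REM Set the pool's runtime - v2.0 covers .NET 2.0, 3.0 and 3.5 applications
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /managedRuntimeVersion:v2.0

REM List pools along with their configured runtime versions
%windir%\system32\inetsrv\appcmd list apppool
```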

It's all 2.0

So, no, the customer didn't do anything wrong during installation of .NET 3.5. In fact, that'd be hard to do, given that .NET 3.5 installs .NET 2.0, 3.0 and 3.5 in one pass (which also accounts for the much, much bigger ~120 MB footprint of the 3.5 runtime install!).

The key to understanding why .NET 3.0 or 3.5 aren't showing up is that both of those versions run on the core .NET 2.0 runtime. So the core runtime is still .NET 2.0 (alongside 1.0 and 1.1, the other core runtime versions), while .NET 3.0 and .NET 3.5 are essentially library updates.
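You can see this split in the web.config that Visual Studio 2008 generates for a .NET 3.5 Web application: nothing about the runtime changes, and 3.5 comes in purely as a set of assembly references versioned 3.5.0.0. A trimmed sketch, showing just a couple of the referenced assemblies:

```xml
<configuration>
  <system.web>
    <compilation debug="false">
      <assemblies>
        <!-- .NET 3.5 'runtime' = extra class libraries versioned 3.5.0.0 -->
        <add assembly="System.Core, Version=3.5.0.0, Culture=neutral,
                       PublicKeyToken=B77A5C561934E089"/>
        <add assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral,
                       PublicKeyToken=31BF3856AD364E35"/>
      </assemblies>
    </compilation>
  </system.web>
</configuration>
```

The runtime that loads all of this is still 2.0.50727 - the 3.5-versioned assemblies are ordinary libraries compiled against it.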

You can verify this for yourself: run a .NET 3.5 application on your machine and echo the runtime version from inside an ASP.NET page:

<%= System.Environment.Version %>

which on my machine with .NET 3.5 installed shows:

2.0.50727.1434

So you can see that the .NET 2.0 runtime is indeed what's driving the show. .NET 3.5 is merely a set of additional system libraries that extend the 2.0 runtime - plus a bunch of tools and infrastructure, but all built on the premise of the 2.0 version of the runtime. In theory you could take the new DLLs in the .NET 3.5 framework and distribute them with your application without installing .NET 3.5. In theory... in practice this is probably not a good idea, as certain pieces of .NET 3.5 require installation and system component support. But it demonstrates the point.

No Problem - or is it?

The version numbering certainly is confusing, and while it's probably nothing new to most .NET developers who keep up with the latest frameworks and news, it's an easy thing to miss if you're new or too busy to follow Microsoft's latest follies in naming and versioning. Certainly if you're just starting out coding with .NET 3.5, without having followed the versioning history of .NET, you're not likely to know that .NET 3.5 is not an actual runtime version, but essentially a library revision.

This is especially true for IT folks, who are even less likely to know about the funky nuances of .NET versioning. It's one of those issues you run into once and remember from then on, but the first time around it can still be a headscratcher that wastes a few minutes of time.