Apologies in advance for the long post:

Here's my situation:

I'm working on the development of an ASP.NET website that functions essentially as a wrapper for data. The content that appears on a site is determined by its URL, and the way the content is presented is the same across all sites. We have a single code-base and several websites in IIS, all pointing to that same code-base.

The basic site makes heavy use of the Application[] collection, so we need a separate IIS application instance for each site. Therein lies the problem: whenever a major change is made to the site, such as an update to the code files in App_Code, every site that references the code-base begins recompiling it. Even on a multi-processor server this causes severe downtime when an update is made; with 30-50 sites, it can take up to 15 minutes before the sites start responding in a timely manner again.
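For reference, these are the web.config compilation settings I've been looking at while investigating this. The `optimizeCompilations` flag is an assumption on my part, since as far as I know it only applies on .NET 3.5 SP1 or later:

```xml
<!-- web.config, inside <system.web> -->
<!-- batch="true" is the default and compiles whole directories in one pass;
     optimizeCompilations="true" (.NET 3.5 SP1+) is supposed to skip
     recompiling pages that are unaffected by a change to App_Code etc. -->
<compilation batch="true" optimizeCompilations="true" />
```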

I have looked into this issue elsewhere, and the suggestion I've had is to switch the sites over to a pre-compiled binary setup. I have tried this; however, the performance boost is marginal at best. Even though the servers no longer need to compile the code, a shadow copy of the binaries is made for every site using that code-base. In my situation, that means upwards of 30-50 shadow copies of the original binary code-base, and each set of DLLs totals roughly 24 MB.
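For what it's worth, the only switch I've found that touches the shadow-copy behaviour is the one below. I haven't tried it yet, so treat it as a guess rather than a recommendation, and note the trade-off in the comment:

```xml
<!-- web.config or machine.config, inside <system.web> -->
<!-- Stops ASP.NET from shadow-copying the bin DLLs per application, which
     would avoid the 30-50 duplicate copies. The catch: the DLLs are then
     locked while the app runs, so an in-place update forces a recycle. -->
<hostingEnvironment shadowCopyBinAssemblies="false" />
```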

I've also noticed that shutting off the container for the websites (so they serve a "service unavailable" message) and then making the updates reduces the compile time by about 15-20%, but once you account for the time the sites are offline while the initial copy is made, overall performance is pretty much the same.
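In case it clarifies what I mean by shutting off the container: on ASP.NET 2.0+ the same effect can be had per-site by dropping an app_offline.htm file into the site root, which unloads the app domain and serves that file until it is removed. A sketch of the sequence (all paths here are made up for illustration):

```shell
# Stand-ins for the IIS site root and the precompiled build output.
SITE="${TMPDIR:-/tmp}/example_site"
BUILD="${TMPDIR:-/tmp}/example_build"

mkdir -p "$SITE/bin" "$BUILD"
echo "fake assembly" > "$BUILD/Site.dll"   # pretend precompiled DLL

# ASP.NET unloads the app and serves this file to all requests.
echo "Down for maintenance" > "$SITE/app_offline.htm"

# Swap the binaries while nothing holds a lock on them.
cp "$BUILD"/*.dll "$SITE/bin/"

# Removing the file lets the next request spin the app back up.
rm "$SITE/app_offline.htm"
```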

Does anyone know of an option, either in IIS or in ASP.NET itself, that can prevent this downtime?

Does anyone have any suggestions regarding coding practices that could avoid such problems in the future?