We have an ASP.NET application (built for .NET 3.5 SP1), compiled in Visual Studio 2008 using the "Any CPU" platform option. It is currently hosted on a Windows Server 2003 (32-bit) IIS6 environment (virtual server) and connects to a SQL Server 2008 (64-bit) database server. The current application server runs two Xeon E5520 CPUs @ 2.27 GHz with 4 GB RAM.
With this current setup, the application performs as well as it should. Recently I set up a new virtual server running Windows Server 2008 R2 (64-bit) and IIS7, with two Xeon E5530 CPUs @ 2.4 GHz and 6 GB RAM. I have deployed our existing .NET application on this new server, still connecting to the same database server.
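One difference between the two servers that I wondered about (an assumption on my part, not something I've confirmed is the cause): on the old 32-bit box the app necessarily ran as a 32-bit process, whereas on the new 64-bit server IIS7 will run an "Any CPU" build as a 64-bit worker process unless the application pool says otherwise. The relevant setting in `applicationHost.config` looks like this (pool name is just an example, not our actual configuration):

```xml
<!-- %windir%\System32\inetsrv\config\applicationHost.config -->
<applicationPools>
    <!-- enable32BitAppOnWin64="true" runs the pool in a 32-bit
         w3wp.exe, matching the old 32-bit IIS6 environment -->
    <add name="DefaultAppPool" enable32BitAppOnWin64="true" />
</applicationPools>
```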
Unfortunately, for reasons beyond my understanding, our application performs really poorly on this new server, even though on paper its specs should outperform the old one. Pages seem to take twice as long to load (possibly the database queries are taking longer?).
Could anyone provide any insight into why this might be? Our networking guys insist the new server is set up exactly the same as the old one, so I can't see it being a connectivity issue, e.g. the application server being unable to reach the SQL Server on the same ports as the old 32-bit server. All very strange.
Cheers
Greg