It was more than ten years ago, but I still remember vividly the excitement of receiving the beta of Windows NT 3.1 and installing it. One of the more amazing things I discovered as I played with that release was perfmon.exe – that incredibly cool performance monitor which enabled me to look closely into the performance of the operating system and the applications running on it. Microsoft has enhanced this tool considerably over the intervening years, and today Performance Monitor is one of the key tools you use to do hard-core performance monitoring of a Windows system. The introduction of the .NET Framework has had some impact both on performance and on performance monitoring. There are some new tools for the IT pro to use, but much of the focus on performance has been aimed at developers, to help them write code that performs well. I could spend days talking about the performance of both Windows and Windows applications, and indeed I have spent days talking about performance. But no matter how long I witter on, the fundamentals of performance analysis have remained largely unchanged for decades.
As all PC Pro readers should know, every computer system depends upon four primary resources: CPU, disk, memory and the network. Any given application, whether .NET or Win32, Linux, Unix or Mac, uses more or less of each of these resources. Generalising about performance is hard, so I believe it’s important to understand the profile of your application or system in terms of overall resource usage before attempting to tune it. For each system, you’ll find that one of these four resources is the limiting factor, the bottleneck. While the specific nature of this bottleneck will vary with the application type, every application has a bottleneck and each bottleneck invites a series of potential solutions. However, the moment you unclog one bottleneck another will appear – if you add a faster CPU to a CPU-bound system, it might then have enough CPU power but will now be constrained by disk performance. The key things to remember are that every system has a bottleneck, and that fixing one just creates another, so you have to know when to stop – otherwise you’ll spend all your time just pushing the bottleneck around from place to place.
Whenever you talk about performance, you start from these four key system areas: CPU, disk, memory and network. While .NET introduces some new issues, you always start with the four basics. Use Task Manager to get a simple view of your computer’s performance, or use Performance Monitor for a richer view. If you’re particularly hard-core, you might also use WMI and write your own scripts in VBScript, MSH or C#/VB.NET to retrieve the appropriate counters. Vista will be introducing a new performance monitor application that I’ll cover in a later article.
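To give a flavour of scripting against performance counters without a full WMI example, here is a minimal sketch in Python that parses the CSV output produced by Windows' typeperf command-line tool (the sample data, machine name and timestamps are invented for illustration; in practice you would capture the output of typeperf itself, or query the counters directly via WMI):

```python
import csv
import io

def average_counter(typeperf_csv: str) -> float:
    """Parse typeperf-style CSV output (one timestamp and one counter
    value per row) and return the mean of the counter values."""
    reader = csv.reader(io.StringIO(typeperf_csv.strip()))
    next(reader)  # skip the header row that names the counter
    values = [float(row[1]) for row in reader if len(row) > 1]
    return sum(values) / len(values)

# Hypothetical sample, in the shape produced by a command such as:
#   typeperf "\Processor(_Total)\% Processor Time" -sc 3
sample = r'''"(PDH-CSV 4.0)","\\MYPC\Processor(_Total)\% Processor Time"
"04/05/2006 10:00:01","12.5"
"04/05/2006 10:00:02","37.5"
"04/05/2006 10:00:03","25.0"'''

print(average_counter(sample))  # 25.0
```

The same idea scales to any counter Performance Monitor exposes; only the counter path in the typeperf command changes.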
For the most part, generic application performance won’t be much affected by introducing .NET. For example, an application doing any sort of mathematical modelling is likely to be CPU-bound, whether it runs under .NET or not, while a web application may be short of network bandwidth to communicate with processes residing on other computers such as back-end databases. On the other hand, an application that boils some vast database down to create simple summary reports may well be disk-bound instead. And so on; at this level, tuning your application is much the same under .NET as under Win32.
Doing More With Less
A key benefit of .NET is that you write less code to deliver your application, since the .NET Framework takes care of lots of common functions. The CLR also provides the designer and developer with a number of features specifically intended to help applications scale well, such as thread pools and database connection pooling.
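The pooling idea is the same in any runtime: a fixed set of worker threads services many short tasks, so the application never pays thread-creation cost per request. Here is a minimal sketch of that pattern in Python using the standard library’s ThreadPoolExecutor (this illustrates the general technique, not the CLR’s own thread pool; handle_request is a made-up stand-in for per-request work):

```python
from concurrent.futures import ThreadPoolExecutor

def handle_request(n: int) -> int:
    # Stand-in for per-request work, e.g. a database query.
    return n * n

# A fixed pool of four worker threads services eight tasks;
# threads are created once and reused, not spawned per task.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(handle_request, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Database connection pooling follows the same shape, with expensive-to-open connections taking the place of threads.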