Avalon on XP

A few months ago, I mentioned Microsoft’s climb-down over the forthcoming Longhorn release of Windows. In essence, Redmond woke up to the startlingly simple fact that almost no corporate customers were likely to upgrade their desktops to Longhorn in any timescale before 2010. That’s because XP does a perfectly good job today, there’s plenty of pressure to get server-side work completed first, and all the new promises for Longhorn amount to not a lot unless the server-side infrastructure is in place to make it rock and roll.

The recent announcement that Microsoft will basically back-port two of the most important components of Longhorn to XP means that the target has now changed again. By porting the Avalon 3D compositing graphics engine and the Indigo Web Services onto XP and Windows Server 2003, Microsoft now has a chance to bring this technology to the marketplace in a way that has some hope of actually being deployed and used. It therefore became a key milestone to see at what point some of this Longhorn technology would be released onto the XP platform, even in an early alpha or beta prototype form.

Well, I’m pleased to say that it hasn’t taken long for the Avalon 3D graphics group to come up with a release of its technology for XP. The Avalon Community Preview (ACP) is a long way from a production release – indeed, it’s quite a long way even from beta quality, since it’s still under considerable development. There are still lots of holes to be filled, but the Avalon team isn’t over-inflating our expectations and is being quite open about what we should expect from this release.

Getting ACP installed and running was far less hassle than I thought it would be. Given that it was quite likely that this development environment might nuke any XP machine onto which it was installed, I chose a suitable victim computer running XP SP 2. I downloaded the current beta build of Visual C# Express Edition and installed that. I then downloaded the ACP code and installed that onto the system. Installation was in two parts – the ACP system itself and the WinFX SDK (Software Development Kit).

At this point, I was ready to try out some source code, so I downloaded a demo package from the web – from blogs.msdn.com/danlehen – which is the good old bouncy ball running in a 3D-rendered space, complete with reflections. To be absolutely honest, I never really expected that the code would compile and run. When you start mixing beta development tools with pre-alpha 3D graphics engines, your expectation of success needs to be right down there at the sub-minuscule level.

But in fact, it ran… It took a few seconds for the Avalon engine to come to life, but then up popped the window and the ball bounced around in glorious rendered 3D space. It could have been a little smoother and faster, but given that this is completely untuned code, I was just grateful that something, anything, worked. Obviously, a bouncing ball is nothing to get too excited about – but I’m now really looking forward to new-generation user interfaces that will make appropriate (and hopefully tasteful) use of this technology.
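
If you’re curious what driving Avalon from code actually looks like, the sketch below shows the general shape of a minimal 3D scene – not Dan Lehenbauer’s bouncing ball, just a single lit triangle in a Viewport3D. One caveat: I’ve written it against the 3D class names that eventually shipped in the System.Windows.Media.Media3D namespace, and the API surface in the Community Preview builds differed in places, so treat it as an illustration of the idea rather than code guaranteed to compile against the ACP as-is.

using System;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Media.Media3D;

// A minimal Avalon/WPF 3D scene: one triangle, one light, one camera.
class MinimalScene
{
    [STAThread]
    static void Main()
    {
        // Camera placed a few units back along the z-axis, looking at the origin.
        PerspectiveCamera camera = new PerspectiveCamera(
            new Point3D(0, 0, 4),    // position
            new Vector3D(0, 0, -1),  // look direction
            new Vector3D(0, 1, 0),   // up direction
            45);                     // field of view in degrees

        // A single triangle of mesh geometry.
        MeshGeometry3D mesh = new MeshGeometry3D();
        mesh.Positions.Add(new Point3D(-1, -1, 0));
        mesh.Positions.Add(new Point3D(1, -1, 0));
        mesh.Positions.Add(new Point3D(0, 1, 0));
        mesh.TriangleIndices.Add(0);
        mesh.TriangleIndices.Add(1);
        mesh.TriangleIndices.Add(2);

        // Wrap the mesh in a model with a simple diffuse material,
        // and add a directional light so the material shows up.
        Model3DGroup group = new Model3DGroup();
        group.Children.Add(new DirectionalLight(Colors.White, new Vector3D(0, 0, -1)));
        group.Children.Add(new GeometryModel3D(mesh, new DiffuseMaterial(Brushes.OrangeRed)));

        ModelVisual3D visual = new ModelVisual3D();
        visual.Content = group;

        Viewport3D viewport = new Viewport3D();
        viewport.Camera = camera;
        viewport.Children.Add(visual);

        Window window = new Window();
        window.Title = "Minimal Avalon 3D scene";
        window.Width = 400;
        window.Height = 400;
        window.Content = viewport;

        new Application().Run(window);
    }
}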

It’s going to be fascinating watching how this progresses over the coming months as we move towards a first beta build of Longhorn, and also to see how portable the code is between the Longhorn and XP platforms. Still, for the time being, it’s clear that Redmond’s development teams are delivering on their promises, and for that they are to be applauded.
Jet and E12

A few months ago, I had a rant here about the fact that Microsoft’s storage strategy was in tatters: the delays to WinFS, the unknown status of WinFS for Servers, and the acknowledgement that the forthcoming Data Protection Server is just fine provided your definition of ‘data’ is limited to stuff held in the file system – it will have no idea about SQL Server or Exchange Server data in its first release. Well, now there’s a final bombshell.

At the recent IT Forum held in Copenhagen, I had a series of meetings with some trusted senior Microsoft people – I’ll keep them anonymous to spare their blushes – at which I managed, through a process of digging and asking nasty questions, to establish a few things about the next version of Exchange Server. First, it’s no longer codenamed Kodiak – its new name is E12. Second, although it’s been developed to run on top of the forthcoming SQL Server 2005 ‘Yukon’ release, it’s also been developed on top of the existing Exchange Server ‘Jet’ engine. And as far as I can tell, the decision has been made to release E12 for the Jet engine, not for SQL Server.

There are several reasons for this. First, the Exchange Server team wants to have a beta product out in March with a product release later in the year. Given that, the ongoing delays to SQL Server make it very risky to bring such a huge product as Exchange Server to market on a brand-new database engine. This is a perfectly logical point of view, of course. In addition, we’re not sure about the performance of SQL Server when hosting Exchange Server-style data. I’m told by one source that in the benchmarking tests conducted last April, Jet was still ahead of SQL Server in terms of performance. Another source told me that this was only to be expected, because at that point the performance testing had mostly been focused on the RDBMS parts of the SQL engine, and that considerable tuning had taken place since then on the semi-structured data parts, which include the very areas that Exchange Server would use heavily.

I’m not sure what to make of all this. On the one hand I can see lots of good reasons why the Exchange Server team would prefer to take a careful migration approach, and to ensure that E12’s engineering time and effort is focused on the immediate needs of the Exchange Server user community. Given that point of view, sticking with Jet for one more revision is the sensible, considered solution. On the other hand, I can’t disguise my disappointment. First of all, it blows a hole through what little remains of Microsoft’s long-term storage technology strategy. Second, it means it will be another two years or so before we can really make use of the Exchange Server store in a distributed and joined-up information-mining environment. Worst of all, it clearly shows that Microsoft doesn’t really grasp the simple nature of the problem that’s facing the IT director today.

In a modern Windows network there are four different data platforms that need to be looked after: the file system, Active Directory, the SQL RDBMS and Exchange Server. Each of these needs its own backup, disaster-recovery and archiving procedures. That tots up to four times three – no fewer than 12 quite distinct data-management problems to be pondered, solutions found, systems tested and staff trained for. Removing just one platform would knock that number down from 12 to nine in one fell swoop.

Now we know that Microsoft is considering putting Active Directory onto the small-footprint ‘MSDE’ SQL Server engine for the R2 release of Windows Server 2003 due late next year. So that’s potentially one platform down, although it isn’t a terribly important one due to the fully distributed nature of Active Directory – you’re pretty unlikely to lose all the Active Directory boxes in your infrastructure in one go. Putting Exchange Server onto SQL Server would have taken out a second, reducing the problem space to just two platforms – the file system and SQL store.

Now this seems to me to be a big issue, and something worth fighting about. The alternative is to drop all data onto something like a NetApp Filer, which is a native store for all these data types, and then let it take care of everything in one place. This is certainly a solution that finds a lot of traction in the larger enterprises, but the danger for Microsoft is clear – it runs the serious risk of having no part to play in the storage marketplace of the future. This means it won’t be able to lead in that space, and that would throw up significant questions about its ability to persuade customers to move to WinFS Server edition in the future.

I expressed all of this to the various Microsoft people in Copenhagen, which brought on long faces and furrowed brows. Serious discussions are being had at the moment in Redmond. It might be that they decide to release E12 as an R2 release of Exchange Server 2003, which is the approach I’d favour, with a follow-up of the full SQL Server version coming later. Whatever happens, Microsoft has to demonstrate that Exchange Server and SQL Server do have a future together, and it has to do this right now. In addition, the company absolutely has to get its storage story right, or else it can kiss goodbye to getting things such as WinFS into the big server storage marketplace of the future. The deadline is the promised March opening of the E12 kimono.

Putting on the Frighteners

One of the oldest applications for the 32-bit Windows platform, Diskeeper from Executive Software, has now reached its ninth incarnation. Diskeeper is probably the oldest disk defragmentation program available for Windows NT – indeed, if my memory serves me correctly, it came from the VMS world, which shares a common father with NT in David Cutler. A cut-down version of Diskeeper is the free disk defragmenter that’s come in the shrinkwrap of Windows almost since the beginning. I understand it doesn’t ship in the German version of Windows, because of some issue with the founders of Executive Software holding religious views that aren’t acceptable in Germany – or some such tale. Nevertheless, disk defragmenting is one of those tasks that we do only occasionally and feel much better for afterwards – a bit like tidying out the cupboard under the stairs. We’re not really sure it’s made any difference, but we feel better for it, so it’s a good thing.

Almost ten years ago, I did some initial benchmarking of drive access times before and after defragmenting a hard disk, and while the results showed an improvement, it really wasn’t a great deal to get excited about – only under certain circumstances and certain types of load could you see a worthwhile difference. I was interested to see how things have changed after ten years of performance improvements, both in the hard disk drives themselves and in the motherboard/processor combinations, not to mention the vastly increased memory that’s available today.

This time round, the task was made a lot easier by the availability of disk-cloning tools, which could take an accurate and repeatable mirror image of the disk contents, thus allowing me to go back and forth between the original fragmented and defragmented versions. So how well does the new version of Diskeeper work?

Well, it has a new user interface that’s fairly pretty, and it seems to do just the same sort of defragmentation as before. If there have been any changes or optimisations, then they’re very hard to spot. That’s on the plus side. On the minus side, I take strong exception to some of the wording used inside the product’s interface. For example, there’s now a new tab marked Reliability. I ran the Analysis task on a hard disk on my test machine, and it said ‘Reliability analysis results for volume J: Warning! The computer’s reliability is degraded on volume J:’.

Whoooah, that’s the sort of thing I don’t like to read about my computers, because reliability is a non-negotiable matter for me. The dialog goes on to say: ‘The reliability level is set at “Warning” level for the following reasons: 1. The volume is heavily fragmented (15% fragmentation). Recommendations for volume J: 1. Defragment volume J now using Diskeeper’. It then continues: ‘Fragmentation often occurs with important files that are used frequently by Windows. When the fragmentation of these files gets beyond a certain level, Windows begins to have trouble doing its normal, everyday work. Crashes and hangs can occur, leaving you open to loss of data and productivity.’

This sort of loose talk makes me seethe. There never have been, and never will be, any reliability issues caused by fragmentation of data across a hard disk. Defragmenting a disk will neither improve its reliability nor improve the chances of a successful backup and restore. This is simply scaremongering of the worst sort. If Executive Software is now reduced to frightening users into using its product under the threat of impaired system reliability, then one can only conclude that the company ran out of real ideas for this product years ago and is just turning the handle to crank out new versions of products that do essentially the same job they’ve always done. This makes me a little sad, as the original Diskeeper of some ten years ago was rather a good product.

I tried doing some Xcopy benchmarking from one hard disk to another, both before and after a full defragmentation. Obviously, I made sure that these were different disks on different controllers. I couldn’t see any meaningful Xcopy time difference between the original and defragmented states, using a fast modern computer, decent disks and a reasonable amount of RAM. Indeed, given how clever NTFS has always been in its disk handling – doing stuff like scatter/gather, elevator seeking and so forth – I’m not really surprised that fragmentation is effectively invisible these days.
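
For the record, the methodology was nothing cleverer than timing bulk copies, and it’s easy to repeat. My own runs simply used Xcopy from a command prompt, but if you’d rather script it, a small harness along these lines does the same job – the paths here are placeholders, and this is just a sketch of the idea rather than the exact test I ran.

using System;
using System.Diagnostics;
using System.IO;

// Rough equivalent of the Xcopy test: time a recursive copy from the test
// disk to a folder on a different disk/controller, once against the
// fragmented image and once against the defragmented clone.
class CopyTimer
{
    static void CopyTree(string source, string target)
    {
        Directory.CreateDirectory(target);
        foreach (string file in Directory.GetFiles(source))
            File.Copy(file, Path.Combine(target, Path.GetFileName(file)), true);
        foreach (string dir in Directory.GetDirectories(source))
            CopyTree(dir, Path.Combine(target, Path.GetFileName(dir)));
    }

    static void Main(string[] args)
    {
        // Placeholder paths: the source sits on the disk under test,
        // the target on a different disk and controller.
        string source = args.Length > 0 ? args[0] : @"D:\TestData";
        string target = args.Length > 1 ? args[1] : @"E:\CopyTarget";

        Stopwatch timer = Stopwatch.StartNew();
        CopyTree(source, target);
        timer.Stop();
        Console.WriteLine("Copied {0} to {1} in {2:F1}s",
            source, target, timer.Elapsed.TotalSeconds);
    }
}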

Should you consider Diskeeper 9? Well, it certainly won’t make anything worse, and you might feel more comfortable having defragmented your hard disk. So why don’t you download the free trial version and see what you think? Do you see a meaningful improvement in speed? And do you think that abusing terms such as ‘reliability’ is reasonable? Emails to the usual address, please. For myself, I think I’ll pass.

The New Honeybyte

Long-term readers will recognise that the ‘Honeybyte’ is a unit of complete over-the-topness used to characterise the computing power of my desktop machines. Until now it’s only been applied to Windows-based machines, but the lure of the new Apple 30in display proved just too much for me. The fabulous new Apple Store in London is positively evil in the levels of temptation it exposes you to, and so I ended up buying the 30in display together with a twin 2.5GHz G5 machine armed with 4GB of RAM. It will do for the day-to-day work – Virtual PC 7 runs very well on it, and I have Terminal Services clients too, so access to my core Windows network is no hardship. And the AMD64-based workstation is just to my left, ready for instant access at a moment’s notice. After all, I have to get my Half-Life 2 fix!

While I’m on the subject, I’ll just quickly say that Half-Life 2 has blown me sideways with its graphics quality. It represents a significant leap forward in this area. The game is horribly addictive, and when played on an Athlon 64 FX-based computer with a leading-edge graphics card, it takes you to a new level of immersive gaming.
