Flicking through the reviews and the adverts in this very magazine, you cannot fail to notice how the computing industry has become obsessed with speed: this machine is 5 per cent faster than that machine; this hard disk has a bigger buffer than that one; this printer can chuck out an A4 photo in 20 seconds. Granted, there are a few areas of computing where such raw speed is important: maybe you are into digital photography, in which case the speed of your printer does matter; or perhaps you have a fascination for complex mathematical simulations, in which case a 5 per cent increase in processor speed might equate to getting your results an hour earlier. Or, more likely, you are a fan of the one activity where raw machine speed is still incredibly important: playing games. Hardware manufacturers must love games developers, because they continue to drive much of the demand for faster PCs and components.
But is this need for speed true of the corporate world too? What about the stuff that goes on inside the firewall? Is speed so important there? Actually, desktop PCs bought two years ago are typically plenty fast enough for the usual office tasks, and as a bonus they are usually quieter and less power-hungry than the machines on sale today. Much the same is true for servers. However, if you check the reviews and ads in the Enterprise section of PC Pro, you will see that speed is still seen as a major concern for corporate purchasers: this new tape drive is 30 per cent faster than the last model; this new network switch has a backplane that handles even more gigabits per second.
Well, please bear with me for a few paragraphs, because I’m going to be slightly controversial and suggest that this lust for speed is not just unnecessary; in some cases, it is actually a retrograde step. I can best illustrate this with a few examples, the first of which is our old friend, the backup. Once upon a time, when you backed up a server to a local tape drive or to a network storage device, the backup would quietly chug away for an hour or two. Not any more: with our high-speed tape drives and our gigabit network switches, the backup device now sucks data off the remote clients at a tremendous rate, and in the process puts a hefty load on the CPU and disks of the hardware being backed up. I have several clients who have scratched their heads for ages trying to work out why their website starts giving time-out errors at about 1am every night, and you should see their faces drop when I ask them what time the server backup is scheduled to run.
It is not just backup, either. Three or four years ago, a company’s local area network would typically have had groups of local machines connected by 10Mbit Ethernet to a hub, which in turn might have been connected through a cascade of switches and then, finally, via a 100Mbit link to the server. This arrangement gave every user a fair share of the server, and no single user could monopolise the network. Those days are gone, though, because nowadays everyone onsite has a Gigabit Ethernet connection from their desktop PC straight into the backbone switch, and so can access the server at its full wire speed. The consequence? One user running a long file search or a large copy can cripple the server for everyone else onsite. We are now reaching the silly stage of putting handfuls of network cards into each server, not for reasons of redundancy and availability, but simply so that the server cannot be swamped by a single user.
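Before resorting to handfuls of network cards, it is worth remembering that the old hub-and-cascade fairness can be reimposed in software. As a sketch of one approach (Linux traffic shaping with a token-bucket filter; it needs root, and the interface name and the 100Mbit figure are my illustrative assumptions, not a recommendation from the column):

```shell
# Hypothetical sketch (requires root): cap this desktop's transmit rate
# at 100Mbit/s with a token-bucket filter, so one bulk copy or file
# search cannot saturate the gigabit path to the server.
# "eth0" is an assumed interface name.
tc qdisc add dev eth0 root tbf rate 100mbit burst 32kbit latency 400ms

# Remove the cap again when it is no longer wanted:
tc qdisc del dev eth0 root
```

Many managed switches offer the same thing as a per-port rate limit, which achieves in one place what these per-machine commands do host by host.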