If, like me, you see no shame in your technical expertise, the style and pitch of some of the cloud sales initiatives presented to small businesses can grate rather painfully on your self-worth.

This has been almost unavoidable while the hardware industry has been churning out servers best suited to being walled up in an air-conditioned room. You certainly don’t want the grief of maintaining several kilowatts of air conditioning and a stack of batteries just because your servers are chunky enough to crush unwary body parts the moment you touch them.
But even five years ago, when cloud started to look like a credible delivery platform, that picture wasn’t wholly accurate. Today, it’s even less so: the smallest servers can deliver sensible resources for small businesses for less than £200, and such devices take up less shelf space and electricity than the sachet-gobbling coffee machines found in MDs’ meeting rooms.
So, how should your small business decide whether to invest in its own server or go down the hosted route? What advantages does each offer, and will you pay over the long term for cloud services as your headcount and data storage demands rise? While every business is different, I hope to answer these questions in this feature.
The case for servers
Let’s think back for a second and consider the performance you could buy at the start of the cloud era. The HP ProLiant DL380 G5 was pretty close to the de facto standard, and came in with a dual-core Intel Xeon at 2.33GHz and a fair-to-reasonable disk subsystem for around £6,000. It was a rack-mount, and it didn’t like being exposed to the heat of an English summer day: its fans would kick up a racket, and overall it would draw 600-800W of power.
Nonetheless, the DL380 was a sales winner, although it was rarely filled to its theoretical maximum of RAM, disk and CPU. Unusually for HP’s server range at that time, there was no direct crossover to a deskside unit; other, less popular parts of the catalogue could be converted from rack to deskside by swapping a few panels and clipping on wheels.
However, the monster thus created would quickly drive the unfortunates expected to sit in the same room completely bananas. Phone calls would be strained, and concentration would suffer. (I know, since I’d often set up a new DL380 by first running it out on a desk in an IT team room; this would be grounds for strong words if it went on for more than a few hours.)
Contrast that with the recent presentation of the Dell PowerEdge VRTX (pronounced “vertex”) at Tech Camp in Paris, in a gold and beige rococo salon, with birds tweeting in the trees outside the open window. Only at the end did the presenter, who had been speaking unamplified, mention that the VRTX (fully populated with four blades, each offering five times the capacity of a DL380) had been left powered up throughout his presentation.
Software changes
Then there’s the software. Back in 2005, we were bashing away at Windows Server 2003, and only just beginning to understand the 64-bit version. Updates could take you out of action, infections were far from unknown, and many practices and superstitions left over from Windows NT 4 days still dominated management thinking. Complete machine restores from tape in a single-server network were, understandably, the stuff of nightmares.
Today, the proposition is very different. The recession gave software developers an enormous break from the pressures of quarterly financial results, since no amount of sexy new features would tempt buyers out of their torpor.
The results show – whether you consider the breakthroughs in Windows Server 2012 or the leaps forward in usability made by the OpenBSD and Ubuntu distributions – that giving software people a chance to sit and think lets them sort out user problems that previously seemed insuperable fixtures of daily life.