Choosing workstations

Let’s turn our attention to the fate of the humble workstation. The life of a modern PC on a network is a sharply bipolar one, with, it seems, little or no middle ground. At one end of the spectrum, you have the totally locked-down, Stalinised clone PC with no individuality, its users barely able to distinguish one PC from another, the sole reflex twitch of freedom being a sad little Vicky Pollard effigy stuck to the monitor with Blu-Tack. At the other end, you have the fearless road warrior’s laptop, secured only by his fingerprint, which is free to roam courtesy of at least four distinct connection methods – wired and wireless Ethernet, Bluetooth and infrared – and which is backed up once a year, if that often, using a chipped coffee mug full of USB flash drives…

It’s the disasters that can befall the latter category that feed the paranoia which creates the former. Most of the nasty things that imperil laptops or “self-maintained” PCs, such as being infected, stolen or dropped down the stairs, never trouble those corporate lockdown machines. And most of the actions that are prohibited on lockdown systems merely annoy or hinder legitimate users rather than confer any defence against the problems they actually tend to suffer from.

My own approach to the run-of-the-mill company network PC is rather schizophrenic. On the one hand, I utterly despise the idea, often found in corporate purchasing departments, that PCs are commodities as interchangeable as office chairs or reels of Sellotape. Every large corporate network I see contains machines bought on this basis, and I’d guess that the false assumptions that underpin this notion probably shorten their reinvestment cycle by at least three years, in some cases six. Yet the guy in charge of purchasing these systems will often be intensely proud of his BMW parked outside. Whenever I get the chance, I needle these guys into defending their rationale, and it’s astonishing how emotional they get when asked to explain why they equip their users with the computing equivalent of Trabants.

On the other hand, though, I do believe that interchangeability is a good idea. If your PC dies and you can just stroll across and log in again at a different desk, the loss of business revenue is much less than if you need a rebuild that can take two or three full working days, making it well worth the investment to set up hot-desking in the first place. This should also be a strong motivation for attempting to herd together all your laptop users and getting them to provide an occasional image backup of their whole hard disk. This is almost always a tough job no matter how big the company, because the maverick laptop users are almost always senior and therefore highly egotistical.

So when I’m building or specifying a set of workstations, what do I look for? These days, there are only a few things that matter when choosing a general-purpose “office automaton” computer. Whether you’re buying for XP or considering the arrival of Vista, it’s easy to hit the right spec (so long as you realise that it’s equally easy to end up with the evil twin if you listen to too many adverts).

• Sufficient memory It’s so cheap these days that if you find a vendor that’s deliberately included exotic RAM in the spec, mark it down in your list. Dell, Compaq and IBM went there a few years back over RDRAM and the result is quite clear – unless you intend to buy all the RAM you’ll ever want on the first day, proprietary RAM is a nightmare. This is where your money should go before anything else is even thought about, and 1GB is a good working minimum for a new PC. With second-hand or PCs recycled inside your business, aim for 1GB and settle for 512MB.
• A small fast disk 40GB is enough for normal business use and a good 7,200rpm IDE drive of that capacity now costs £30 or less, and will produce an immense speed hike for any older PC that’s still in business.

• Discrete, branded onboard peripherals A terse expression that covers a quite tricky technical point. Every so often, system designers will come up with a wacky way to reduce materials cost by aggregating the jobs that each component does, or else they discover some smart-ass hardware company that has a bright idea for emulating some component on the cheap. It was such approaches that brought us classic examples like those bottom-end Compaq servers that combined Ethernet, display and SCSI controllers onto a single PCI card (which meant all three ran at speeds more suited to a mobile phone than a file server). There are examples in the workstation market too – Dell ultra-compact desktops that contain laptop disk drives spring to mind, or some middle-aged Dell PCs that tell you they have a 3Com Ethernet card, but it isn’t supported by any drivers on the 3Com site (try finding a 3C920!). The right kind of desktop PC is small, because you won’t be adding cards to it, contains a quality chipset and BIOS (and the two are distinct) and lets you add up to 1GB of RAM plus the disk of your choice. HP manages to do far better with the same parts, speed-for-speed, than almost any other vendor I’ve tested, which is why my basement at home sports half-a-dozen old HP e-vectras.

• Sensible heat management It’s astonishing that this should still be an issue, but it’s worth bringing up. Very small cases that have howling fans aren’t popular, but neither are big cases with silent cooling that fill up with fluff or cease to operate without triggering any alarm.

• Bundled software It’s astonishing how many corporates have been buying workstations with 128MB of RAM and the full Symantec/Norton security suite, which barely squeezes into 384MB, let alone leaves you any room to work. Yet vendors are happy to recommend such a spec, which leaves a Pentium D gasping and panting along no faster than the sat-nav in your car. Could it be that they have an interest in frustrating end users with these underpowered devices? Why would that be?

Refurb is your friend

Some of you may be surprised to read the bits that concern refurbishing existing machines as opposed to buying new. I know that many purchasing managers, and indeed vendors, hiss sharply through their teeth when this topic comes up, because the very idea is a threat to both sides of their particular relationship. Budgets must be kept at the same level year-on-year and commissions ditto, so neither party is going to benefit from a bulk buy of 40 hard disks, 20GB of RAM in 512MB sticks and a couple of battery-powered screwdrivers. Yet the simple fact remains that most businesses are now throwing out 1-1.5GHz workstations, which commonly have a 20GB drive, 128-256MB of RAM and run Windows 2000.

If you’re in this situation and about to go shopping, first do some maths: £25 for a modern drive and another £30 for some memory (in bulk) to take it comfortably over our 512MB baseline, plus an XP Professional licence, which you may already have. Now take that machine and build it completely cleanly, then go to the vendor site and collect the most up-to-date releases of BIOS and drivers (not forgetting the Intel Chipset updates and the Application Accelerator from www.intel.com if it’s a real Intel product). Now sit down and try it out. Is it unacceptably slow for regular office work? Wouldn’t you rather give 70% of your workforce this much computing power and save the budget for some really fast boxes for those people who actually need them?
Or to look at it another way, with all the money this will save couldn’t you buy a massive network upgrade to a core Gigabit backbone and some faster servers? Wouldn’t that be money better spent than buying dual-core Pentiums for secretaries who’ll principally use them to watch the animated adverts on Hotmail wriggling 10% faster? Markets and purchasing rationales change by spurts and bursts, and this refurb trick can’t be continued indefinitely. If, as I’ve mentioned above, your current stock is mostly RDRAM-equipped machines, you have a bit of a problem. I tend to throw out the scruffiest 50% and stuff their RAM into the remaining half in such a situation.
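The back-of-the-envelope maths above can be sketched as a quick script. The £25 drive and £30 memory figures come from the column; the new-PC price and the zero licence cost are placeholder assumptions, so adjust them to your own quotes:

```python
# Rough refurb-vs-replace sums for a fleet of ageing 1-1.5GHz PCs.
# Drive and RAM prices are from the text; the new-PC price and the
# assumption that the XP licence is already owned are hypothetical.
DRIVE = 25          # modern 7,200rpm drive, bought in bulk
RAM = 30            # enough memory to clear the 512MB baseline
XP_LICENCE = 0      # assume the licence is already owned
NEW_PC = 400        # hypothetical price of a new business desktop

def refurb_cost(machines: int) -> int:
    """Total cost of refurbishing `machines` existing PCs."""
    return machines * (DRIVE + RAM + XP_LICENCE)

def replace_cost(machines: int) -> int:
    """Total cost of buying new PCs instead."""
    return machines * NEW_PC

fleet = 40
print(f"Refurbishing {fleet} PCs: £{refurb_cost(fleet)}")
print(f"Replacing them:          £{replace_cost(fleet)}")
print(f"Freed for the backbone:  £{replace_cost(fleet) - refurb_cost(fleet)}")
```

Even with generous allowances for labour, the difference is what pays for the Gigabit backbone and the faster servers.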

It is, nevertheless, a great way to save money that otherwise would be spent on machines that may well – due to faddish tricks and scrimping that’s beyond your control – run slower than a properly configured example of their predecessor. I can hear the rumbling from computer-room basements all across the land: surely Cassidy’s schizophrenia has run out of control? Isn’t it better to make sure all your kit is identical for all the users? How can he preach a doctrine of hot-desking and roaming, but then advise people to segregate their kit into different performance levels?

This sounds like a well-constructed objection, but it’s rubbish. Computer facilities people suffer just as badly from this schizophrenia as I do, but the difference is that they don’t know it yet. A large percentage of organisations run two or more domains – sometimes many more – yet when you quiz them as to why, they can’t rightly explain. These same organisations may rigidly enforce just two hardware configurations – desktops and laptops – and because the laptops are used by short-tempered mavericks the split is almost always 99:1. I suppose it’s justifiable to maintain a façade of reluctance if the deviation from The One True Way is perpetrated by people senior enough to demand special attention, but I stick to my assertion that some mild degree of hot-desking within user groups is a worthwhile target to keep in mind. And Microsoft agrees with me, which is why you can have Group Policies that pertain to Groups (could that be where the name came from?).

This is a perfectly workable scheme, once you start to think of groups as being people who share a hardware configuration rather than being people who work in the same operating division, and so it’s far more constructive to have a single group called Secretaries than it is to have two called Marketing Secretaries and Production Secretaries. But it’s a shift in the way the network is matched to the business that many administrators will find rather uncomfortable.
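One way to picture the shift is to key your groups on hardware configuration rather than the org chart. Everything in this sketch – the group names, the quota figures, the policy fields – is a made-up illustration of the idea, not anything Microsoft prescribes:

```python
# Departmental grouping: two groups that differ only by org chart,
# so any policy change has to be made twice.
departmental = {
    "Marketing Secretaries": ["ann", "bev"],
    "Production Secretaries": ["cal", "dee"],
}

# Hardware-configuration grouping: one group per machine class,
# so a single policy (profile quota, redirection) covers the lot.
by_hardware = {
    "Secretaries": ["ann", "bev", "cal", "dee"],  # refurbed desktops
    "Engineering": ["eli"],                       # the genuinely fast boxes
}

# Hypothetical per-group policy settings, keyed by hardware class.
policy = {
    "Secretaries": {"profile_quota_mb": 30, "redirect_my_docs": True},
    "Engineering": {"profile_quota_mb": 200, "redirect_my_docs": False},
}

def settings_for(user: str) -> dict:
    """Resolve the policy a user inherits from their hardware group."""
    for group, members in by_hardware.items():
        if user in members:
            return policy[group]
    raise KeyError(user)
```

The point of the toy model is the shape of the keys: one “Secretaries” group means one place to set the quota, where the departmental scheme would need it set twice and kept in sync by hand.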

We’ll go a-roamin’

Roaming Profiles that move within a group of users are a lot easier to maintain and account for than a giant bucket of untuned, uncontrolled Roaming Profiles maintained under the banner of Soviet equality. This has become especially true lately, since several popular applications have taken to using formerly sleepy regions of the “Documents and Settings” tree to store quite large chunks of irrelevant data.

I tripped over my own doctrine recently, having set up a widespread group of roaming users on a lot of refurbished machines with smallish drives split into two partitions. Having delivered a stern lecture on the vulnerability of Internet Explorer to various infectious websites, I followed that up with a deployment of Firefox and the latest Sun Java. However, once the users indulged in a bout of reading PDF files during a visit from a wiring contractor, the combined effect of the Sun and Firefox caches and Adobe’s new-found habit of downloading updates on a user-by-user basis added some 60MB to each user’s profile directory. It took only a few weeks of these updates from Adobe, plus the guys’ habit of logging into their own account whenever they take a phone call at some colleague’s desk – courtesy of XP user switching – to substantially consume the smallish partitions we’d set up several years ago as a performance booster. Pretty soon, files were failing to unzip due to a lack of temporary directory free space, and page files were peaking sharply and stopping machines dead in their tracks.
Folder redirection by way of some careful tweaking of the group policy settings was the right fix for this sneaky little space hog. Strange that so many diverse and recently updated system components should all want to take up space in the same part of the directory tree. Could it be that hot-desking isn’t actually anything like as widespread as we’re led to believe?
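A minimal sketch of how you might spot this kind of profile bloat before it eats a small partition. The Windows path and the 50MB threshold are assumptions for illustration, not anything from the incident above:

```python
# Walk each profile under a Documents and Settings-style root and
# report the biggest space hogs. Paths and threshold are illustrative.
import os

def dir_size(path: str) -> int:
    """Total size in bytes of all files beneath `path`."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file vanished or unreadable; skip it
    return total

def profile_hogs(profiles_root: str, threshold_mb: int = 50):
    """Return (profile, size_mb) pairs at or above the threshold, largest first."""
    hogs = []
    for profile in os.listdir(profiles_root):
        full = os.path.join(profiles_root, profile)
        if os.path.isdir(full):
            mb = dir_size(full) / (1024 * 1024)
            if mb >= threshold_mb:
                hogs.append((profile, round(mb, 1)))
    return sorted(hogs, key=lambda pair: pair[1], reverse=True)

# Usage on an XP box might look like:
#   profile_hogs(r"C:\Documents and Settings")
```

Run something like this from a logon script or a weekly scheduled task and the 60MB-per-user surprise shows up as a trend line instead of a dead machine.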