According to Dell, 90% of company data is written once and never read again.

This arresting claim cropped up in the middle of a presentation from Dell’s Enterprise division, recently given to Jon Honeyball and me. Given our usual style of dealing with such events, the poor devils didn’t stand a chance of actually working through their prepared order of slides, and I’d have to confess that we didn’t even try to stick to the script we’d discussed in the run-up to the meeting.
But even allowing for our natural tendency towards anarchy, this statement stood right out from the other stuff in the presentation.
It’s an odd statistic. How is that data measured? 90% of all documents? 90% of stored bytes? When they said “never read again”, did they mean explicitly retrieved by name, or should free-text searches count too? And how long must a piece of data lie untouched before it’s clearly identified as belonging to the 90%, so that steps can be taken to reflect its reduced importance?
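If you wanted to put even a rough number on it for your own network, one crude approach is to walk a file share and bucket files by their last-access timestamps. The sketch below does just that; the share path and the 90-day threshold are assumptions for illustration, and it only works at all if last-access times are actually maintained on the volume, which many systems relax or switch off.

```python
import os
import time

SHARE_ROOT = r"\\server\companydata"   # hypothetical share path
THRESHOLD_DAYS = 90                     # assumed cut-off for "cold" data

def cold_data_fraction(root, threshold_days):
    """Estimate what fraction of stored bytes haven't been read recently.

    Relies on last-access times (st_atime), which many volumes update
    lazily or not at all, so treat the result as a rough indication only.
    """
    cutoff = time.time() - threshold_days * 86400
    total_bytes = cold_bytes = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files we can't stat (locked, no permission)
            total_bytes += st.st_size
            if st.st_atime < cutoff:
                cold_bytes += st.st_size
    return cold_bytes / total_bytes if total_bytes else 0.0

if __name__ == "__main__":
    frac = cold_data_fraction(SHARE_ROOT, THRESHOLD_DAYS)
    print(f"{frac:.0%} of stored bytes untouched in {THRESHOLD_DAYS} days")
```

Even a quick scan like this makes the vendor’s question concrete: you have to pick a measure (bytes, not documents) and an interval before the 90% figure means anything at all.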
These questions are just the starting point for an issue that demands quite a lot of thinking. It’s a fascinating finding to be offered to you by a vendor of servers, given that so few of the devices they try to sell to smaller organisations actually reflect this “fact” in their hardware and software specification.
What’s more, when larger companies try to make use of the sort of gadgetry that is available to take account of this fact, all too often the tricks involved end up being little more than a nuisance and a source of delight to no-one at all (apart, perhaps, from those who equate control with success).
I expect that if Jon and I hadn’t derailed them so enthusiastically, Dell’s sales guys would have proceeded by trying to talk up the toolkit provided to combat the effects of this “natural law of data”: namely, the provision of hierarchical storage measures based on a larger-scale, corporate-grade iSCSI SAN, plus supplementary devices such as deduplicators, robot-controlled tape silos and archiving utilities.
Even though there’s been some degree of trickle-down of these categories of product into our humbler networks over the past half-decade, the impact of that 90% dead-weight rule still isn’t something that crops up when we’re considering software purchases or hardware choices. We’re offered Windows Server “Enterprise” Edition, not Windows Server “90% Unused” Edition.
Where this 90% dead-weight rule does get taken into account is perhaps in our other choices: it matters most when you’re figuring out which shared drive letters to present on your network, how you’re going to divide up the company’s business among the various folders and drives, what the security groups are going to be – and, above all, when you’re estimating a reasonably representative daily pattern of work loading.
Buying decisions
Let’s look at just one example of such a decision to highlight the way the 90% dead-weight rule can affect your thinking and purchasing.
One of my clients found itself bumping up against the storage limits of its single-box servers. It’s astonishing how often you find that servers run out of puff at around the 2 to 4TB mark when it comes to presenting shares to the LAN.
This particular client is a big fan of HP’s ProLiant DL580 G5 series servers (de facto standards don’t come much more widespread than this one), and the G5 employs HP 2.5in SAS drive units as the standard bricks for building up logical drives.
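To see why the ceiling tends to land where it does, a little back-of-envelope arithmetic helps. The figures below are assumptions for illustration (an eight-bay 2.5in SAS cage filled with 300GB drives, one hot spare, RAID 5), not this client’s actual configuration.

```python
# Rough usable-capacity estimate for a small-form-factor SAS array.
# All figures are illustrative assumptions, not a specific HP configuration.
DRIVE_BAYS = 8            # assumed number of 2.5in SAS bays in the server
DRIVE_SIZE_GB = 300       # assumed per-drive capacity
RAID5_PARITY_DRIVES = 1   # RAID 5 gives up one drive's worth to parity
HOT_SPARES = 1            # keep one drive aside as a hot spare

usable_gb = (DRIVE_BAYS - HOT_SPARES - RAID5_PARITY_DRIVES) * DRIVE_SIZE_GB
print(f"Usable capacity: roughly {usable_gb / 1000:.1f} TB")
# With these numbers: (8 - 1 - 1) * 300GB = 1800GB, i.e. about 1.8TB,
# which is why single-box shares so often top out in the 2 to 4TB region.
```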