Here’s something you already knew: technology gets cheaper. I scarcely need explain the reasons why: new technology is difficult to manufacture, using fledgling and imperfect fabrication techniques; the development costs need to be recouped; demand is relatively low with a nascent market populated by early adopters, so unit prices must be pitched high.


Time passes. New and efficient ways to manufacture the particular item are developed. The public at large cottons on to the usefulness of the widget in question; more are sold so the price per unit can be lower. This generates more sales and the prices go lower still. The new technology becomes a commodity and disappears into the product landscape (or product landfill).

In the 18th century, marine chronometers were the zenith of engineering technology. An embryonic global market economy needed a reliable way to get goods across oceans: for that you needed to know your longitude on Earth, and for that you needed an accurate clock. The demand was so great that in 1714 the British government instituted the Longitude Prize of £20,000 for the person who could design and build a ship's chronometer that kept good enough time for reliable navigation.

It was a huge problem, but whole economies and the success of nations depended on it. Brilliant men devoted themselves to it. And when the Longitude Prize was finally won by John Harrison, the resulting ship's chronometers – accurate to around two minutes over a passage of a few weeks – were vastly expensive.

I walked into Croydon IKEA last month and, just inside the entrance, thousands of quartz alarm clocks – accurate to about a minute per year – had been tipped into a large basket around 5ft by 3ft and maybe a foot deep. Buying one would have set me back 90 pence: a person on the national minimum wage would need to work for 11 minutes to acquire one.
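The 11-minute figure is a quick back-of-envelope sum; sketched out, assuming the 2004 UK adult minimum wage of £4.85 an hour (the wage figure is my assumption, not stated in the column), it works out like this:

```python
# Minutes of minimum-wage work needed to buy a 90p alarm clock.
# The £4.85/hour rate is an assumed figure (2004 UK adult minimum wage).
clock_price = 0.90   # pounds
hourly_wage = 4.85   # pounds per hour (assumption)

minutes_to_earn = clock_price / hourly_wage * 60
print(f"{minutes_to_earn:.0f} minutes")  # roughly 11 minutes
```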

That’s the sort of thing that makes my head spin, and I can never work out if I’m excited by the plunging costs of technology or slightly insulted by the cursory way such exquisite inventions are treated.

But the trend continues and the pace is accelerating. A look at issue one of PC Pro shows that in 1994 a 14.4K modem cost close to £1,500. These days 14.4K modems are obsolete and you cannot even buy one, but a quick online search shows that a 56K model now costs a fiver, and I would be willing to bet that almost all of that price is shipping and profit margin, not the hardware itself.

Accurate clocks are commodity items; modems are commodity items. PCs have been around for 20-odd years, so why aren’t they commodity items too?

When Bill Gates launched Windows XP Media Center Edition 2005 in San Francisco a few months back, he acknowledged that traditionally music has been handled by dedicated appliances, but assured the audience that it could be better handled in the home by a general-purpose PC. I agree, and have talked before about the fact that all my music is now handled by a dedicated notebook. But of course a dedicated notebook running Windows Media Player 10 and nothing else is not a general-purpose PC: the computer has become an appliance.

An appliance, yes. But, unlike quartz alarm clocks, not a commodity. So what is keeping the computer from commodity status? Software. If the markets were set up for it, you could now produce a computer for less than £50 that ran Windows 3.1 just as well as a £1,500 486DX2-66 did circa 1995 – and it would be the size of a matchbox. But who wants Windows 3.1? And does Microsoft Word work better than it did in 1994? More importantly, is it worth the extra £1,450?
