This month, I’ve been very cutting-edge on the one hand, yet very retro on the other. On the cutting-edge front, I had the pleasure of looking at Armari’s Gravistar SR workstation. It was impressive watching it obliterate my carefully crafted 3ds Max benchmark courtesy of the ATi FireGL 7350 graphics card, but also a bit depressing. I’ll now have to make a new one, because the Gravistar tears through it at 33fps. For an animation destined to run at a standard 25fps, this is faster than real-time. We could, of course, keep using the same benchmark if we wanted to, extending the number of frames in the animation sequence so that it runs long enough to stay within acceptable timing tolerances. That would be perfectly valid as a way of assessing performance, but frankly it isn’t quite so much fun watching a system run a benchmark with one hand tied behind its back.
The reason the Armari machine really captured my imagination was also to do with the fact that I’ve been killing time on the commute to and from the office by reading Mike Hally’s Electronic Brains: Stories from the Dawn of the Computer Age.
Here in the early 21st century, all staff need computers. In the wartime world of the 1940s and the decade or two that followed, it was the other way around. The pages of Hally’s book are filled with visions of machines that needed not just staff to nurse them, but the walls of buildings knocked down to physically fit them in. They had power requirements in the hundreds of kilowatts, but memory capacities in the tens of bits. Hally’s thorough research and interviews with primary sources buck the trend of mythical half-truths surrounding machines such as the American ENIAC (Electronic Numerical Integrator and Computer) and produce some great nuggets, allowing true comparisons with modern computers.
The ENIAC hailed from the wartime technology arms race, which sparked the era of constantly accelerating technological change we live in today. No-one at the time considered computers very useful and, in fact, ENIAC wasn’t a stored-program Von Neumann computer. Nonetheless, it was programmable via a system of plug boards, had its own memory and processor and, in that sense, was a real computer. It came into service near the end of WWII for calculating firing tables for the US military: essential for accurate targeting of new artillery designs and one of the few practical uses anyone could think of for this new breed of electronic machines.
Most people who know anything about computing history will already know this: it’s part of the basic fabric of computing folklore. But Hally’s research uncovered some concrete performance figures I hadn’t seen before; namely, that ENIAC was capable of computing about 5,000 operations per second and, when it was finally taken out of service and switched off on 2 October 1955, it had run for exactly 80,223 hours. Let’s take those figures and make the assumption that the calculations in question were floating-point operations. In other words, they involved high-precision fractional numbers: given the degree of accuracy needed for calculating military firing tables, this is likely. In today’s parlance, ENIAC was thus capable of a performance of 5kFLOPS (5,000 floating-point operations per second).
So here’s the head-bending part. Imagine those nine-and-an-eighth continuous computing years that the ENIAC put in over the period of a decade, attended by a staff of dozens to program it and keep it going through near-daily breakdowns from failures among its 18,000 thermionic valves. Imagine all the time, effort and love those people put in. This was in the days before printed circuit boards too, so components tended to be wired directly together to form a spaghetti-based maintenance nightmare. These people’s lives were completely bound up in keeping this one monolithic machine churning through its calculations.
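For the numerically curious, the two figures Hally unearthed are all you need to cross-check that “nine-and-an-eighth years” claim and to total up ENIAC’s lifetime output. A quick back-of-envelope sketch (only the 5,000 ops/sec and 80,223-hour figures come from the article; the rest is plain arithmetic):

```python
# Back-of-envelope check of ENIAC's lifetime output, using the two
# figures quoted from Hally's book. Everything else is arithmetic.

OPS_PER_SECOND = 5_000      # ~5kFLOPS, per Hally's figures
HOURS_IN_SERVICE = 80_223   # total running time to 2 October 1955

seconds = HOURS_IN_SERVICE * 3_600
years = seconds / (365.25 * 24 * 3_600)  # the "nine-and-an-eighth" years
total_ops = seconds * OPS_PER_SECOND     # lifetime operation count

print(f"{years:.2f} years of continuous running")  # ≈ 9.15 years
print(f"{total_ops:,} operations in total")        # 1,444,014,000,000
```

That works out to roughly 1.44 trillion operations over the machine’s entire decade of service; even at a conservative 1 GFLOPS, well within reach of a mid-2000s workstation, the same workload would take under half an hour.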