This column is supposed to be about ideas (hopefully ones that relate somehow to computing), so in the interests of credibility I can no longer avoid commenting on the biggest idea of this century, global warming.

I’m not a global warming denier – on the contrary, as a keen amateur naturalist, I’ve noticed something wrong with the seasons for a decade or more, and I do keep up with the climate science quite closely. There’s no doubting now that digging up and burning fossil hydrocarbon deposits, combined with chopping down too many trees, is altering the energy balance of the planet and causing it to heat up. What there’s plenty of doubt about is how far and how fast the process will go. I’ve a nagging fear it may prove to be highly non-linear and will surprise us (not in a good way) rather sooner than expected.
The massive increase in fossil fuel burning is a direct consequence of the energy demands created by the industrial and scientific revolutions: these led to the internal combustion engine and mass transport, a larger world population thanks to mechanised agriculture, better hygiene and medicine, and the generation of electricity as a medium for distributing energy. So the other, er, burning question is: to what extent can technology correct the problem it has helped to create? Computers are a very late addition to this list of culprits, and not major consumers of electricity when compared to lighting, heating and transport (although their manufacture adds quite a lot to the load). On balance, I think it likely that computer technology, properly applied, could save more carbon emissions than it creates, although whether it will be properly applied is a whole other can of worms.
Technologists of many kinds are studying the problem hard, and have come up with numerous tricks both to generate energy from non-fossil sources and to prevent its waste: the September 2006 issue of Scientific American contains a good summary of the state of the art. We can now build houses that require zero energy input to heat them, and vehicles that use a fraction of the fuel of current models. The problem that remains is inertia – physical, commercial and political.
The world is already full of things, and you can’t just snap your fingers and replace them all with new, more energy-efficient things. Even if you could, who would pay for them? And you’d still have to persuade everyone that it’s necessary.
Computers come into this story at many different levels. They’re needed as control systems in energy-efficient hybrid cars and direct-injection bio-diesel engines. They’re needed as design tools to create more efficient, low-drag devices of all kinds, from aeroplanes and lorries to wind-turbine blades. The climate models that proved global warming is taking place would be inconceivable without computers, and computers remain essential for monitoring and refining those models. But there’s a far greater potential role computers could play.
You see, my gut feeling is that all these advances in energy conservation, generation and utilisation won’t prove to be enough in the long term. Climate change involves heating such huge amounts of matter that the process is slow and carries enormous inertia, so it has probably already gone further than current models suggest, and reversing it will be like trying to do a handbrake turn in an oil tanker. I’m not predicting that Earth is going to turn into Venus and wipe us all out in a deluge of sulphuric rain, merely that the weather will become rough enough to severely disrupt the economy and obstruct the efforts needed to cure it.
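To make that inertia point a little more concrete, here’s a toy zero-dimensional energy-balance sketch in Python. It’s purely illustrative – nothing like the real climate models mentioned above – and every parameter is a round-number assumption (an ocean mixed-layer heat capacity, a feedback parameter and a fixed forcing), but it shows how the oceans’ thermal mass makes the temperature response lag a sustained forcing by decades; the deep ocean stretches that lag out to centuries.

```python
# Toy zero-dimensional energy balance: C * dT/dt = F - LAM * T.
# Illustrative only; all parameter values are assumed round numbers.

SECONDS_PER_YEAR = 3.15e7
C = 4.0e8    # heat capacity of a ~100 m ocean mixed layer, J per m^2 per K (assumed)
LAM = 1.2    # climate feedback parameter, W per m^2 per K (assumed)
F = 3.7      # sustained radiative forcing, roughly that of doubled CO2, W per m^2

T = 0.0                  # warming above the starting climate, in kelvin
equilibrium = F / LAM    # where T settles once the ocean has caught up
for year in range(1, 101):
    # One-year Euler step: the ocean's heat capacity slows the approach to equilibrium.
    T += (F - LAM * T) / C * SECONDS_PER_YEAR
    if year % 20 == 0:
        print(f"year {year:3d}: {T:.2f} K of {equilibrium:.2f} K equilibrium warming")
```

Even this crude single-box version takes decades to approach its equilibrium warming, which is the oil-tanker problem in miniature.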