The mirage of infinite bandwidth
© Copyright 1994-2002, Rishab Aiyer Ghosh. All rights reserved.
Electric Dreams #76
23/October/1995

In the rush towards globalisation, the force behind it - decentralisation - is often neglected. This force also results in a process of localisation - the gathering in that complements the reaching out. This is most apparent in the future world of knowledge, and has its analogues in the technologies of today.

Knowledge is rather like electricity. It does not exist in ancient stores that can be mined like natural resources; it has to be generated. To be useful, knowledge also has to be transmitted, reaching far beyond its source. As with electricity, such transmission has its disadvantages. Sometimes it is best to store or generate knowledge locally - as with batteries and solar cells.

In a global economy, where companies in Britain have their accounts processed halfway across the world in Bombay, a natural extension would be the globalisation of services down to the level of individual consumers. Companies have their accounts processed in faraway places; individuals can have their letters spell-checked by a word processor somewhere else. So natural can this extension seem that it has been suggested people will do away with the powerful personal computer altogether. Instead, they would use typewriters with pretty pictures, while some hydro-electric power station of a supercomputer - with constant professional maintenance to keep it up-to-date - does all the real work. Bandwidth - to transport information between these huge servers and their tiny clients - would be infinite, and therefore free, completing the perfect picture.

Nothing perfect works, and neither can this. The model of infinite bandwidth for clever servers and dumb terminals ignores the trend towards decentralisation inherent in the knowledge revolution. Typing here for action there may resemble distributed processing; but when a few servers respond to hundreds of clients - as opposed to hundreds of "servers" responding to one "client" - this is an old model of centralisation in a new guise.

The argument against the dumb-terminal model is not just one of underlying trends - these trends do manifest themselves, however slowly. The main argument is that bandwidth will never be infinite - it is, after all, a matter of physics, of wires and airwaves. What will be infinite is knowledge, an altogether more amorphous substance. There is a germ of truth in the free-bandwidth theory: the cost of bandwidth does indeed tend towards zero. But so does the cost of computational processing power. If the general complaint is that software applications grow in step to use every increase in this power, much the same will happen with bandwidth, and word processors will always want more of it, just as they want more memory and processing speed today.

Of course, the present model of selling software - where one physically takes possession of a knowledge product, at which point it usually becomes obsolete - cannot last for long. Giant central servers will not be the answer, though they may serve a purpose. More likely will be another electricity-inspired model, the power grid. This would supply capacity on demand, drawn from widely distributed sources.

Some power sources for such a grid could even be those giant servers, probably for specialised tasks. More in line with the principle of decentralisation would be the large number of smaller servers intended for local use, but sharing spare capacity with the grid. And there will always be the batteries of personal computers, growing ever more powerful.
