Nicholas Carr is a great writer, and his books thoroughly deserve to be best-sellers. His 2008 volume “The Big Switch”, subtitled “The Definitive Guide to the Cloud Computing Revolution”, is prefaced by pages of enthusiastic reviews. His thesis is that cloud computing will become a utility in the same way as electricity – centrally provided, cheap and transformative for society. He draws on the history of the electricity industry in the US in the late 19th century, predicting that the appearance of the cloud-based “World Wide Computer” will make using computer applications as simple as plugging an appliance into a power socket.

In a 2013 Afterword he observes a change in the attitude of the business world to cloud computing from “sneering skepticism” to “bubbly enthusiasm”, and quotes a McKinsey report estimating the cost of buying a new server to be three times the price of obtaining the same computing capacity remotely. He also notes the foundering of the once-buoyant hardware businesses of major vendors, and equates the proliferation of mobile devices such as phones and tablets with the electrical appliances which proliferated after the building of the power grid, observing that they draw most of their value from online data stores and services, and lose much of their utility if disconnected. Many mobile devices currently sold have no facility for connecting to local computing or storage devices: they communicate only wirelessly with the cloud. One less socket is also one less thing to go wrong.
So how could he be wrong?
As businesses and individuals typically use dozens of applications, it seems unlikely that all of these will be available as Web applications, even if they use data stored in the Cloud or services available there. This means that cloud applications are not likely to be nearly as pervasive as electrically powered appliances, which only need to be plugged into a power socket. The prevalence of DC-powered devices means that differences in the voltage and frequency of the AC electricity supply (varying most dramatically between the 110 volt, 60 Hz supply in the USA and the 230–240 volt, 50 Hz supply in Europe) are easily accommodated by most modern power supplies, allowing the same device and power supply to be used in most countries. The computing analog of this situation, where any software can be run efficiently on any device, seems very unlikely to come to pass. Emulators give some capability for running software on non-native operating systems, but they usually require considerable computer expertise to install, and may not offer complete functionality or expected levels of performance. It seems as though the transformation of human activity brought about by the provision of electricity as a utility won’t be brought about by cloud computing.
However, this doesn’t mean that access to data and services via the Internet hasn’t dramatically changed things, or that it won’t continue to do so. Access to remote web services means that colossal computing power can be brought to bear on a problem very easily – the best example being Web search. It has been estimated that Google has on the order of a million processor cores in its server farms around the world, many of which can be involved in serving a Web search from any device anywhere in the world. Facial recognition, a problem deemed computationally infeasible 20 years ago, is now a commodity. The power of multi-level neural networks is being applied to Artificial Intelligence problems in many domains in a similar way, using massive remote resources. So in the future, you can expect to do things at home, at little or no cost, that are currently possible only within well-equipped research labs. Like electricity, you’ll only notice Internet connectivity when it’s not there – but don’t expect a revolution.