Partha Ranganathan proposes energy-efficient data centres of the future (graphic adapted from an IEEE Computer illustration)
India-born HP researcher Partha Ranganathan moots ‘nanostores’ combining computer processing and memory.
A radical new way of designing computer circuitry that could significantly reduce energy requirements has been proposed by Partha Ranganathan, a senior researcher at Hewlett-Packard's HP Labs in Palo Alto, US. In a cover feature in the January issue of IEEE Computer magazine, he suggests that the confluence of emerging technologies and new data-centric workloads offers a unique opportunity to rethink traditional system architectures in future designs.
Today's computers crunch numbers in the processor and shuttle data back and forth between processor, memory and storage. More energy is spent on this to-and-fro movement than on the actual computation.
Ranganathan’s bright idea is this: why not cut this energy budget by stacking processing circuits and memory on the same slab of silicon? Such a chip-memory combo would become, in effect, a ‘nanostore’, and he suggests that within about seven years a single nanostore could hold a trillion bytes of data.
Ranganathan heads HP's Exascale Data Centre project, which develops next-generation data centre technologies. By applying techniques he had developed for that work, Ranganathan arrived at the nanostore concept: the co-location of processors with non-volatile storage, eliminating many intervening levels of the storage hierarchy. Each nanostore can act as a full-fledged system with a network interface. Individual nanostores are networked through on-board connectors to form a large-scale distributed system, or cluster, akin to current large-scale clusters for data-centric computing.
Ranganathan, a B.Tech (1994) from IIT Madras, holds an MS (1997) and a PhD (2000) in Electrical and Computer Engineering from Rice University (US). He has been with HP (and with Compaq before HP acquired it) since 2000 and is currently a Distinguished Technologist and research manager at HP Labs in Palo Alto, working on computer systems architecture and management.
Why the great interest in Ranganathan’s suggestions for drastically cutting the energy demands of large-scale computing? Reporting on the recent development, John Markoff writes this week in The New York Times: “A 10-petaflop supercomputer, scheduled to be built by IBM next year, will consume 15 megawatts of power, roughly the electricity consumed by a city of 15,000 homes. An exascale computer, built with today's microprocessors, would require 1.6 gigawatts. That would be roughly one and a half times the amount of electricity produced by a nuclear power plant…. The energy cost of a single calculation is about 70 picojoules (a picojoule is one millionth of one millionth of a joule; the energy needed to keep a 100-watt bulb lit for an hour is 360,000 joules). However, when the energy costs of moving the data needed to do a single calculation (moving 200 bits of data in and out of memory multiple times) are included, the real energy cost of a single calculation might be anywhere from 1,000 to 10,000 picojoules”.
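Those figures hold up to a quick back-of-the-envelope check. The short sketch below redoes the arithmetic; all the input numbers are taken from the quoted report, and only the calculation itself is added here.

```python
# Sanity check of the energy figures quoted above.
EXAFLOP = 1e18            # operations per second in an exascale machine
EXASCALE_POWER_W = 1.6e9  # the quoted 1.6 gigawatts with today's microprocessors

# Energy per operation = power / operation rate, converted to picojoules.
pj_per_op = EXASCALE_POWER_W / EXAFLOP * 1e12
print(f"Exascale energy per operation: {pj_per_op:.0f} pJ")

# That lands inside the quoted 1,000-10,000 pJ range for a real-world
# calculation, versus ~70 pJ for the arithmetic alone: data movement dominates.

# Energy of a 100-watt bulb lit for one hour (power x time, in joules).
bulb_joules = 100 * 3600
print(f"100 W bulb for one hour: {bulb_joules:,} J")
```

At 1.6 gigawatts spread over a quintillion operations per second, each operation costs about 1,600 picojoules, squarely within the 1,000 to 10,000 picojoule range and more than twenty times the 70 picojoules of the calculation itself.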