(VIDEO) Powering the digital economy: technology’s unseen energy challenge


By Curt Hopkins, Managing Editor, Hewlett Packard Labs

The Internet already absorbs seven percent of the world’s electricity. By 2030, 30 billion connected devices will be in operation, and that share will rise substantially. If compute and storage are to keep growing, the way we build and power them will have to change.

Hewlett Packard Labs has already made some of those changes, is in the process of making more, and is strategizing to keep pace with the future needs of computing growth.

In the video below, “Powering the digital economy,” Labs’ Rebecca Lewington, analytics and advanced architectures marketing specialist; Cullen Bash, VP of the systems architecture lab; HPE Chief Sustainability Officer Christopher Wellise; and Lin Nease, HPE’s chief technologist for IoT, talk through these demands and the innovations they have led to.

Labs’ approach to the overall problem of data’s energy demands centers on Memory-Driven Computing and edge computing, according to Bash.

The edge, AI, and photonics

“Computing at the edge means not having to spend the energy to ship packets of data back and forth,” Bash says, an idea emphasized in the video by Wellise.

The Internet of Things and other edge computing applications reduce one of the biggest consumers of energy in computing: the transportation of data. The energy required to push data through wires is not inconsequential, and the job takes far longer than processing the data where it is gathered. That means the devices doing the transporting and the processing have to stay powered on longer, drawing still more energy from the grid.
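A rough back-of-envelope comparison shows why transport can dominate. The energy-per-bit and compute figures below are assumed, order-of-magnitude numbers for illustration only, not values from the video:

```python
# Back-of-envelope: processing sensor data at the edge vs. shipping it
# to a remote data center. All constants are assumed, order-of-magnitude
# figures for illustration -- not HPE measurements.

PAYLOAD_BITS = 8 * 1024 * 1024 * 8       # 8 MB of sensor data, in bits
NETWORK_J_PER_BIT = 50e-9                # assumed ~50 nJ/bit end-to-end network cost
EDGE_COMPUTE_J = 0.5                     # assumed energy to process the batch locally
CLOUD_COMPUTE_J = 0.3                    # assumed energy to process it in the cloud

transport_j = PAYLOAD_BITS * NETWORK_J_PER_BIT
print(f"transport alone:   {transport_j:.2f} J")
print(f"ship to the cloud: {transport_j + CLOUD_COMPUTE_J:.2f} J")
print(f"process at edge:   {EDGE_COMPUTE_J:.2f} J")
```

Under these assumptions, moving the data costs several times more energy than processing it in place, which is the edge’s whole argument.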

One of the tools for getting around the drag of traditional data transportation is photonics, something Labs has no small experience with. The benefit of photonics, according to Lewington, is that the cost of transporting data 1,000 meters is no greater than pushing it one millimeter, and that you can increase the data load to the limits of your bandwidth without straining the system.

“It costs a lot to push electrons through metal wires; overcoming resistance costs,” says Bash. “But moving light through fiber costs almost nothing.”
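A toy model makes the distance argument concrete. The coefficients below are assumptions chosen only to show the shape of the two curves: electrical signaling energy grows roughly with wire length, while an optical link pays a mostly fixed conversion cost per bit:

```python
# Toy model: energy per bit vs. link length for copper and optical links.
# The coefficients are illustrative assumptions, not measured values.

def copper_j_per_bit(length_m, j_per_bit_per_m=2e-12, fixed_j=1e-12):
    """Electrical signaling: energy grows with the length of wire to drive."""
    return fixed_j + j_per_bit_per_m * length_m

def optical_j_per_bit(length_m, conversion_j=5e-12):
    """Photonics: an electro-optical conversion cost, nearly independent of distance."""
    return conversion_j

for length in (0.001, 1.0, 1000.0):  # 1 mm, 1 m, 1 km
    print(f"{length:>8} m   copper: {copper_j_per_bit(length):.2e} J/bit"
          f"   optical: {optical_j_per_bit(length):.2e} J/bit")
```

In this sketch the optical cost per bit is the same at a millimeter and a kilometer, which is exactly the property Lewington describes.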

Although artificial intelligence is in its infancy, it may also have a role to play in reducing energy costs. If you train a neural network to control a specific system, such as manipulating the actuators of a data center cooling system, you may be able to increase efficiency. Whether that produces a net savings, though, depends on the energy required to train the model and get it operational in the first place.
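That net-savings question is simple amortization arithmetic: the one-time training cost must be recovered from the ongoing operational savings. A minimal sketch, with entirely hypothetical figures:

```python
# Break-even sketch: does an AI cooling controller save net energy?
# All figures are hypothetical assumptions, used only for illustration.

TRAINING_ENERGY_KWH = 5_000.0            # assumed one-time cost to train the controller
BASELINE_COOLING_KWH_PER_DAY = 12_000.0  # assumed cooling load without the controller
EFFICIENCY_GAIN = 0.10                   # assumed 10% reduction in cooling energy

daily_savings_kwh = BASELINE_COOLING_KWH_PER_DAY * EFFICIENCY_GAIN
break_even_days = TRAINING_ENERGY_KWH / daily_savings_kwh

print(f"daily savings:    {daily_savings_kwh:,.0f} kWh")
print(f"break-even after: {break_even_days:.1f} days of operation")
```

Under these assumptions the controller pays for itself in a few days; with a smaller facility or a more expensive training run, the balance could tip the other way.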

Memory-Driven Computing

“Re-architecting computers in favor of data is now central,” says Bash. “Memory-Driven Computing is faster, more efficient, and more flexible.”

A Memory-Driven Computing system has a number of elements, all of which contribute to energy savings. Among these are the aforementioned photonics, non-volatile memory, application-specific accelerators, and, of course, memory at the core. Putting memory at the core also saves energy because moving data off-chip takes much more power than moving it “on-die,” that is, within the same piece of silicon.
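Representative order-of-magnitude numbers from the computer architecture literature (assumptions here, not Labs figures) show the size of that gap:

```python
# Rough energy-per-operation comparison, on-die vs. off-chip.
# Values are order-of-magnitude assumptions drawn from the general
# architecture literature (picojoules on-die, nanojoules off-chip),
# not HPE measurements.

ENERGY_PJ = {
    "64-bit add (on-die)":         1,
    "on-die SRAM read (64-bit)":   10,
    "off-chip DRAM read (64-bit)": 2_000,
}

base = ENERGY_PJ["64-bit add (on-die)"]
for op, pj in ENERGY_PJ.items():
    print(f"{op:<28} {pj:>6} pJ  ({pj / base:>6.0f}x the add)")
```

Keeping working data on-die, or reachable over an efficient fabric, avoids paying the most expensive line in that table on every access.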

This improvement is not theoretical. It proved instrumental to research by the DZNE (German Center for Neurodegenerative Diseases), conducted first on The Machine prototype and now on HPE Superdome Flex, the latest commercial expression of the prototype’s innovations. As Lewington points out in the video, the new HPE server delivered a 100-fold improvement in speed over DZNE’s earlier computing environment while racking up a 60 percent energy savings.

Embedded energy

The phrase “embedded energy” describes the energy used to make a process happen beyond the flipping of the switch: fabrication and manufacturing, physical transportation, assembly, repair and replacement, and every other hidden input. Embedded energy is a field of study in its own right, and a detailed treatment is beyond our scope, but it is worth touching on because it is far from trivial.

Edge computing means less energy spent creating and installing wire. Photonics means less transistor manufacturing. Memory-Driven Computing’s biggest contribution to reducing embedded energy, according to Bash, is “breaking down memory walls. Prior to this system, in order to grow memory we had to add CPUs.” Again, that means less manufacturing. As we move forward, Bash says, we have to continue to “right-size” computers.

The move from product-oriented computing to computing as a service holds a great deal of promise as well, according to Labs’ chief architect, Kirk Bresniker.

Bresniker has been developing arguments to address possible new consumption models. These, he says, focus on the “dematerializing and precision assembly of resources, whether that is out in a distributed edge device or in cloud or hybrid data centers,” and on “‘masslessness’ enhancing innovation speeds for both producers and consumers of infrastructure.”

These arguments are not primarily about energy consumption, but they suggest possible avenues for future investigation.

Data thirst and innovation

Data’s thirst for energy is an aspect of computing that needs to be addressed in parallel with everything from speed to storage to cost to the end of Moore’s Law.

Some of the most intriguing innovations may well result from the realities of our electric energy consumption. It has already figured in the creation of accelerators, photonics, the Gen-Z high-speed fabric, and Memory-Driven Computing as a whole.


Photo by Tanner Boriack on Unsplash
