Behind the scenes @ Labs

At IEEE’s Rebooting Computing Summit, A New Economy of Memory Abundance


[Photo: Kirk Bresniker at Hewlett Packard Enterprise Discover 2015]

By Curt Hopkins, Managing Editor, Hewlett Packard Labs

I must Create a System, or be enslav’d by another Mans
I will not Reason & Compare: my business is to Create

                                     – William Blake, “Jerusalem The Emanation of the Giant Albion”, 1804-1820

“Rebooting computing—that is, finding a renewed source of scalable computing performance—requires revolutionary thinking,” said Sumi Helal, editor in chief of IEEE Computer magazine.

It’s exactly that sort of revolutionary thinking that Kirk Bresniker, Hewlett Packard Labs Chief Architect and HPE Fellow, and Stan Williams, Hewlett Packard Enterprise Senior Fellow, have contributed to Computer magazine’s December 2015 special issue and to the IEEE Rebooting Computing Summit.

Outstripping memory

Bresniker and Williams’ article “Adapting to Thrive in a New Economy of Memory Abundance,” co-authored with Sharad Singhal, Director of Machine Applications and Software, addresses the possibility – indeed, the likelihood and the need – of approaching computer processing differently than it has been approached for the last 60 years.

Traditional processor-improvement techniques are reaching their limits. The authors’ response to that challenge is to turn to “emerging device physics,” applied to memory instead of compute. That shift has the potential to collapse the memory hierarchy, which accounts for the majority of current computing’s time and energy expenditure, as well as much of its security exposure.
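To make that hierarchy cost concrete, consider the sketch below. It is our illustration, not code from the paper: the first function follows today’s pattern of copying bytes out of storage into a DRAM buffer before computing on them, while the second computes on the bytes in place, with a plain mmap of an ordinary file standing in for a byte-addressable persistent pool. The filename records.bin is hypothetical. Collapsing the hierarchy means the second pattern becomes the norm: no serialize/deserialize step, no second copy.

/*
 * Illustrative sketch only, not code from the paper: the conventional
 * path copies data up the hierarchy before computing on it; the
 * flat-memory path computes on the same bytes where they already live.
 * "records.bin" is a hypothetical file of native int64 values.
 */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* Today: read the bytes out of storage into a second, DRAM-resident copy. */
static int64_t sum_with_copy(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    fseek(f, 0, SEEK_END);
    long n = ftell(f);
    rewind(f);
    int64_t *buf = malloc((size_t)n);        /* the second copy */
    if (!buf) { fclose(f); return 0; }
    fread(buf, 1, (size_t)n, f);
    fclose(f);
    int64_t sum = 0;
    for (long i = 0; i < n / (long)sizeof(int64_t); i++)
        sum += buf[i];
    free(buf);
    return sum;
}

/* Flat memory: compute on the persistent bytes in place. */
static int64_t sum_in_place(const char *path)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0) return 0;
    struct stat st;
    fstat(fd, &st);
    int64_t *data = mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) { close(fd); return 0; }
    int64_t sum = 0;
    for (long i = 0; i < (long)st.st_size / (long)sizeof(int64_t); i++)
        sum += data[i];                      /* no deserialize, no extra copy */
    munmap(data, (size_t)st.st_size);
    close(fd);
    return sum;
}

int main(void)
{
    printf("copy: %lld  in place: %lld\n",
           (long long)sum_with_copy("records.bin"),
           (long long)sum_in_place("records.bin"));
    return 0;
}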

As the authors write, “The fundamental physical entities used to represent a bit in memory today are electrons in the gate of an SRAM transistor, in a DRAM’s trench capacitor, or in a flash cell’s floating gate.” But because a “completely new process for representing data in memory is required at the nanometer scale,” we need to turn to a completely new technology: high-density, non-volatile memory (NVM).

Among other issues, the paper confronts the question, “Can new architecture make an end run around the von Neumann bottleneck, or will it simply make the old architecture obsolete altogether?” That is, will it collapse the stutter between instruction and action in a computer, or will it replace the architecture that produces that pause in the first place?

“The answer,” Bresniker told Behind the Scenes @ Hewlett Packard Labs, “depends on what your definition of von Neumann is. If to you it means the colloquial ‘machine which stores instructions in memory,’ then obviously we expect that to continue. The Memristor, Labs’ response to the NVM challenge, does enable us to directly attack the bottleneck in von Neumann’s machine as it was identified by von Neumann himself – the cost, complexity, and reliability of the logically reasoned memory-to-compute ratio for the important problems of the day exceeding feasibility. However, we are planning to transcend a limitation that von Neumann had not conceived of: the contents of memory being inextricably tied to compute and the applications (now OS and applications – von Neumann had no need for an operating system, since he had human operators).”
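The bottleneck Bresniker describes is easy to observe on any current machine. The following sketch – again ours, not anything from the paper or the talk – chases pointers through a buffer much larger than the processor caches. Each iteration performs one trivial assignment, but the address of the next load depends on the previous one, so the processor spends nearly all of its time stalled on memory rather than computing.

/*
 * Illustrative sketch of the von Neumann bottleneck (not HPE code).
 * Each loop iteration does almost no arithmetic, yet runtime is
 * dominated by waiting on a chain of dependent memory loads.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 22)  /* ~4M entries (32 MB), far larger than typical caches */

int main(void)
{
    size_t *next = malloc((size_t)N * sizeof *next);
    if (!next) return 1;

    /* Sattolo's algorithm: build one random cycle over all N slots, so a
       hardware prefetcher cannot guess the next address. */
    for (size_t i = 0; i < N; i++)
        next[i] = i;
    srand(42);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;  /* crude randomness suffices here */
        size_t tmp = next[i];
        next[i] = next[j];
        next[j] = tmp;
    }

    /* Chase the cycle: N dependent loads, almost no computation. */
    clock_t t0 = clock();
    size_t p = 0;
    for (size_t i = 0; i < N; i++)
        p = next[p];
    clock_t t1 = clock();

    printf("ended at %zu after %.2f s, mostly memory stalls\n",
           p, (double)(t1 - t0) / CLOCKS_PER_SEC);
    free(next);
    return 0;
}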

Adapting to thrive

The main thrust of the paper is how we can use NVM, including the Memristor, and other advances in device physics to overcome the inherent limitations built into current computers. The paper, in turn, was the reason two of its authors were invited to present at the IEEE Rebooting Computing Summit, held December 9-11 in Washington, D.C.

Williams spoke about the paper he co-authored with Eric DeBenedictis of Sandia National Laboratories, “Above and Beyond Exascale Computing: Sensible Machines.” This paper was written for an IEEE co-sponsored challenge, responding to the White House’s call to create ‘Sensible Machines’ that are “Smaller, Faster, and Lower Power.”

In a talk titled, “A New Abundance: Adapting to Thrive in the Memory-Driven Computing Era,” Bresniker expounded on the topic of his and Williams’ paper in IEEE Computer.

In his presentation, Bresniker illustrated how The Machine will solve a number of problems, including the von Neumann bottleneck, and pioneer some new or long-neglected practices, such as creating the first clean-sheet OS in decades.

His talk also featured what may be the most intriguing use of a William Blake quote in the history of computing presentations.

What all of this work has in common – Bresniker, Williams, and Singhal’s paper; Williams’ response to the “Sensible Machines” challenge; and both presentations, all within the context of the work going on at Hewlett Packard Labs – is a desire to unite theoretical thought and hands-on research to make an end run around the anticipated end of efficiency gains in current computing architecture.

Labs and its researchers, Bresniker and Williams among them, are unwilling to allow for the end of computing, even theoretically. They are compelled to engage in the kind of problem-solving that opens up new avenues of experience and exploration, making the previously impossible the new status quo, only to replace that status quo, in turn, with something we can’t yet imagine.
