Behind the scenes at Labs

For sale: Memory-Driven Computing


By Curt Hopkins, Staff Writer, Enterprise.nxt

In November 2014, Hewlett Packard Labs announced The Machine research project, out of which was born the company’s vision for a new computer architecture called Memory-Driven Computing. In 2017, Hewlett Packard Enterprise demonstrated that the Memory-Driven Computing architecture works by unveiling a prototype, also called The Machine. With 160 terabytes of shared memory, it remains the largest single-memory computer ever built.

This June, HPE CEO Antonio Neri announced the next big step toward a Memory-Driven future: The company will soon be taking orders for Memory-Driven Computing development kits, for delivery early next year. The kits will combine prototype hardware and software tools built on the company’s legendary ProLiant platform.

“This is the latest in a series of successive approximations,” says Kirk Bresniker, Chief Architect, Hewlett Packard Labs. “It's not an end state because we're never done improving.”

This dev kit does not exist in a vacuum. It’s part of the Memory-Driven Computing continuum, which can scale from a tiny IoT device all the way to exascale – supercomputers that will dwarf what exists today. 

The most basic explanation of the dev kits is that they are ProLiant servers built with the Memory-Driven Computing architecture. Their modular, high-performance fabric will allow users, as Bresniker says, to “build for purpose with precision.”

As modules become available, users will be able to plug in many different kinds of microprocessors, accelerators, and memory technologies. The Memory-Driven Computing software tools will allow customers’ advanced development teams to harness these diverse elements in novel ways because the modules can all communicate at the fastest possible speed – the speed of memory.

With such a prototype, users can build, for instance, an AI-driven intelligent factory controller, then scale it seamlessly without changing a line of code. They will be able to make a smart grid controller for millions of smart edge devices, or synthesize live and historical medical imaging into a haptic AR experience.

“We've always said that Memory-Driven Computing is going to go from embedded to exascale,” says Bresniker. “We started out at the rack, and we have now worked from there to aisle and to data center. Now we're starting to drive the technology in the other direction, to drive out instead of up. Out towards the edge, out towards where all that data is going to live.”

Photo by Rebecca Lewington

About the Author

Curt_Hopkins

Managing Editor, Hewlett Packard Labs

Comments
Eduardo Vega

Hello, I just wanted to bring to your attention (what I think are) a couple of typos in this post:

  1. it remains the biggest single-memory computer every built.
  2. we've now have worked from there to aisle and to data center.

Hopefully this helps. 

Eduardo