Behind the scenes at Labs

An Oral History of The Machine—Chapter Four: Open Source




 By Curt Hopkins, Managing Editor, Hewlett Packard Labs

The Machine is a computing architecture so radically different from any that has come before that it will affect everything we do in the future. Hewlett Packard Labs has spent the last five years developing the memory-driven computing, photonics, and fabric that have gone into The Machine and made the impossible inevitable.

We spoke to several dozen researchers – programmers, architects, open source advocates, optical scientists, and others – to construct a nine-part oral history of the years-long effort behind the most fundamental change in computing in 70 years.

These men and women are not only scientists, they are also compelling storytellers with an exciting history to relate. If you’re interested in how differently we will be gathering, storing, processing, retrieving, and applying information in the near future, or if you just enjoy good stories about science and discovery, read on.

If you would like to read other entries in the series, click here


Director, Machine Applications and Software. Involved in The Machine program since it was announced.

When customers look at a new architecture, they often fear that it will lock them and their people into an unfamiliar set of tools. If I open source the fundamental tools, however, they can see there is no lock-in, and their people get experience using them. By the time an open sourced technology reaches the customers, they already understand how to take advantage of it. From my point of view it’s a very rational approach.

Additionally, demand for hardware will always depend on people’s ability to program to it. If I don’t have this community, I will have failed.

The challenge we have is access. For the open source community to take advantage of The Machine, they have to have lower-level examples to see what they’re programming against. They must have something that looks enough like The Machine that they can run their applications on it.

The gap between current environments and what we use is stark: a Superdome X with 10-12 terabytes of memory versus 16 gigabytes on a laptop. That’s the challenge to supporting the open source model. An open source developer working in their garage? They don’t have that kind of computing power, so that’s a very hard proposition.

A lot of it will depend on our partners and customers, who have adequate computing power at their disposal. 


Distinguished Linux Technologist

Generally, making software free has proven to be the most efficient and stable way of developing large systems.

Far more of the world’s best talent works outside HPE than inside it. So engaging co-travelers, potential competitors, and customers is the best way to develop software.

We are working actively to get the software we develop released as open source, including the main bulk of our Linux work. Because we’re doing hardware enablement, the legal complexity of getting permission is always a challenge. You have to make sure it doesn’t expose inappropriate IP, for instance. This morning I spent an hour talking with our lawyers.

Our technical team is ready to release software as soon as the hardware is available. But even before it’s made available, people can come and touch it; they can engage with the critical Linux infrastructure. Fabric Attached Memory Emulation, one of our recently released open source tools for The Machine, shows the structure of the hardware in a fairly abstract way, for instance.

We’re working with the Linux community on how The Machine will be folded into the Linux kernel. It’s not as technically challenging as you might imagine. A new file system is relatively easy to get adopted into the kernel, and we’re getting the non-volatile memory infrastructure changes integrated. For our initial Linux release we’re only adding code, not changing existing behavior yet.

We have reason to believe the open source community is going to be very interested in The Machine. We use direct-mapped NVM in a very new way, and that’s going to see a lot of uptake in the community. NVDIMMs are an active area of research there. Also of interest are some important changes to systems like Spark that build on our ability to manipulate large memory systems. Some, like accelerated Apache Spark, are directly transferable today, on hardware you can buy now.

What I see going forward is more of the software we developed internally being pushed out to external development with partners and community members. The most interesting piece of software evolution will go beyond Machine data structures, turning them into directly manipulable objects that sit in persistent memory.
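The idea of objects living directly in persistent memory can be sketched in ordinary Python. This is an illustrative sketch only, not The Machine's actual API: a plain file stands in for a DAX-mapped NVM region, and the record layout is invented for the example. The point is that a structure is written and read in place through a memory mapping, with no serialization layer or block I/O path in between.

```python
import mmap
import os
import struct
import tempfile

# A regular file stands in for a DAX-backed persistent-memory region.
# On real NVM hardware this would be a file on a DAX-mounted filesystem,
# mapped so that stores reach persistence via cache-line flushes.
path = os.path.join(tempfile.mkdtemp(), "pmem.img")
SIZE = 4096

with open(path, "wb") as f:
    f.write(b"\x00" * SIZE)

# "Persist" a record by writing it directly into the mapped region.
# Layout (hypothetical): an 8-byte counter followed by a 32-byte tag.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as m:
        struct.pack_into("<Q32s", m, 0, 42,
                         b"checkpointed-state".ljust(32, b"\x00"))
        m.flush()  # on NVM this would be a flush of dirty cache lines

# A later process maps the same region and reads the object back in place,
# without deserializing or reloading anything.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as m:
        count, tag = struct.unpack_from("<Q32s", m, 0)

print(count, tag.rstrip(b"\x00").decode())
```

With byte-addressable persistent memory, the second mapping could belong to a different program run days later: the object survives in place, which is what makes "directly manipulable objects that sit in persistent memory" more than a figure of speech.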

This is going to change software in some pretty major ways. 


Director of The Machine Community

Whether it’s a phone, a watch, a server, or a laptop, we in the tech community have spent our efforts on incremental enhancements to handle emerging deficiencies in an architecture that is more than 70 years old. At every tier, we’re working around a bottleneck in the underlying architecture of the systems that power every computer on the planet.

The Machine and its focus on storage shared across nodes is a fundamental change in the pervasive architecture of computers. With the introduction of high speed universal memory shared between multiple heterogeneous processors, and the removal of tiers of memory and caches within it, the open source community is facing a huge challenge.


Everything has to be rethought.

The problems we will be able to solve with The Machine are those that are currently impossible to solve.

We need to remove the bottlenecks that are inherent in the architecture itself. We need a system that can analyze data faster, provide broader context, and persist the state of that analysis so that we do not have to spend time, money, and power recreating it.

These are the classes of problems that The Machine is intended to address. These are the engines that will fuel innovation for many decades to come in the open source community.

I had a conversation not long ago with a developer who was focused on gleaning intelligence out of large volumes of medical records. The problem he outlined was that you can only look at part of the records at any given time, not because of a privacy issue, but because of limited memory in the machines.

When I told him about large-scale graph inference on The Machine and our work on time-varying graphs, his eyes lit up. Everything he was doing was incremental: workarounds and hacks to get a little more information into view.

He couldn’t look at data over time, and yet that was necessary; health care is about trends. Is the patient improving, or getting worse? With The Machine, analytics gains the context that only comes from large amounts of memory: more informed decisions, made more often. That means people get better treatment, and insights from data that would otherwise have been discarded simply because there was not enough space to keep it.

This is all about people advancing innovation. And although the open source community has done an admirable job of compensating and working around limitations in the architecture, it is time for something different.


HPE Fellow and Director of Open Source Strategy, Former Linux Foundation Board Member

We have reached a point in history when customer appetites for differentiation are limited. The development community is accustomed to working with open source operating systems, and open source solutions are a large part of our market.

The set of capabilities The Machine offers is not entirely unique, but it is dramatic in its application. The open source community is excited about The Machine, and we now hope to help it become equally enthusiastic about application and system development and execution, especially around the very large non-volatile memory pool and photonic interconnects. However, what lights up developer interest is hard to predict.

One challenge we’re facing is that we don’t have a form factor for our tech that a developer would want to own. That makes it harder to gain traction, because open source developers like to develop on what they have. Rack-scale prototypes are not the same as a development board or a laptop. We don’t have anything like that for The Machine. It would be awesome to have a notebook full of memristors, but that won’t be available for a while.

Instead, interest in open source software on The Machine seems likely to come more from universities, other companies, and labs than from individual developers. Those are the places of potential enthusiasm for what we’re doing.

To read the other chapters in the series, click here
