An Oral History of The Machine—Chapter Two: Early Days
By Curt Hopkins, Managing Editor, Hewlett Packard Labs
The Machine is a computing architecture so radically different from any that has come before that it will affect everything we do in the future. Hewlett Packard Labs has spent the last five years developing the memory-driven computing, photonics, and fabric technologies that have gone into The Machine and that have made the impossible inevitable.
We spoke to several dozen researchers – programmers, architects, open source advocates, optical scientists, and others – to construct a nine-part oral history of the years-long process of creating the most fundamental change in computing in 70 years.
These men and women are not only scientists; they are also compelling storytellers with an exciting history to relate. If you’re interested in how differently we will be gathering, storing, processing, retrieving, and applying information in the near future, or you just enjoy good stories about science and discovery, read on.
If you would like to read other entries in the series, click here.
HPE Fellow, Chief Architect for The Machine. With a background in BCS, he was appointed to a guiding position in the new Labs by Fink.
In November of 2013, Martin called me. Actually it was his assistant. She said “This is Beth calling for Martin Fink.” It was all rather formal, especially considering I had reported to Martin as his Chief Technologist for four years and he was notorious for calling anyone and everyone in his organizations directly to get the latest update.
When he got on the line he asked “What would it take for you to come to Labs and drive the Machine prototype?” All it took was that question.
Now I really liked my job as one of the two Chief Technologists for the Servers Global Business Unit. I loved working with the field teams and with our customers. I loved helping deliver every kind of computing, from tiny microservers to Superdome platforms. But I also saw the limits of what I could do to advance the Unbound ideas that I had been thinking about for the last ten years from within the business unit.
At the same time, the same kind of nagging concerns that had led us to propose Unbound and hit the reset button on the BCS product line were rising again. For me the signs were all there: Moore’s Law, Dennard scaling, relational database limitations, single-system operating systems, air cooling, electrical signaling.
One way to view technology maturity is with Clayton Christensen’s S-curves. A technology starts out immature, usually worse than conventional solutions, until it hits a knee in the curve and dramatically surges past them. But there is also the top of the S-curve, where you continue to invest for rapidly diminishing returns, and it looked to me like that could happen to almost every technology I had ever used.
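The S-curve Bresniker describes is commonly modeled as a logistic function: returns are small early on, surge past the knee, and flatten near the ceiling. A minimal illustrative sketch (the function and parameter names are my own, not from the source):

```python
import math

def s_curve(t, ceiling=1.0, steepness=1.0, midpoint=0.0):
    """Logistic S-curve: slow start, rapid growth past the
    'knee' at the midpoint, then flattening near the ceiling."""
    return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

# Marginal return on one more unit of investment at time t.
def marginal_gain(t):
    return s_curve(t + 1) - s_curve(t)

# Near the knee (t = 0) each unit of investment pays off far
# more than the same unit spent near the top of the curve.
gain_at_knee = marginal_gain(0)
gain_at_top = marginal_gain(4)
```

The point of the model matches the anecdote: once a technology sits near the top of its S-curve, continued investment buys almost nothing, which is the signal to look for the next curve.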
I had come down to Labs in 2010 and presented them with a challenge: “Is HP Ready for Simultaneous Regime Change?” By that I meant, was HP ready for the industry shakeup I saw coming? I think I left them in shock, but as much as they might have wanted to help, the problem was that Labs couldn’t do it on their own, and neither could the business units.
But Martin wasn’t asking me to come to Labs as one more researcher; he wanted me to drive an advanced development effort that spanned the gap between Labs researcher and business unit engineer. My job was to deliver a working prototype that demonstrated an interesting problem on the new compute architecture. I needed to prove the existence of a new S-curve, not just theoretically but by construction, and he wanted it by 2016. I didn’t, however, need to make it into a product for sale at the same time.
By February of the next year I was settling into the new role and the pieces started to gel. For memory, we had all the work from Stan Williams’ team. For fabrics, we had ideas that had come from Mike Krause, a Fellow in our Servers Global Business Unit. We added the photonics expertise of Al Davis from the University of Utah. We had a range of potential microprocessor system-on-a-chip designs and, with help from Enterprise Group Chief Scientist Greg Astfalk, we selected one. I drew up a picture of the architecture we wanted to build and we started making the connections between engineers and researchers.
A big milestone came in April, at Tech Con 2014. Thousands of submissions were whittled down to forty posters and forty papers. Back in 2005 I had first brought my Unbound poster to Tech Con and now nine years later, The Machine was the buzz of the conference. We brought our ideas and our work in progress and Tech Con lived up to its reputation of making good ideas great by bringing the brightest minds of HP together for three days and not letting them sleep.
But the biggest milestone was to come eight weeks later. Meg Whitman had invited (insisted may be a better word) Martin to go public with The Machine at our annual customer event, HP Discover, in Las Vegas in June 2014. It took a while for the shock to wear off, but then we got ready to go, and we knew that what happened in Vegas was definitely not going to stay in Vegas.
Director of the Foundational Technologies Lab. Started at Labs working on Data Centers.
I grew up in this business working on PA-RISC and Itanium. In 2001 I came to Labs to work on data centers, Linux, sustainable energy, and helping on tech transfer out of Labs. At that time, all of these things were driving a set of conversations on how we had to deliver value to the Enterprise Group platform.
At one point, I got an email from Kirk on the topic of simultaneous regime change, asking what we would do if we wanted to re-architect. Seven years later, Martin came to Labs, and I saw an opportunity to create a next-generation architecture that would scale for the next decade. I asked to direct the Systems Research Lab.
Under Martin, we grew into a community very quickly. The challenge was that for two decades we had been in a world that presumed what each of the silos in the system could and would do. It was top-down. Common wisdom said co-design in this world doesn’t work. So it was very important to get all these folks together in one space. We opened up conversations across the company concurrently.
One thing we did at the Systems Research Lab was to institute semi-open Friday lunches. It brought in a full cross-section of the company with some of the smartest people in the world. You were not allowed to sit with your clique and if you didn’t come, you were wasting a perfectly good career opportunity that probably didn’t exist anywhere else in the world. It was pushing people into forming a critical mass and then letting them run.
We had a sense of what to do, but how? I had all these smart people; all I needed was for them to start. Letting them go wild gave us things we never would have had: photonics, non-volatile memory programming, resiliency. These were solutions to problems we weren’t even thinking about.
These lunches became the marketplace of ideas for things we could bring to conferences and pass to evangelists.
At the last two Discover conferences you could see the relationships developed at those lunches across the breadth of technologies we showcased. They drove Ruth Bergman’s analytics team, Bill Horne’s security analytics team, Memristor from Stan Williams’ team, photonics from Cullen Bash and Ray Beausoleil’s teams, and Sarah Silverthorn and her colleagues building the prototype.
Big organizations tend to hide all the people who are global, systemic thinkers, but the last few years of this transition required us to tease them out of hiding. These people were popping up all over the place, and not in the usual cast of characters. Some are young and some are more senior.
Among the anchors we kept, which had their origins in these lunches and this open method of inquiry, are non-volatile memory and a photonics fabric to connect it, so you could break the tyranny of electrical signaling and connect all of the system’s nodes at the same time. And it was not one of our first thoughts.
Director, Memristor Program. Started at Hewlett Packard when Hewlett and Packard were still around.
I ran into Kirk Bresniker in the hallway one day and I told him, “Kirk I’m ready for a change.” I heard he had come to Labs to work on The Machine. “Is there any way I could contribute?” He said, “Boy have I got the perfect opportunity for you.” And he was right. It was the capstone that brought all the learning of my entire career together.
For those of us who’ve been here for a long time, this way of researching is a return to how we used to operate as a company, for instance when we created the inkjet printer. For those who were here when Bill Hewlett and Dave Packard were around, this feels like the way the company was designed – to innovate. In a sense, The Machine is a continuation of, or a return to, their vision.
There was a moment in the history of this company when we experienced a steep drop in innovation, but this internal entrepreneurship has had a role in convincing engineers, business unit managers, and marketing managers that innovation can have a positive effect on their success today.
To build the ecosystem you have to build the enthusiasm. That’s a big part of what I do for a living. And this enthusiasm, it’s not just for the young. As I told my wife one day, “I’m the only bald guy here and I’m as excited as any of these folks!”
This process, this company, has in my mind captured the essence of American ingenuity and is at the heart of the Silicon Valley culture.
Senior Director, Systems Architecture Lab. Worked on precursor technologies that led to The Machine.
In the early days, Labs operated as a series of separate projects run out of independent laboratories. They were all moving in the same direction, working together, and talking to each other, but there was no funding to build anything big, only funding for individual projects.
We were already working on the elements for a next generation computer architecture but we weren’t planning to take it to prototype. Research is different from development and if a business unit wanted to productize it, it was up to them.
Martin made a big bet on The Machine, coalescing, I would say, well over half of Labs’ resources into that project.
Labs had limited hardware design expertise; it was mostly research scientists, and they don’t do design work. Martin pulled advanced development resources from the business units, bringing them under a common funding umbrella.
My previous research was on energy and sustainability and some of what I’ve done has made its way into The Machine. In the Systems Design Group under John Sontag I had responsibility for systems architecture, for which we pulled together photonics, thermal research, systems architecture, and low level software development.
We immediately got a win with the product groups because photonics was so applicable to their needs. Photonics can reduce distance penalties, increase bandwidth, and overcome the slowdown of electrical signaling. The effort to make a prototype has already set us on the path to productization.