Advancing Life & Work
Kirk Bresniker

Hindsight to insight to foresight – the ever-evolving power of simulation

The race to exascale computing is on, and it is global. Computers capable of a billion billion operations per second won’t just advance science; they will change the scientists who use them and the disciplines they pursue. This revolution won’t stop at the hard sciences and engineering. Moving from understanding in hindsight to real-time insight, and then beyond to true foresight, will transform disciplines from biology to business management. Across fields and scales, whether physics from the quantum to the cosmological or economics from macro to micro, knowledge and those who pursue it will change.

Simulation has long been a core application of computation. In fact, prior to the advent of electronic digital computation, computation was done by analogy, in analog: select a physical system governed by the same differential or integral equations you desired to solve, set it up with specific initial conditions, and then directly measure a numeric result as the system evolved. But the power and depth of today’s simulations are crossing a threshold, from reductionist mirrors of isolated behaviors to systems of such fidelity and complexity that they display the emergent behaviors of the most complicated biological, economic, and physical processes.
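To make the contrast concrete, here is a minimal sketch, in Python, of the digital counterpart of that analog practice: the same kind of differential equation an analog machine once embodied physically, a damped harmonic oscillator, stepped forward in discrete time. The coefficients, step size, and integrator are illustrative choices, not drawn from any particular exascale workload.

```python
# Minimal digital analogue of what an analog computer once did physically:
# integrate a damped harmonic oscillator  x'' + 2*zeta*omega*x' + omega^2 * x = 0
# by stepping the equations of motion forward in discrete time.

def simulate_oscillator(x0=1.0, v0=0.0, omega=2.0, zeta=0.1, dt=1e-3, steps=10_000):
    """Semi-implicit Euler integration; returns the trajectory as (t, x) pairs."""
    x, v = x0, v0
    trajectory = []
    for i in range(steps):
        a = -2.0 * zeta * omega * v - omega**2 * x   # acceleration from the ODE
        v += a * dt                                   # update velocity first
        x += v * dt                                   # then position
        trajectory.append((i * dt, x))
    return trajectory

if __name__ == "__main__":
    for t, x in simulate_oscillator()[::1000]:        # sample every second of model time
        print(f"t={t:5.2f}s  x={x:+.4f}")
```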

Adding debug statements to the cell

The accuracy of these simulations will continue to increase, along with their energy efficiency: first with the introduction of photonic communications, high-bandwidth and storage-class memories, and digital floating-point arithmetic accelerators at the exascale. This will be followed by additional gains from non-conventional accelerators: application-specific digital, analog, quantum, neuromorphic, cryogenic, simulated-annealing, and other novel approaches. Alongside this explosion in the diversity of computational approaches admitted to the massive simulation environment, the application of artificial intelligence will provide a two-fold benefit. First, it will greatly increase the operational efficiency and reliability of these massive, data-center-scale simulation engines, balancing the societal benefit of a simulation against the time and energy required to achieve it. Second, it will curate and examine the avalanche of data these simulations will continuously generate, an unblinking assistant always searching for the unanticipated result that can be the catalyst for deeper understanding.
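As a hypothetical illustration of that second role, the unblinking assistant, the sketch below flags samples in a simulation’s output stream that deviate sharply from recent history, using nothing more than rolling statistics. The window size and threshold are arbitrary; a production curation layer would use far richer models.

```python
from collections import deque
from math import sqrt

class RollingAnomalyDetector:
    """Toy stand-in for an AI curation layer: flag samples that lie far
    outside the rolling mean of a simulation's output stream."""

    def __init__(self, window=500, threshold=4.0):
        self.window = deque(maxlen=window)   # recent history of the monitored quantity
        self.threshold = threshold           # deviation (in std devs) that counts as "unanticipated"

    def observe(self, value):
        flagged = False
        if len(self.window) >= 30:           # wait for a little history before judging
            mean = sum(self.window) / len(self.window)
            var = sum((v - mean) ** 2 for v in self.window) / len(self.window)
            std = sqrt(var) or 1e-12         # avoid division by zero on a flat stream
            flagged = abs(value - mean) / std > self.threshold
        self.window.append(value)
        return flagged

if __name__ == "__main__":
    detector = RollingAnomalyDetector(window=200, threshold=5.0)
    stream = [0.01 * i for i in range(1000)]
    stream[700] = 1000.0                     # inject one wildly unanticipated result
    hits = [i for i, v in enumerate(stream) if detector.observe(v)]
    print(hits)                              # expected: [700]
```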

Beyond the accuracy and scale that can yield emergent properties to match increasingly complex physical or economic systems, these environments offer the advantages that simulations have always had over direct experiments or observations of “real” systems. Simulations are repeatable, controllable, and can be instrumented in a way that physical systems cannot. While they consume energy, sometimes vastly more than the system being simulated, they do not consume unique specimens or materials. Time is under exquisite control and can be rolled forward and back as desired, or stopped entirely while the system is examined in a way a physical system never could be. All of these techniques are commonplace in the toolbox of the software developer, but they are now extended to the biologist, chemist, economist, and artist.
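Those properties, exact repeatability, pausing, and rolling time backwards, are easy to show in miniature. The sketch below is illustrative only and not any particular simulation framework: a seeded random walk that checkpoints its state so a run can be rewound and replayed identically.

```python
import random

class ReplayableSimulation:
    """Toy random-walk simulation illustrating what physical experiments cannot offer:
    exact repeatability (fixed seed), checkpointing, and rolling time backwards."""

    def __init__(self, seed=42):
        self.seed = seed
        self.rng = random.Random(seed)
        self.step_count = 0
        self.state = 0.0
        self.checkpoints = {0: (self.rng.getstate(), self.state)}

    def step(self, n=1):
        for _ in range(n):
            self.state += self.rng.gauss(0.0, 1.0)
            self.step_count += 1
            self.checkpoints[self.step_count] = (self.rng.getstate(), self.state)
        return self.state

    def rewind(self, to_step):
        """Roll time back to an earlier checkpoint and continue from there."""
        rng_state, state = self.checkpoints[to_step]
        self.rng.setstate(rng_state)
        self.state = state
        self.step_count = to_step

if __name__ == "__main__":
    sim = ReplayableSimulation()
    a = sim.step(100)            # run forward
    sim.rewind(50)               # stop, examine, roll time back
    b = sim.step(50)             # replay the same 50 steps
    assert a == b                # identical result: the run is exactly repeatable
```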

When we couple the depth and accuracy of the emerging exascale and future augmented simulation environments with the visualization, control, and repeatability that simulations have always enjoyed, we reach the point at which the simulation no longer merely approximates what we already knew: it extends our knowledge and challenges us to question our understanding and our theories. As the author Antoine de Saint-Exupéry expressed it beautifully, “the machine does not isolate us from the great problems of nature but plunges us more deeply into them.” This is how simulations will change the nature of the scientific pursuit and the scientists themselves.

Evaluating the state function of the enterprise

It is not just the hard sciences or the operational engineering disciplines that will benefit from embracing simulation as a primary tool of discovery; simulation is poised to radically reshape the enterprise as well. Today even the most modern enterprise, public or private, is run by ritual and in arrears. As the enterprise IT backbone co-evolved with the relational database, what was demanded in terms of quality was known by the acronym ACID: atomicity, consistency, isolation, and durability, all properties which engendered the trust required to move business processes forward into the computer age. Transactions were faithfully recorded in perpetuity. Even in the face of equipment failure, the audit trail would be secure. But as department after department was computerized, we found a challenge in latency and scale that left one legacy intact: the ritual of reconciliation at the close of period. Even the largest systems available were unable to provide a single platform capable of running an enterprise. A Fortune 100 global enterprise IT portfolio might consist of thousands of applications and tens of thousands of relational databases, all operating asynchronously and independently, save for the hundreds of thousands of data-copy jobs executed every month between applications. So how does the leadership understand the state of the business? They have dashboards and projections, but really the only way to evaluate the state function of the business is via the close-of-period ritual, where the business and the underlying IT systems are rendered quiescent, a snapshot of the data at rest is taken, and that snapshot is processed for some number of days, or even weeks. “How did we do?” is operation via hindsight, and in short order the past tense will cease to be competitive.

The new species of enterprise, the enterprise of the fourth industrial revolution, is a real-time, data-driven, analytic enterprise, a hyper-competitive enterprise. While conventional IT affords a vision of what that enterprise will look like, it cannot afford us the ability to realize it, at least not sustainably and equitably. While the waning of Moore’s law is the greatest challenge to conventional IT improvement, it is the greatest opportunity for non-conventional approaches, which will re-ignite exponential growth in performance and satisfy our need for sustainability. In-memory, neuromorphic, analog, quantum, photonic, non-linear, and application-specific approaches are all emerging as the new ensemble. Nearest in, three-dimensional scaling of memory at every level, layers in a die, dies in a stack, stacks in a package, then coupled photonically at the enclosure, rack, row, and data-center scale, will finally address what John von Neumann, in the seminal 1946 paper in which he outlined all modern computer architecture, described as the number one limiter to computation: the scalability of memory. Abundant, persistent, sustainable memory is a double-edged sword cutting through the Gordian knot of conventional enterprise IT. First, it places all data in a unified memory, every byte as accessible as every other byte, naturally aligned in time as well as in address space. Now every business process, every manufacturing step, every customer engagement or employee interaction can be reflected immediately in memory. The memory write cycle becomes the scale at which we can evaluate the state function of the enterprise. We can turn the time vernier dial to quickly scan back and forth over decades or hone our focus to microseconds.
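A minimal sketch of the idea, assuming nothing more than an append-only, in-memory, time-ordered log (the class and method names here are illustrative, not a product interface): every write is a timestamped fact, and the state of the enterprise can be reconstructed as of any instant, whether the dial is set to decades or to microseconds.

```python
from time import time_ns

class TimeAlignedStore:
    """Illustrative in-memory store: every write is a timestamped fact, and the
    enterprise's state function can be evaluated 'as of' any instant."""

    def __init__(self):
        self._events = []            # append-only list of (timestamp_ns, key, value)

    def write(self, key, value, ts_ns=None):
        ts = ts_ns if ts_ns is not None else time_ns()
        self._events.append((ts, key, value))

    def as_of(self, ts_ns):
        """Reconstruct the latest value of every key at or before ts_ns."""
        state = {}
        for ts, key, value in self._events:
            if ts <= ts_ns:
                state[key] = value   # later writes overwrite earlier ones
        return state

if __name__ == "__main__":
    store = TimeAlignedStore()
    store.write("inventory.widgets", 120, ts_ns=1_000)
    store.write("order.1042.status", "placed", ts_ns=2_000)
    store.write("inventory.widgets", 119, ts_ns=3_000)
    print(store.as_of(2_500))   # widgets still 120, order already placed
    print(store.as_of(3_000))   # widgets now 119: the write cycle is the unit of time
```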

Second, the freedom to write directly to memory allows every type of business information to be recorded natively. Rather than being reduced to the language of punch cards, the fields and records of the relational database, in-memory systems augment transactional data with time-series data, event-driven records, and unstructured rich-media data.

But the benefits go beyond holding the entire range of modern enterprise data in memory, in native form, as the ideal ad hoc query engine for the human leadership team. This longitudinal data platform is the ideal environment for artificial intelligence algorithms: anomaly-detection and auto-correlation routines, as well as auto-encoders, can operate as unsupervised learning engines scanning the entire enterprise to seek out correlations that would never have occurred to even the most sophisticated business leaders, because those leaders were never afforded a time-aligned, in-memory view across every type of business information. But this is not only a platform for instantaneous AI insight. As time is added to the dimensions by which we observe the data, which is now in fact the enterprise, we gain the ability to differentiate with respect to time, to define the velocity, acceleration, gradient, and curl of the state function of the enterprise. Added to the predictive power of this calculus is the time-varying graph representation of the enterprise. So many economic, supply-chain, and ecological systems are most naturally represented by the vertices and edges of a network graph, and the mathematics of graph theory provides not only powerful tools of analysis but also of prediction. Their power derives in part from the ability to represent the stochastic nature of these ever-evolving systems, and harnessing it is fundamentally limited by the speed at which any and every byte of data encoded in the graph can be compared. That is where the scaling of massive in-memory systems yields the required breakthrough.
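Both ideas, differentiating enterprise data with respect to time and slicing a time-varying graph, can be sketched in a few lines. The functions and data below are hypothetical illustrations: a discrete first and second difference over a time-aligned series, and an edge list filtered to the graph as it stood at a given moment.

```python
def discrete_derivatives(series):
    """Given time-aligned samples [(t, value), ...], return per-interval
    velocity (first difference) and acceleration (second difference)."""
    velocity = [
        (t1, (v1 - v0) / (t1 - t0))
        for (t0, v0), (t1, v1) in zip(series, series[1:])
    ]
    acceleration = [
        (t1, (dv1 - dv0) / (t1 - t0))
        for (t0, dv0), (t1, dv1) in zip(velocity, velocity[1:])
    ]
    return velocity, acceleration


def graph_as_of(timed_edges, ts):
    """Slice a time-varying graph: keep each edge (u, v, start, end) active at ts."""
    return [(u, v) for u, v, start, end in timed_edges if start <= ts < end]


if __name__ == "__main__":
    revenue = [(0, 100.0), (1, 104.0), (2, 112.0), (3, 124.0)]   # illustrative series
    vel, acc = discrete_derivatives(revenue)
    print(vel)   # [(1, 4.0), (2, 8.0), (3, 12.0)]  -- growth is accelerating
    print(acc)   # [(2, 4.0), (3, 4.0)]

    supply_chain = [("plant_A", "dc_EU", 0, 10), ("dc_EU", "retail_7", 5, 10)]
    print(graph_as_of(supply_chain, 3))   # [('plant_A', 'dc_EU')]
```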

The race to more real than real

The race to exascale compute is on, and it is global: global not only in the regions pursuing this level of capability, but also in that there will shortly be no area of human creative or economic activity untouched by it. It represents a discontinuity in competitiveness. Enterprises public and private will either adopt these techniques or find themselves desperately trying to compete against them, for results and for talent. Yet exascale alone is still not sufficient; there is a dual discontinuity which must also be explored: the inexorable, disproportionate rise of data at the edge, data too massive to move yet too valuable not to analyze. The truly hyper-competitive, real-time analytic enterprise will harness real-time data analytics at the edge to fuel its simulation engines and gain the foresight necessary to win in any opportunity.

About the Author

Kirk Bresniker

Kirk Bresniker is Chief Architect of Hewlett Packard Labs and a Hewlett Packard Enterprise Fellow. Prior to joining Labs, Kirk was Vice President and Chief Technologist in the HP Servers Global Business Unit, representing 25 years of innovation leadership.