Behind the scenes @ Labs

The era of real-time everything


By Curt Hopkins, Staff Writer, Enterprise.nxt

In the early days of the internet, companies like AOL offered a walled garden of connectivity and content. Soon enough, the internet as a radically open structure began to knock holes in that wall, and the web razed it to the ground.

Somehow, computing companies have looked back to the era of siloed data and services as a golden age and have kept attempting to rebuild those ruins, with fluctuating levels of effectiveness. But once again, the tendency toward openness has reasserted itself on a large scale.

“We have a duty to do something radical,” said Hewlett Packard Enterprise CEO Antonio Neri. “Radical in terms of computer architecture, radical in how we build and distribute apps, radical in how we consume IT, and radical in how we protect ourselves.”

As Moore’s Law slows to a crawl, tech companies around the world are seeking solutions that will let us keep increasing the speed of our computing while minimizing the energy we spend on it. Possible solutions include quantum computing, chaos computing, non-volatile computing, and, in the case of HPE, Cloudless Computing, photonics, accelerators for focused tasks, and Memory-Driven Computing.

Kill sprawl

But what’s the real solution in terms of an overall architecture? Flattening the internal hierarchy of the computer and razing the walled garden to the ground once and for all, especially as regards data.

“I know for a fact that we spend more moving data around than we do actually using that data,” says Neri. What’s the solution? “Kill sprawl.”

According to Dave Carlisle, HPE Fellow and HPE’s Global IT CTO, a root problem is the disconnect between the sheer volume of data an enterprise user has to deal with and the foundational technology at enterprises’ disposal today. Perhaps counterintuitively, much of this begins with foundational hardware and infrastructure, not software.

“We have a fundamental paradigm in which a given core, a given unit of compute if you will, does not have guaranteed fast access to an entire enterprise’s data set,” says Carlisle. “That breeds a whole bunch of interesting architectural implications, not just in hardware but in software as well, and in how we manage our data.”

In essence, we have to cobble together the technology of the past to process the data of the future. Carlisle suggests a blueprint for moving beyond the limitations inherent in all this jury-rigging.

“Now we start to think about a world where open, universal interconnect standards like Gen-Z are enabled, where we can actually scale, and where fast-access persistent memory is at the center of the architecture, not the compute,” says Carlisle. “Most importantly, this is a world where we can scale the heterogeneous compute tier and attach it to the fabric, giving everyone access to the same fast-access persistent memory pool. Then we can start to rethink architectures.”
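
Carlisle’s shared persistent memory pool can be loosely sketched with ordinary operating-system shared memory: below, a hypothetical consumer (standing in for a compute element on the fabric) attaches to a single pool and reads bytes in place, with no copying or replication. The pool name and functions are purely illustrative, not an HPE or Gen-Z API.

```python
# Hypothetical sketch: independent consumers attach to one shared
# in-memory pool and read bytes in place -- no copies, no replication.
# Loosely analogous to compute elements on a shared memory fabric;
# the names here are illustrative, not an HPE or Gen-Z API.
from multiprocessing import shared_memory

POOL_NAME = "demo_data_pool"  # illustrative pool name

def read_byte(pool_name: str, offset: int) -> int:
    """Attach to an existing pool and read one byte in place."""
    shm = shared_memory.SharedMemory(name=pool_name)  # attach, don't copy
    value = shm.buf[offset]
    shm.close()  # detach this consumer's handle; the pool lives on
    return value

def demo() -> int:
    pool = shared_memory.SharedMemory(name=POOL_NAME, create=True, size=1024)
    try:
        pool.buf[0:4] = b"data"         # one writer populates the pool
        return read_byte(POOL_NAME, 0)  # any consumer reads it in place
    finally:
        pool.close()
        pool.unlink()                   # free the pool when done

if __name__ == "__main__":
    print(demo())  # 100, the byte value of "d"
```

The point of the sketch is the access pattern: the consumer never receives a copy of the data, only a handle to memory that already exists.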

A funny thing happens when you reach a certain scale. Today, HPE can provide a Superdome Flex with up to 48TB of memory. These systems have opened doors that were previously firmly locked, achieving important results in Alzheimer’s research, genomics, cybersecurity, and financial modeling. But that’s just a drop in the bucket.

If you consider problems in terms of petabytes – enough to hold the entire working state of an enterprise – paradoxically, everything gets simpler. You can take out complexity and kill sprawl. The world spends more moving data around than actually using that data. That cost has zero value.
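
The imbalance between moving data and using it shows up even at the smallest scale. This purely illustrative sketch contrasts copying a buffer, whose cost grows with its size, with taking a zero-copy view of the same bytes, which moves nothing at all:

```python
# Illustrative sketch of the data-movement cost described above:
# copying a buffer costs time and memory proportional to its size,
# while a zero-copy view of the same bytes is nearly free.
import time

def copy_buffer(buf: bytes) -> bytearray:
    return bytearray(buf)   # full copy: O(n) time and memory

def view_buffer(buf: bytes) -> memoryview:
    return memoryview(buf)  # zero-copy view: O(1), no data moved

if __name__ == "__main__":
    data = bytes(64 * 1024 * 1024)  # 64 MB buffer standing in for a data set

    t0 = time.perf_counter()
    copy_buffer(data)
    copy_secs = time.perf_counter() - t0

    t0 = time.perf_counter()
    view_buffer(data)
    view_secs = time.perf_counter() - t0

    print(f"copy: {copy_secs:.4f}s  view: {view_secs:.6f}s")
```

Scaled up to petabytes, the copy path is the sprawl Neri describes; a memory-centric design aims to make the view path the norm.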

Imagine a world where many thousands of diverse compute cores can access any byte across tens of petabytes of ultra-fast memory virtually instantly. Now enterprise-scale architectures and capabilities can look very different. With this level of modularity, instead of tens of thousands of databases and data islands, and instead of so much time and so many resources spent moving and replicating data, enterprises might have just a handful of enterprise data services and platforms, with lightning-fast serverless and cloud-native app components on top.

“I can make the argument that some of the most popular data processing frameworks today have a dead-man-walking element to them, because the very compute paradigms that they're architected around are being reversed,” says Carlisle. “It’s hardware that is about to drive disruption in software.”

The here and now

This set-up is not just a map for an eventual future, according to Carlisle. You can start changing the way you work today.

“You can start to shift how you build your apps in order to move faster,” he says. “We start doing far more containerized stuff on top of Memory-Driven Computing systems running a Memory-Driven Computing data store.”

This approach will allow users to engage in strategic analytics and reach a different level of understanding of their customers and markets, creating unexpected competitive differentiation.

Today we live in the past, working on data sometimes months old. With this approach, that changes, and we enter an era of real-time, big data analytics. Users can realize a real-time supply chain and real-time financials, closing the books every second, not every quarter.
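
As a toy illustration of what closing the books every second implies, the sketch below keeps running balances current on every posting, so a “close” is just a read of present state rather than a periodic batch job. The `RealTimeLedger` class is hypothetical, not any real product API.

```python
# Hypothetical sketch: real-time financials as incremental aggregation.
# Each posting updates running balances immediately, so closing the
# books is an instant snapshot, not a quarterly batch computation.
from collections import defaultdict

class RealTimeLedger:
    def __init__(self) -> None:
        self.balances: defaultdict[str, float] = defaultdict(float)

    def post(self, account: str, amount: float) -> float:
        """Apply a transaction and return the account's new balance."""
        self.balances[account] += amount
        return self.balances[account]

    def close(self) -> dict[str, float]:
        """A 'close' is a read of current state -- cheap at any frequency."""
        return dict(self.balances)

if __name__ == "__main__":
    ledger = RealTimeLedger()
    ledger.post("revenue", 1200.0)
    ledger.post("expenses", -300.0)
    print(ledger.close())  # {'revenue': 1200.0, 'expenses': -300.0}
```

The same shift, from batch recomputation to continuously maintained state, is what a shared fast-memory pool would make practical at enterprise scale.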

It’s an era of real-time everything.

“This will change the game from the legacy slow IT world and latent data to high-end, very fast user-centric analytics,” says Carlisle.



Photo by Tomasz Frankowski on Unsplash

About the Author

Curt_Hopkins

Managing Editor, Hewlett Packard Labs
