Around the Storage Block

Experience the data pipeline at HPE Discover



Many businesses today are experiencing an influx of data from the edge, generated by devices, sensors, and machines. Some of this data is expected and planned for; at other times, it is more than they can handle.

Dealing with data is a constant challenge, and in this new world customers face two primary challenges in rationalizing and optimizing their time to value:

  1. Hybrid IT – how core infrastructure and core business applications will function in a hybrid IT world, where they are driven by demands for faster deployment and greater agility in spinning up new applications to respond to the velocity and variety of data types emerging from the intelligent edge
  2. Intelligent Edge – how to deal with a wealth of devices and the data they generate and send (including the sheer volume of data and the variety of data types)

When, where, and how we analyze data is also changing. Analytics is a key area of focus for businesses: being able to analyze this new type of data, at the speed it is generated at the intelligent edge, and to do so in a hybrid world. It starts at the intelligent edge, with data generated by a multitude of sensors and devices. Sometimes that data is collected and analyzed at the device level itself; sometimes there is an aggregation point, which might be a car, a remote site (such as a hospital with patient medical sensors), or simply the PC of a data scientist.

Some form of that data is often sent back to the core. If it is coming from multiple streams and multiple devices, it needs to be analyzed in real time. This type of core analytics can also happen in the cloud, for customers looking for special-purpose functionality: spinning up GPU-focused test beds for short-term machine learning projects, for example, or adding longer-term storage to create a tiered storage environment.

All of this creates the data pipeline.

The data flow from edge to core to cloud needs routing – through a data pipeline – which provides an infrastructure that not only lets data flow bi-directionally but also allows for analytic processes in real time, near real time, and at rest, as well as AI modeling.
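To make the pipeline stages concrete, here is a minimal, illustrative Python sketch of the flow described above: edge devices feed an aggregation point, which forwards a compact summary to the core, where a real-time check runs on arrival and the data then lands at rest for later batch analytics or AI modeling. All class and field names here are hypothetical, not part of any HPE product API.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Reading:
    """A single data point from an edge device (sensor, machine, etc.)."""
    device_id: str
    value: float

@dataclass
class AggregationPoint:
    """Edge aggregation point (e.g. a car or a remote site) that buffers readings."""
    buffer: list = field(default_factory=list)

    def collect(self, reading: Reading) -> None:
        self.buffer.append(reading)

    def summarize(self) -> dict:
        """Send a compact summary toward the core instead of every raw reading."""
        summary = {"count": len(self.buffer),
                   "mean": mean(r.value for r in self.buffer)}
        self.buffer.clear()
        return summary

class Core:
    """Core tier: real-time check on arrival, then land data at rest."""
    def __init__(self, alert_threshold: float):
        self.alert_threshold = alert_threshold
        self.at_rest = []  # stand-in for longer-term, tiered storage

    def ingest(self, summary: dict) -> bool:
        alert = summary["mean"] > self.alert_threshold  # real-time analytics step
        self.at_rest.append(summary)  # at-rest tier for batch analytics / AI modeling
        return alert

# Edge devices -> aggregation point -> core
edge = AggregationPoint()
for i, v in enumerate([21.0, 22.5, 98.0]):
    edge.collect(Reading(device_id=f"sensor-{i}", value=v))

core = Core(alert_threshold=40.0)
alert = core.ingest(edge.summarize())
print(alert, len(core.at_rest))  # prints: True 1
```

In a real deployment each arrow in this sketch would be a network hop (device to gateway, gateway to data center or cloud), and the at-rest list would be a tiered storage system, but the shape of the pipeline is the same.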

HPE is uniquely positioned to help customers address their challenges of building a data pipeline from the edge to core to cloud.

Next week at HPE Discover, we’ll be showing how we provide a combination of products in tested solution architectures that are purpose-built or workload optimized, from the intelligent edge to the core, and a combination of services both from a support perspective and from a perspective of professional advisory services. It’s the products as well as the expertise where we can help customers build these richer data pipelines and accelerate their outcomes based on the next generation of analytics.

At the heart of the analytic infrastructure is the HPE Elastic Platform for Big Data Analytics (or EPA for short), built primarily on the Apollo platform family, which takes our innovation around workload-optimized nodes and disaggregates storage and compute within the cluster.

For more information, here's @patrick_osborne with an overview of the Apollo family and EPA architecture:

What to look for at HPE Discover Las Vegas (June 18-21)

On the show floor: Visit our demos of the HPE Elastic Platform for Analytics infrastructure and of the POC testing we're doing in the Connected Car space – Demos #715 and #716

Sessions to attend: Big Data and AI sessions not to miss

  • B5013, Tuesday 9:00am - Big Data from Edge to Core to Cloud with Patrick Osborne
  • DF5153, Tuesday 12:00pm - Edge to Core Infrastructure for the Big Data Pipeline
  • B5043, Tuesday 4:30pm - Big Data in the Enterprise - HPE IT transformation
  • SL4923, Wednesday 10:30am - HPE Storage Spotlight with Milan Shetti
  • B5085, Wednesday 10:30am - Harness Big Data at scale with HPE Pointnext
  • DF5205, Wednesday 11:00am - Deploying Fast Data Analytics
  • Meet The Experts theater, Wednesday 11:00am - HPE Elastic Platform building blocks
  • DF5220, Wednesday, 3:00pm - Edge to Core Infrastructure for the Big Data Pipeline
  • SL5158, Thursday 9:00am - 50 things you can do with AI - Spotlight

Get the latest HPE Discover info by downloading the HPE Discover app

For more information on the HPE Elastic Platform for Big Data Analytics, visit

About the Author


I have been working across the HPE Storage portfolio for over 10 years and am based in Colorado...Go Broncos!!
