TechExperts

Enable rapid insights and real-time data analytics pipelines from edge to hybrid cloud

Learn how an enterprise-grade, optimized end-to-end data analytics pipeline solution can help you derive value from your data from edge to hybrid cloud.

IoT. Big Data. Analytics. AI. These are loaded terms that mean different things to different IT departments. As a solutions engineering organization here at HPE, our team regularly meets with customers on these topics. It's become evident that these terms are underpinned by a single concept: the data pipeline. Data is growing in all dimensions, including its importance to the business, and building an end-to-end data analytics pipeline is the only way to connect the dots and derive value from all that data.

The figure below highlights the key elements of a bi-directional data pipeline spanning from the edge to the cloud.

[Figure 1: From Edge to Hybrid Cloud]

In simple terms, data is generated by devices and sensors that sit at the edge. Depending on the use case, inference is either run on that data at the edge, or the data is streamed into your "cloud," where real-time analysis can be performed (the term cloud is used here to describe an operating model rather than a location, encompassing public, private, and increasingly hybrid cloud deployments). After the data is analyzed, it is routed into different data repositories for additional batch or interactive analytics, transformed into data sets used to build models for AI training, or moved into longer-term archival storage. As that data is used to build newer models, the models are then pushed back out toward the edge, where the analytics on the data is occurring.
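To make the edge-side branch of that flow concrete, here is a minimal, self-contained Python sketch of the decision just described: score a reading locally when a model is available on the device, otherwise forward it to the cloud stream for real-time analysis. The threshold "model," the simulated sensor, and the forwarding stub are simplified stand-ins for illustration only, not part of any HPE or Red Hat product.

    import json
    import random
    import time

    # Stand-in for a model deployed to the edge device; a real gateway would load
    # a trained model artifact pushed down from the training environment.
    def local_inference(reading):
        return "anomaly" if reading["temperature_c"] > 80.0 else "normal"

    # Stand-in for streaming into the cloud ingestion tier (e.g., a message broker);
    # here we simply print what would be published.
    def stream_to_cloud(reading):
        print("forwarding to cloud stream:", json.dumps(reading))

    def read_sensor():
        # Simulated device/sensor data generated at the edge.
        return {"device_id": "edge-001",
                "temperature_c": random.uniform(60.0, 95.0),
                "ts": time.time()}

    MODEL_DEPLOYED_AT_EDGE = True  # toggled as newer models are pushed back out

    for _ in range(5):
        reading = read_sensor()
        if MODEL_DEPLOYED_AT_EDGE:
            # Inference runs at the edge; only results (or flagged data) move upstream.
            print(reading["device_id"], "->", local_inference(reading))
        else:
            # No local model: raw data is streamed to the cloud for real-time analysis.
            stream_to_cloud(reading)
        time.sleep(0.1)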

Unfortunately, as most of you are aware, this is not easy. Speed, scale, data sprawl, security, and framework complexity are just a few of the challenges. Many tools and open-source projects make up each stage of this data pipeline, and your data scientists, data engineers, and analysts will have strong opinions on the right tool(s) to execute a particular job. These teams need to be able to deploy the tools they need, when they need them, where they need them to run, at production scale, and with high performance. This is the job IT organizations face today, and this is precisely where HPE can help.

HPE has workload-optimized platforms to serve diverse needs across the data pipeline, from inference on the edge to deep learning training in the data center.

Purpose-built server, storage, and networking hardware is the foundation for building an infrastructure that provides both rapid deployment and scale, and that delivers the highest levels of performance, quality, and availability.

Building on these workload-optimized platforms, HPE has partnered with Red Hat to provide an enterprise-class solution for both platform-as-a-service and containers-as-a-service (PaaS and CaaS), using an optimized end-to-end data analytics pipeline with Red Hat OpenShift Container Platform (OCP). Container technology, and Kubernetes specifically, is quickly becoming the enabling abstraction architecture across the data pipeline in hybrid cloud deployments. Containers allow workloads to be rapidly deployed and scaled up or down, and they provide the flexibility to move between on-premises, public cloud, and edge environments. Combining Red Hat's OCP distribution with platforms like HPE Synergy, HPE 3PAR, and HPE Nimble Storage, HPE delivers a turnkey, composable architecture to rapidly deploy containers supporting new application frameworks, resulting in faster insights for the business. A range of service offerings from HPE Pointnext and HPE GreenLake lets you decide whether to purchase upfront (CapEx) or move to a pay-as-you-go consumption model (OpEx).
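As one illustration of that scale-up/scale-down flexibility, the short Python sketch below uses the official Kubernetes Python client (an API that OCP also exposes, since OpenShift is Kubernetes-based) to resize a containerized analytics workload on demand. The deployment name, namespace, and replica count are hypothetical placeholders, not part of the HPE/Red Hat reference configuration.

    # Minimal sketch: scale a containerized analytics workload on a Kubernetes/OpenShift
    # cluster using the official Kubernetes Python client. Assumes a valid kubeconfig.
    from kubernetes import client, config

    config.load_kube_config()          # or config.load_incluster_config() inside a pod
    apps = client.AppsV1Api()

    # Hypothetical workload: a Spark worker deployment in an "analytics" namespace.
    apps.patch_namespaced_deployment_scale(
        name="spark-worker",
        namespace="analytics",
        body={"spec": {"replicas": 5}},  # scale up for a burst of batch/interactive jobs
    )

    scale = apps.read_namespaced_deployment_scale(name="spark-worker", namespace="analytics")
    print("desired replicas:", scale.spec.replicas)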

HPEโ€™s CaaS platform provides the foundation to securely build a data analytics pipeline using the tools you need, resulting in an environment where you can spool up new container resources to provide frameworks for model training, data ingestion and data flow in a secure manner. For example, SAP Data Hub can be provisioned using Red Hat OCP and help connect the disparate data sources in your environment while maintaining data lineage and data governance. 

SAP Data Hub is a data orchestration solution that helps you discover, transform, enrich, and analyze different types of data across your environment. It lets you create visual data flow pipelines that connect streams of data, from data flowing in from your edge devices to data that resides in HANA or sits in your Hadoop or S3 object data stores. With a complete picture of your organization, your data scientists and business analysts can securely build data science models and analytics queries. These are the tools your business needs to move fast.
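SAP Data Hub pipelines are assembled visually in its pipeline modeler rather than hand-coded, so the sketch below is not Data Hub itself; it is a plain-Python illustration of the kind of ingest pattern such a pipeline orchestrates: consume a stream (Kafka in this example), batch it, and land it in an S3 object store for later batch analytics or model building. The topic, broker address, and bucket name are assumptions made for illustration, and the example presumes the kafka-python and boto3 libraries are installed.

    # Conceptual stream-to-object-store flow with placeholder broker/bucket names.
    import json
    import time

    import boto3
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "edge-telemetry",                                  # hypothetical topic
        bootstrap_servers=["kafka.example.com:9092"],      # hypothetical broker
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    s3 = boto3.client("s3")

    batch = []
    for message in consumer:
        batch.append(message.value)
        if len(batch) >= 500:                              # land data in small batches
            key = "telemetry/%d.json" % int(time.time())
            s3.put_object(Bucket="analytics-landing",      # hypothetical bucket
                          Key=key,
                          Body=json.dumps(batch).encode("utf-8"))
            batch = []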

SAP Data Hub is just one example of how HPE and Red Hat OCP can provide a platform for you to continue to innovate and build your next-generation data pipelines. As more and more frameworks and software stacks add Kubernetes support, having an enterprise-ready CaaS architecture will allow you to build your edge-to-cloud data pipelines and connect your data sources to derive the most value from your data.

Learn more from our Reference Architectures. And please reach out to your HPE representative if you would like to discuss this further.


Meet Infrastructure Insights blogger Joe Sullivan, Data & Analytics Solutions Engineering Manager, HPE. Joe is a senior technical manager leading a global team of engineers developing complex integrated solutions in the database and analytics space. He is a frequent speaker at industry events and previously worked as a Master Technologist in the solutions organization, helping define and drive strategic solution opportunities in the Microsoft technology space.


Insights Experts
Hewlett Packard Enterprise

twitter.com/HPE_Servers
linkedin.com/company/hewlett-packard-enterprise
hpe.com/servers

About the Author

TechExperts

Our team of HPE and other technology experts shares insights about relevant topics related to artificial intelligence, data analytics, IoT, and telco.