Move from data deluge to business insights with the right data fabric
During a podcast with industry analyst Dana Gardner, I discuss how an edge-to-cloud data fabric helps enterprises store, manage, and access vast amounts of data for analytics.
According to a 2020 article in TechTarget, a recent IDC forecast estimated that enterprises would create and capture 6.4 zettabytes of new data in 2020. This firehose of data is flooding the enterprise from a variety of sources: servers, smartphones, websites, social media networks, e-commerce platforms, and Internet of Things (IoT) devices. And the spigot is growing larger by the day.
Given those numbers, you may wonder whether the typical enterprise can handle this deluge of data—and if so, whether it can analyze the data well enough to reap its benefits. Dana Gardner, Principal Analyst at Interarbor Solutions, discussed this issue with me in a BriefingsDirect podcast, How the Journey to Modern Data Management is Paved with an Inclusive Edge-to-Cloud Data Fabric. I encourage you to check out the podcast, but for those of you who just want the highlights, keep reading. I’ve summarized a few interesting points from our discussion below.
Flooded with a never-ending flow of data
Twenty years ago, businesses were concerned about storing petabytes of data associated with their applications. Now with IoT, devices are generating zettabytes of data and sending it back to the company. And all this data needs to be stored, managed, and analyzed.
Businesses need to get a handle on where data is generated and how it will be stored and accessed—a challenge that gets more urgent by the day. Complicating this process is the proliferation of data silos. Data is all over the place—spread across multiple cloud providers as well as on-premises locations, creating data silo sprawl.
Numerous logjams are blocking data access and management
Enterprises face a huge challenge when it comes to data access—in particular, ensuring secure data access at the edge, across a multitude of clouds, and at the core. And each cloud provider typically has its own access methodologies and software development kits (SDKs), contributing to the complexity of accessing the data.
The enterprise also needs to consider how a variety of different data types will be authorized and accessed. For example, an enterprise may have an object-based system with its own authorization and authentication techniques. Add to that an SQL database, a file-based workload, or a block-based workload—each with different access requirements. Gaining a common, secure access layer that can access different types of data is essential to eliminating data sprawl.
Part of solving the security access problem involves having a common application programming interface (API) across all data types. Standardized APIs let a variety of applications with multiple data types securely talk to each other. As businesses attempt to access and manage the tsunami of unstructured data from all corners of the enterprise, APIs help: a variety of business and development tools come into the enterprise through an API, cutting down on access methodologies, security domains, and data management overhead.
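To make the idea of a common access layer concrete, here is a minimal Python sketch of two paths to the same data: one application writes through an S3-compatible object API while another reads the result through an ordinary file path. The endpoint URL, credentials, bucket name, and mount point are hypothetical placeholders for illustration, not documented HPE Ezmeral interfaces.

```python
# Minimal sketch of a common access layer. The endpoint, credentials,
# bucket name, and mount path below are placeholder assumptions, not
# documented HPE Ezmeral values.
import boto3

# Object-style access through an S3-compatible API.
s3 = boto3.client(
    "s3",
    endpoint_url="https://datafabric.example.com:9000",  # hypothetical gateway
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)
s3.put_object(
    Bucket="sensor-data",
    Key="plant-7/2020-11-01.json",
    Body=b'{"temperature_c": 21.4}',
)

# File-style access to the same data through a POSIX-style mount point,
# so a file-based workload needs no object client at all.
with open("/mnt/datafabric/sensor-data/plant-7/2020-11-01.json", "rb") as f:
    payload = f.read()
```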
Having commonality in an API layer lets the enterprise deploy anywhere—providing the capability to go from the edge to dispersed data centers or the cloud. But it can also create challenges in terms of where the data lives, due to data gravity issues. And without portability of the APIs and data, enterprises will always see some lock-in.
The bad news is that APIs only solve part of the problem. The enterprise has many challenges to consider when attempting end-to-end data management. The good news is that a platform and standards approach with a data fabric is the single best way to satisfy all the requirements an enterprise needs to store, manage, and analyze all data types from any source.
A bridge across troubled waters: HPE Ezmeral Data Fabric
HPE Ezmeral Data Fabric is such a solution, providing enterprise-wide, global access to data, bridging seamlessly from on-premises to the edge, or to one or more clouds. This unified platform supports a variety of data from large to small, structured and unstructured, as well as time series and sensor data—essentially every data type from any data source. This capability is a big driver for businesses. They want a common, secure access layer that can access different types of data.
As covered previously, another factor is having a common API access layer, which helps reduce management and security costs across distributed sites for your application data needs. The HPE Ezmeral Data Fabric provides the same security domain across all deployments. That means enterprises can have one web-based UI (or one REST API call) to manage different data types and their associated security controls.
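As a rough illustration of what a single management call could look like, the sketch below posts one access policy that covers file, object, and table data. The endpoint path, payload fields, and token handling are assumptions made for this example, not the actual Ezmeral management API.

```python
# Hypothetical sketch: one REST call applying a single access policy across
# several data types. The endpoint and payload shape are illustrative only.
import requests

policy = {
    "name": "analytics-readers",
    "principals": ["group:data-science"],
    "permissions": ["read"],
    "applies_to": ["files", "objects", "tables"],  # one policy, many data types
}

resp = requests.post(
    "https://fabric-admin.example.com/api/v1/security/policies",  # placeholder
    json=policy,
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=30,
)
resp.raise_for_status()
print("Policy created:", policy["name"])
```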
Another key benefit is that the HPE Ezmeral Data Fabric can be deployed on any x86 system, which protects you from data lock-in with a particular cloud vendor. Also, with more than 10 different API access points, Ezmeral allows for multi-data access based on the application's needs. The platform covers everything from storing data in files to storing data in blocks. It also helps run diverse computational tools and open source frameworks without requiring multiple clusters or silos.
One of the platform's greatest features, Global Namespace, reduces the time it takes someone to find the data they need for the project they are working on. For example, a lawyer preparing a case for discovery can simply double-click a data fabric drive and see all the data globally under the same security model. Another feature is multi-temperature storage, which decreases deployment costs by letting you tier data off to a cheaper, deeper storage solution while still managing the data in one location.
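The sketch below shows, under assumed paths and tier names, why a global namespace simplifies discovery: every site appears under one mount, so finding data is an ordinary directory walk, and a multi-temperature rule can be as simple as an age check. The mount point, directory layout, and tier names are made up for illustration and are not HPE Ezmeral defaults.

```python
# Illustration only: the global-namespace mount, directory layout, and tier
# names below are assumptions, not HPE Ezmeral defaults.
import time
from pathlib import Path

FABRIC_ROOT = Path("/mnt/datafabric")  # one logical tree spanning edge, data center, cloud
HOT_DAYS = 30                          # assumed threshold for the fast tier

def classify_tier(path: Path) -> str:
    """Toy multi-temperature rule: recent files stay hot, older ones are tiered."""
    age_days = (time.time() - path.stat().st_mtime) / 86400
    return "performance-tier" if age_days < HOT_DAYS else "capacity-tier"

# Discovery is a plain directory walk because all sites share one namespace.
for f in sorted(FABRIC_ROOT.glob("legal/case-1234/**/*")):
    if f.is_file():
        print(f, "->", classify_tier(f))
```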
These features make the HPE Ezmeral Data Fabric simple for everyone in the enterprise to use: teams gain a common data fabric, a common security layer, and a common API layer.
The unique capabilities in the HPE Ezmeral Data Fabric also make it ideal as the persistent storage layer for the recently announced HPE Container Platform. Enterprises gain full end-to-end management of their containers and built-in, enterprise-grade persistent storage, resulting in management and data portability for containerized workloads.
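For a containerized workload, requesting fabric-backed persistent storage can be as simple as creating a persistent volume claim. The sketch below uses the Kubernetes Python client; the storage class name and sizing are assumptions for the example, not official HPE Container Platform settings.

```python
# Sketch of claiming data-fabric-backed persistent storage for a container
# workload. The storage class name and size are placeholder assumptions.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running in a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="analytics-scratch"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],   # shared by multiple pods
        storage_class_name="datafabric",  # hypothetical data-fabric class
        resources=client.V1ResourceRequirements(requests={"storage": "100Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```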
Opening the floodgates: automating data management
The first step in discovering insights from data is being able to access and use the data. When an enterprise can automate data management across multiple deployments—managing it, monitoring it, keeping it secure—it has opened the floodgates. Software developers and data scientists can then focus on actual use cases.
A detailed report published by IDC demonstrates how impactful this process can be. Analysts interviewed long-time users of HPE Ezmeral Data Fabric (formerly called MapR) to discover the impact of this technology on business outcomes. The organizations interviewed are substantial businesses with multi-billion-dollar revenues across a range of sectors. The HPE Ezmeral Data Fabric had a substantial positive impact on several aspects of large-scale data usage and related business processes. The report says users achieved an estimated 567% five-year ROI with an eleven-month payback period.
To listen to the complete podcast, follow this link: How the Journey to Modern Data Management is Paved with an Inclusive Edge-to-Cloud Data Fabric.
About the author:
Chad Smykay, Field CTO, HPE Ezmeral Data Fabric
Chad Smykay has an extensive background in operations from his time at USAA as well as from helping to build many shared-services solutions at Rackspace, a world-class support organization. He has helped implement many production big data/data lake solutions. As an early adopter of Kubernetes for applications coupled with data analytics use cases, he brings a breadth of experience in application modernization for business use cases.
Hewlett Packard Enterprise
twitter.com/HPE_Ezmeral
linkedin.com/showcase/hpe-ezmeral
hpe.com/HPE_Ezmeral