AI + Data + Speed: Crucial for high-performance racing, manufacturing, and future retail
What do Formula 1 racing and a manufacturing floor have in common? While not much may come to mind immediately, two things are certain: in both, speed and data enable success.
Multiple discussions started after the 2023 Austrian Grand Prix, when drivers found that their lap times were being deleted for exceeding track limits.[i] You may be thinking, “Isn’t that just the norm? What’s the big deal?” But because race stewards had to review over 1,200 infringements by watching videotaped replays, drivers did not know their final positions until five hours after crossing the finish line! In the competitive world of high-performance racing, that was surely a BIG deal.
Now, think about how the same situation could occur in other use cases, such as manufacturing. Imagine an automated assembly line at a beer-bottling factory where a calibration issue causes labels to align incorrectly or otherwise damages the bottles. Wouldn’t it be helpful to be alerted the moment such issues first occur? Most definitely. An immediate alarm could help reduce losses from wastage, or the high cost of re-manufacturing.
While humans could do this task – similar to the Formula 1 stewards who were charged with reviewing replays – modern methods push the boundaries in the quest for higher productivity. That then creates the question: how can this happen at speed, through automation, and with quality control?
Or consider the manager of loss prevention[ii] for a national chain store, who notices increased losses over the past quarter. While staff are ever-vigilant, their time is best spent helping customers spend money on goods – not watching for sneaky shoplifters. So how can the loss prevention manager automate security to minimize the impact on customer service staff? What if potential shoplifting activity could be identified automatically, so that store managers received alerts in real time? You can see how this would help decrease shrinkage in the future. And with advancements in technology, there are now cost-effective solutions for organizations in any industry to help solve the challenges highlighted across our three examples.
What do all three of these use cases have in common?
The scenarios I have presented all need an automated solution that facilitates the collection of raw data and then transforms that data so it can be analyzed in real time. This access to relevant data helps managers make impactful business decisions, as noted in Computer Vision AI (artificial intelligence) at the Edge. By integrating intelligent video analytics with decision-making, you can accomplish real-time inferencing at the edge and immediately identify where process breakdowns are happening – and what action(s) should be taken, such as:
- Real-time monitoring and reporting of race cars exceeding track limits
- Supply chains that can identify and address faulty materials without having to stop production, thereby reducing returns through high quality output
- Theft detection, both internal and external, with automated evidence capture for immediate intervention and legal governance.
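The alert loop these three use cases share can be sketched in a few lines. Here is a minimal Python illustration: run inference on each incoming frame and raise an alert whenever a score crosses a threshold. The `Frame` type, the stubbed `run_inference` function, and the threshold value are all illustrative assumptions, not part of any HPE product.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """A single video frame plus capture metadata (hypothetical)."""
    camera_id: str
    timestamp: float
    defect_score: float  # stand-in for real model output


def run_inference(frame: Frame) -> float:
    """Stub for a real computer-vision model; in practice this would
    run an inferencing runtime on the frame's pixels at the edge."""
    return frame.defect_score


def monitor(frames, threshold=0.8):
    """Flag frames whose inference score exceeds the alert threshold,
    so an operator can intervene before losses accumulate."""
    alerts = []
    for frame in frames:
        score = run_inference(frame)
        if score >= threshold:
            alerts.append((frame.camera_id, frame.timestamp, score))
    return alerts


# Example: three frames from the bottling line, one misaligned label
frames = [
    Frame("line-1", 0.0, 0.10),
    Frame("line-1", 0.5, 0.92),  # calibration drift -> defect
    Frame("line-1", 1.0, 0.15),
]
print(monitor(frames))  # -> [('line-1', 0.5, 0.92)]
```

In a real deployment, the frames would stream continuously from cameras and the alert would trigger a notification or an automated intervention rather than a print.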
For computer vision AI – or any AI initiative – to be successful, it all starts with good data. Excellent decision-making requires both the data needed to create training models and the data needed for fine-tuning or re-training models for accuracy.
What is the best way to build a seamless, end-to-end data pipeline?
Data collection and transformation are core operations at the heart of any organization. Let’s dig a bit deeper to find out why an end-to-end data pipeline can not only streamline IT but also accelerate your AI outcomes.
In the diagram below, you will see a simple demonstration of a cloud-native, end-to-end data pipeline framework that you can leverage to deploy computer vision AI.
This framework helps ensure data security, offers the flexibility to scale up and down, enables seamless data access, and is ideal for hybrid deployments. It is a modular, scalable, and automated data pipeline that lets you use your choice of technology stacks to manage data ingestion, processing, analytics, and visualization. The framework is backed by a validated reference architecture with container platforms and HPE Ezmeral Data Fabric – giving you the confidence to speed deployment in your own environment.
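As a rough illustration of that modularity, here is a small Python sketch in which each pipeline stage (ingestion, processing, analytics) is an independent function that can be swapped out without touching the others. All function names and data formats here are hypothetical, not part of HPE Ezmeral Data Fabric's API.

```python
from typing import Callable, Iterable, List


def ingest(raw_events: Iterable[str]) -> List[dict]:
    """Ingestion: parse raw sensor/camera events into records."""
    records = []
    for line in raw_events:
        sensor, value = line.split(",")
        records.append({"sensor": sensor, "value": float(value)})
    return records


def process(records: List[dict]) -> List[dict]:
    """Processing: drop malformed or out-of-range readings."""
    return [r for r in records if 0.0 <= r["value"] <= 100.0]


def analyze(records: List[dict]) -> dict:
    """Analytics: aggregate per-sensor averages for a dashboard."""
    stats: dict = {}
    for r in records:
        stats.setdefault(r["sensor"], []).append(r["value"])
    return {s: sum(v) / len(v) for s, v in stats.items()}


def pipeline(raw, stages: List[Callable]):
    """Chain the stages; any stage can be replaced independently."""
    data = raw
    for stage in stages:
        data = stage(data)
    return data


raw = ["cam1,42.0", "cam1,250.0", "cam2,10.0"]  # 250.0 is out of range
print(pipeline(raw, [ingest, process, analyze]))
# -> {'cam1': 42.0, 'cam2': 10.0}
```

Because each stage only depends on its input and output shapes, you could replace `ingest` with a streaming consumer or `analyze` with a model-scoring step without rewriting the rest of the chain – which is the point of a modular pipeline.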
We are living in a world where a massive amount of data is being generated every minute. Time-to-decision is more critical than ever. By combining computer vision AI with the HPE Intelligent Data Pipeline, organizations will be able to address many challenges they face today – and into the future.
With real-time analytics, data persistence for future model-training, edge-to-cloud deployment, and scalability – it is a future-proof solution that will grow with the business.
Next generation compute to power your artificial intelligence initiatives
To learn more about how to accelerate your AI outcomes with HPE ProLiant Gen11 AI inferencing solutions, please visit hpe.com/proliant/AI.
Meet HPE Blogger Gary Wang!
Gary is a Product Manager on the Computer Workload Solutions team, responsible for cloud-native solutions including containerized data pipelines and container platforms with various software stacks. He has been with HPE for 9 years and previously managed products on the Server Options team.
Connect with Gary on LinkedIn!
Compute Experts
Hewlett Packard Enterprise
twitter.com/hpe_compute
linkedin.com/showcase/hpe-servers-and-systems/
hpe.com/servers
[i] Track limits mark the bounds of the racing surface.
[ii] The retail industry often refers to losses from shoplifting or damaged merchandise as "shrinkage".