HPE Blog, Austria, Germany & Switzerland
Air Travel to Enterprise-wide AI Adoption - HPE AI Reference Architecture Program

[Image: Kuala Lumpur airport]

Business travellers who urgently need to board their flight hate long queues at security. A fast-track ticket lets them bypass those queues through a separate lane. Businesses would like a similar fast lane when it comes to developing tailor-made AI solutions for their business processes as quickly as possible. HPE's AI reference architecture considerably accelerates the journey to AI success.


According to an Accenture survey of 1,200 companies, almost two-thirds (63%) are still at the very early stage of their AI journey: they are still exploring their destination, i.e. potential uses for artificial intelligence. 13% are in the experimental phase, 12% are deploying AI across all business units, and a further 12% have already reached their destination and operate with full AI expertise.

The experimentation phase is a particularly critical stage of the journey: the company has already recognized the potential of AI for its day-to-day operations, has prepared its data accordingly, has procured some tools and has already implemented one or two use cases. Now it's time to introduce additional AI solutions wherever they can boost business success. And experience shows that this applies to practically all business divisions.

A financial institution, for example, could initially use AI algorithms for fraud detection, the standard use case in the financial industry. In its next project, it could train an AI to analyze property values for mortgage approval. Building on this, numerous other projects can be implemented, from assistance with risk management to an AI that detects potentially disgruntled customers about to cancel their contracts, identifying them early on based on their user behaviour.
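To make the churn use case a little more concrete, here is a minimal sketch of how such a model might be prototyped on tabular usage data. The CSV file, the column names and the scikit-learn classifier are purely illustrative assumptions for this sketch; the article does not prescribe any particular tooling.

```python
# Minimal churn-risk prototype on tabular customer data (illustrative only).
# Assumes a hypothetical CSV export with usage features and a binary "churned" label.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("customer_usage.csv")           # hypothetical export from the CRM
X = df.drop(columns=["customer_id", "churned"])  # behavioural features
y = df["churned"]                                # 1 = contract cancelled

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("ROC AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

A prototype like this is exactly the kind of single-team experiment that later has to be scaled, retrained and operated across business units, which is where the infrastructure question becomes decisive.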


In order to progress as quickly as possible on this path to multifaceted AI usage, a company requires two things: first, it has to expand its in-house AI expertise, either by upskilling employees or by hiring AI specialists. Second, it needs an IT infrastructure that is designed from the ground up for AI training and deployment and can grow with business needs without busting the budget.

HPE understands the needs of AI voyagers and has developed a reference architecture that covers every stage of the AI journey. The solutions within this reference architecture remove all obstacles for enterprise customers so they can reap the business benefits of AI-powered analytics and interaction at a much faster pace.

 

Offerings for All Stages of the AI Journey


The "AI Starter" package provides a project team with all the hardware, software and service components it needs to navigate the unfamiliar realm of AI. In the subsequent experimentation phase, the follow-up "AI Inference" offering allows these projects to be flexibly supplemented with additional use cases. Both offerings are based on the proven HPE ProLiant servers with Nvidia graphics processors (GPUs). These GPUs are the benchmark for maximum processing power in the AI world.

The following steps of the AI journey are designed to scale AI projects across the enterprise, starting with training AI models for a wide range of use cases from all business units. HPE's reference architecture uses larger, more powerful systems for this purpose: Cray machines with AMD processors and Nvidia GPUs. In addition, the HPE Machine Learning Development Environment (MLDE) provides all the software required for efficient AI model training and usage.
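To give a sense of the work such a training environment takes off a team's hands, the sketch below shows a bare-bones distributed data-parallel training loop in plain PyTorch. This is not MLDE code; the model, data and launch command are placeholder assumptions, and a platform like MLDE is meant to handle much of this orchestration, scheduling and experiment tracking so that teams do not have to hand-roll it.

```python
# Illustrative data-parallel training loop in plain PyTorch (not MLDE-specific).
# Hypothetical launch: torchrun --nproc_per_node=<num_gpus> train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="nccl")            # one process per GPU
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(128, 2).cuda(local_rank)   # placeholder model
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.CrossEntropyLoss()

    for step in range(100):                            # placeholder training loop
        x = torch.randn(64, 128, device=local_rank)    # stand-in for real batches
        y = torch.randint(0, 2, (64,), device=local_rank)
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()                # gradients sync across GPUs
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```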

Importantly, the investments from the early stages of the AI journey are not lost. The servers from the start-up and experimentation phase can later be used in conjunction with the Cray systems. This means that a company does not have to change its aircraft mid-flight. Nor does it have to travel in a full-size airliner right from the start, as long as a small, agile sports plane is the more practical means of transportation.

 

Tested for Smooth Operations

The building blocks of HPE's AI reference architecture are designed and tested to work together seamlessly. The same applies to scaling the infrastructure from one level to the next. This minimizes the risk for the company at every stage of its flight.

One company that has already reached its AI business destination is Aleph Alpha. The start-up from Heidelberg, Germany, offers generative AI that not only competes with the well-known AI giants from the US, but does so under European privacy and data sovereignty standards. Aleph Alpha relies on HPE technology for its AI portfolio. HPE has since launched a new generation of AI offerings that delivers even higher performance than the generation used by Aleph Alpha. The new offerings are available at an introductory price until October.


Best of all, a company planning to expand its AI activities does not necessarily have to buy its own AI aircraft. HPE's solutions are available for on-prem deployment as a service via HPE GreenLake. This means that a customer always has the right means of transportation to reach their AI destination as quickly as possible – without upfront investment and without overprovisioning. In this way, HPE's reference architecture provides fast-track access to a fast-growing set of AI wins. Without time-consuming stopovers, it offers a direct flight to AI-powered business success.

For more information, simply contact me by email at dima.tsaripa@hpe.com.

About the author


Dima Tsaripa is Category Manager HPC, Big Data & Artificial Intelligence at Hewlett Packard Enterprise.