Let’s welcome GenAI’s arrival in HPE Aruba Networking Central
This post was coauthored with Alan Ni.
Organizations around the world are embracing AI faster than expected as a force multiplier for ITOps productivity. To support this shift in networking, GenAI LLM technology will soon appear in HPE Aruba Networking Central's AI Search feature as part of our existing AIOps suite of capabilities. This post highlights how cutting-edge GenAI techniques will improve the accuracy and responsiveness of search and navigation, and details how our LLMs are responsibly implemented and how they differ from earlier GenAI implementations in the networking space.
Using multiple LLMs within Central allows us to advance its conversational and summarization capabilities faster, more accurately, and more securely than ever before, resulting in an even stronger search experience. Best of all, the rollout of these new production-grade capabilities began earlier this month, with completion anticipated across our global footprint by April.
See GenAI in action
AI Search LLM enhancements
Over the past two years, our AI Search tool has been a fixture at the top of the Central GUI, designed so users can easily find answers to questions about their environments using advanced natural language processing.
HPE Aruba Networking Central AI Search
With the incorporation of multiple HPE-trained and -tuned LLMs, we are performing an "engine swap" for AI Search. You'll get the latest and greatest in search accuracy, response times, and data privacy, with no change to the look and feel of interacting with AI Search.
Improving search accuracy with user intent: We're using proprietary, trained-and-tuned LLM transformers to better understand the intent behind questions entered into AI Search. Accurately understanding a user's intent is paramount for better responses and improved user satisfaction. Since its introduction, AI Search has been asked over 3 million questions, and we have trained our LLMs on this extensive dataset. (Read more about the importance of AI training and data lakes from our colleague and AIOps lead Jose Tellado.) As a result, AI Search understands and answers networking-jargon questions better, provides type-ahead autocomplete, and introduces search-driven navigation to other parts of the GUI directly from the AI Search interface.
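To make the idea of intent routing concrete, here is a minimal, illustrative sketch. It stands in for the trained transformers described above with a simple bag-of-words similarity; the intent names and example queries are hypothetical, not Central's actual taxonomy.

```python
# Illustrative sketch only: route a search query to an intent class.
# A bag-of-words cosine similarity stands in for a trained LLM transformer;
# intents and example queries are invented for this example.
from collections import Counter
import math

INTENT_EXAMPLES = {
    "troubleshoot": ["why is my ap down", "client cannot connect to ssid"],
    "how_to": ["how to configure vlan on switch", "how to enable wpa3"],
    "navigate": ["show me the firmware page", "open device inventory"],
}

def _vec(text):
    """Tokenize into a term-frequency vector."""
    return Counter(text.lower().split())

def _cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def classify_intent(query):
    """Return the intent whose example queries best match the input."""
    scores = {
        intent: max(_cosine(_vec(query), _vec(ex)) for ex in examples)
        for intent, examples in INTENT_EXAMPLES.items()
    }
    return max(scores, key=scores.get)
```

Once a query is classified (for example, as navigation rather than troubleshooting), the search layer can answer it directly or jump the user to the right part of the GUI.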
Document summarization: TL;DR ("too long; didn't read") our 20,000+ pages of technical publications? Don't worry, we'll forgive you for that and have you covered! One of the most common question types AI Search receives is "how to" questions about configuring or activating functions in our networking products. AI Search's GenAI functionality now generates human-like, summarized answers for many of those queries, along with links to the foundational documents its generative output is drawn from. This can be a significant time saver for network operators hunting for an answer in the documentation.
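The retrieve-then-summarize pattern behind grounded answers like this can be sketched in a few lines. This is an assumption-laden toy, not Central's pipeline: the corpus, URLs, and word-overlap scoring are invented, and a real system would pass the retrieved passages to a summarization LLM rather than returning the top passage verbatim.

```python
# Illustrative sketch only: answer a query from a documentation corpus and
# return the source links the answer was built from. The docs and scoring
# are stand-ins for a real retrieval + LLM summarization pipeline.
DOCS = [
    {"url": "https://example.com/wpa3-guide",
     "text": "To enable WPA3 on an SSID, open the WLAN profile and set the security level to WPA3."},
    {"url": "https://example.com/vlan-guide",
     "text": "VLANs are configured per port on the switch configuration page."},
]

def _overlap(query, text):
    """Count query terms that appear in the passage (toy relevance score)."""
    return len(set(query.lower().split()) & set(text.lower().split()))

def answer_with_sources(query, docs=DOCS, top_k=1):
    """Return a summary plus the URLs of the documents it came from."""
    ranked = sorted(docs, key=lambda d: _overlap(query, d["text"]), reverse=True)
    top = ranked[:top_k]
    # A production system would feed `top` to a summarization LLM here;
    # this sketch returns the best passage as the "summary".
    return {"summary": top[0]["text"], "sources": [d["url"] for d in top]}
```

Returning the source links alongside the generated text is what lets an operator verify the answer against the underlying documentation.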
Response times: Anyone who has used ChatGPT, Gemini, or Copilot understands that each query involves a trade-off. That trade-off might be the several seconds ChatGPT takes to respond to your question, the "contribution" of your question data to Gemini's data lake for future training, or the large amount of compute needed to continually train Copilot's models. We've designed, and are leveraging, multiple purpose-built LLM transformers to reduce or eliminate these trade-offs for our users. Keeping our LLMs self-contained allows us to provide faster response times and greater search performance.
Data privacy comes first
Security-first and data-privacy principles are core to what we do. They are also fundamental to good AI. Use of tools like ChatGPT has created grave privacy and security concerns among many enterprises, and rightfully so: any corporate intellectual property entered into these tools raises significant privacy and ownership issues. Our engineering teams thought deliberately about this problem and designed a solution that takes advantage of GenAI advancements without violating our security-first principles. In HPE Aruba Networking Central, we have implemented multiple locally trained and hosted LLMs to gain the language-understanding and generative qualities of GenAI without the risk of data leaks via external API queries to and from our data lake. Specifically, a dedicated language model identifies PII/CII (personal and corporate identifiable information) on the platform. This allows AI Search to better understand device- and site-name queries for more accurate answers, and it obfuscates the identified data before it reaches our training data lakes.
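A minimal sketch of what such an obfuscation step can look like, assuming simple pattern-based scrubbing: the patterns and placeholder tokens below are illustrative, and the post describes a dedicated language model doing this detection, not regexes.

```python
# Illustrative sketch only: scrub identifiable tokens from a query before it
# is logged to a training data lake. Patterns and placeholders are invented;
# the real system uses a dedicated PII/CII detection model.
import re

PATTERNS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "<EMAIL>"),                  # email addresses
    (re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"), "<IP_ADDR>"),                # IPv4 addresses
    (re.compile(r"\b(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}\b"), "<MAC_ADDR>"),  # MAC addresses
]

def obfuscate(query):
    """Replace identifiable tokens with neutral placeholders."""
    for pattern, token in PATTERNS:
        query = pattern.sub(token, query)
    return query
```

The key property is that the scrubbing happens before the query text ever enters the training corpus, so the models learn the shape of questions without memorizing who asked about which device.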
Coming to a network near you!
Generative AI is incredibly powerful, and the industry is just scratching the surface of real-world AIOps applications. We are excited about LLM-powered AI Search: it delivers a major benefit for today's HPE Aruba Networking Central users, and it is the first of many GenAI use cases we are working on as we move to the next generation of Central.
Stay tuned!
Want to experience our LLMs in action along with the other security, scaling, automation, and orchestration features HPE Aruba Networking Central has to offer? Sign up here for a future test drive.
K_Ramaswamy
Karthik Ramaswamy leads product strategy for HPE Aruba Networking. He focuses on helping customers drive their business priorities forward through network modernization delivered by Aruba ESP security, cloud, management, AIOps, automation, orchestration, policy, and user experience.