Welcome to the Exascale Era—where you’ll find new converged systems for AI and HPC workloads
It’s time to recognize both AI and HPC accomplishments in exascale computing. (And it’s about more than speed. It’s an entirely new era of supercomputing.)
Workload convergence is a defining trait of the Exascale Era and exascale technology. It's an imperative for everyone—from the largest labs to the smallest enterprises—to extract as much value and insight as possible from rapidly growing data. Succeeding requires combining compute-intensive HPC, analytics, and AI workloads in a single workflow, often in real time.
The infrastructures suited to AI and HPC workloads are also increasingly converging. For example, as AI models continue to grow, these workloads will rely more and more on underlying HPC technologies. In fact, a Gartner study released in March 2020 predicts that computational resources used in AI will increase 5x from 2018,¹ making AI the top category of workloads driving infrastructure decisions.
Emerging from this tidal wave of convergence is infrastructure that is interoperable with today's newest technologies and flexible enough to support a wide range of future processors and accelerator architectures. What's more, these new converged systems for AI and HPC workloads must operate seamlessly at scale, offer a cloud-like software environment, and deliver data at extreme scale. Taken together, it's clear why exascale isn't just about speed. It's an entirely new era of computing.
What does it mean to be exascale? It means that HPE will deliver HPE Cray supercomputers that can perform a quintillion calculations every second. That's 10¹⁸ FLOPS*—a billion billion, or 1,000,000,000,000,000,000. You'd need every single person on Earth calculating 24 hours a day for over four years to do what an exascale supercomputer can do in a single second.
* floating point operations per second
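The every-person-on-Earth comparison above is easy to check for yourself. Here's a back-of-the-envelope sketch; the population figure (roughly 7.8 billion, circa 2020) and the rate of one calculation per person per second are assumptions for illustration, not figures from the article:

```python
# Sanity-check the claim: one exascale machine-second vs. all of humanity.
EXAFLOP = 10**18               # calculations per second at exascale
POPULATION = 7.8e9             # assumed world population (~2020)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Human-seconds needed to match a single machine-second,
# assuming one calculation per person per second.
seconds_needed = EXAFLOP / POPULATION
years_needed = seconds_needed / SECONDS_PER_YEAR

print(f"About {years_needed:.1f} years of round-the-clock calculating")
```

With those assumptions the result lands at roughly four years, matching the article's figure.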
In terms of what it means for humanity, exascale supercomputing means we'll be able to make thus-far unimaginable progress across every field of inquiry—not just because of the speed, but because HPC and AI workloads are converging and exascale technology will support these kinds of questions as never before. Already we're seeing AI applied to climate modeling and simulation, cancer research, and drug discovery. Exascale technology will accelerate these and more.
Still, every compute performance milestone promises to eclipse the previous in terms of what we can accomplish. Is there something different about the exascale threshold? Yes.
The Exascale Era isn't just about a speed milestone or the few laboratories with the means to purchase a large-enough system. Today, global digital transformation and the urgent need to extract insight from rapidly growing data have prompted a major shift in computing requirements. Consider this number: International Data Corporation (IDC) predicts that worldwide data will grow 61% to 175 zettabytes by 2025.² Given this kind of growth, status quo methods no longer work. Traditionally siloed workloads such as analytics, AI, simulation and modeling, and IoT are fusing into one business-critical workflow operating at unprecedented scale and in real time. These requirements are driving the rapid evolution of supercomputing infrastructure. And they're also enabling the broader adoption of supercomputing technologies to power innovation and digital transformation across every research and commercial data center.
In the Exascale Era, everyone must digitally evolve—and everyone must grapple with massive amounts of data. The same technologies that power the planet’s largest supercomputers will also power single cabinets for every level of business and research.
Acknowledging accomplishments past, present, and future
The Exascale Era supporting AI workflows is poised to change science, research, and our daily lives in every way. But we wouldn’t be standing at this threshold without the people and organizations asking what if, why not, and what’s next. These scientists, researchers, and visionaries are the ones changing the world for the better with computational science.
It's all part of why we celebrated Exascale Day on October 18 (10/18, a nod to 10¹⁸). It marked an official day of recognition honoring science, supercomputing, and the people using both to extend what's possible. Because when brilliant minds and brilliant technology converge, we'll solve critical, global problems that matter to all of us.
Want to learn more?
Download this technical whitepaper: Accelerating AI adoption for the modern data-driven enterprise
Tami Wessley
Hewlett Packard Enterprise
twitter.com/HPE_AI
linkedin.com/showcase/hpe-ai/
hpe.com/us/en/solutions/artificial-intelligence.html
¹ Gartner Predicts the Future of AI Technologies
² Data Age 2025: The Digitization of the World From Edge to Core
Tami is the marketing play lead for HPE artificial intelligence, responsible for delivering consistent, on-strategy messaging while leading the orchestration and execution of key activation moments, thought leadership, and programs designed to support the company’s overall AI business objectives.