Supercomputers aid COVID-19 relief in real time
Learn how scientists and researchers worldwide have banded together in response to the COVID-19 virus. The portfolio of tools available to them is supersized, as supercomputing sites have made high-performance compute time available to COVID-19 researchers.
The most rewarding aspect of building supercomputers is seeing the work people do with these impressive machines, and that has never been truer than during the current COVID-19 pandemic. Scientists and researchers in the United States and around the globe are rallying in response to the novel coronavirus, using high performance computing to enhance our understanding of the virus and search for treatments for the disease.
In support of these efforts, supercomputing sites worldwide have made high-performance computing time available to these researchers.
A great example of this is the White House COVID-19 High Performance Computing Consortium, a collaboration between the U.S. federal government, private enterprise, research universities, and supercomputing leaders, which has arranged for over 30 supercomputers, plus public cloud resources, to be used in support of this research. The Consortium has been reviewing research proposals and matching them against available computing resources, with over 25 active projects running on millions of CPU cores and tens of thousands of GPUs.
HPE is a member of the White House COVID-19 High Performance Computing Consortium. HPE is providing software and applications expertise free of charge to help researchers port, run, and optimize essential applications to combat this pandemic.
Scientists and researchers are racing to understand the virus and how it interacts with the human body, and they are looking for solutions that can change the course of the pandemic. Supercomputing systems are key tools in this fight. From molecular modeling to epidemiology, bioinformatics, proteomics, data analytics, machine learning, and more, supercomputing gives researchers the means to accelerate their time to insight and to shorten the time it will take to deliver COVID-19 therapies, vaccines, and more rapid test results.
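One reason supercomputers shorten time to insight is that workloads such as virtual drug screening are embarrassingly parallel: each candidate molecule can be scored against a protein target independently, so adding compute resources speeds up the search almost linearly. The sketch below illustrates that pattern on a single machine with Python's `multiprocessing`; the `binding_score` function is a hypothetical stand-in, since a real pipeline would call a molecular docking engine rather than this toy calculation.

```python
from multiprocessing import Pool

def binding_score(molecule: str) -> float:
    """Placeholder for a docking/affinity calculation.

    A real screening pipeline would invoke a docking engine here;
    this toy version derives a pseudo-score from the molecule name
    purely to demonstrate the parallel workflow.
    """
    return (sum(ord(c) for c in molecule) % 100) / 100.0

def screen_library(molecules, workers=4, top_n=3):
    """Score every molecule in parallel, then return the best candidates."""
    with Pool(processes=workers) as pool:
        scores = pool.map(binding_score, molecules)
    # Rank by predicted score, highest (best-binding) first.
    ranked = sorted(zip(molecules, scores), key=lambda ms: ms[1], reverse=True)
    return ranked[:top_n]

if __name__ == "__main__":
    library = [f"compound-{i}" for i in range(1000)]
    for name, score in screen_library(library):
        print(f"{name}: {score:.2f}")
```

On an HPC system the same idea is scaled out across thousands of nodes with a scheduler or MPI instead of a local process pool, but the structure, independent scoring tasks followed by a global ranking, is the same.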
- NERSC’s Cori supercomputer, a Cray XC40, is busy at work, both with the White House COVID-19 High Performance Computing Consortium and on a collaboration with City of Hope on COVID-19 genome analysis projects. A portion of NERSC’s Director’s Discretionary Reserve supercomputing time is being used to support COVID-19 research efforts. Since the creation of the consortium, NERSC has taken on more than six COVID-19-related projects that are currently underway. Blake Simmons, division director of Biological Systems and Engineering at Berkeley Lab, quoted on NERSC’s website, said, “The collaboration with City of Hope will use the computing resources of NERSC to identify small molecules that can be used to assist in the diagnosis of COVID-19, design peptide libraries to mimic virus-host interactions, and annotate the virulence of the Italian strain as compared to the Chinese strain of COVID-19.” This research involves simulating the affinity of peptides and small molecules for human proteins, modeling the physical reactions of the virus’s interactions with the human body and immune system. All of the projects in motion at NERSC are explained in detail on their website, and the system’s vast pool of compute resources allows for a wide array of therapeutic research.
- At Argonne National Laboratory, researchers using the Cray Theta supercomputer have linked up with other supercomputers from around the country, including Oak Ridge National Laboratory’s Summit supercomputer, the Comet supercomputer at the University of California-San Diego, and the Stampede2 supercomputer at the Texas Advanced Computing Center. With their combined might, these supercomputers are powering simulations of how billions of different small molecules from drug libraries could interface and bind with different viral protein regions. With several major national labs screening small-molecule compounds for antiviral drugs, the search for a cure is advancing quickly. Arvind Ramanathan, a computational biologist in Argonne’s Data Science and Learning division, says on Argonne’s website, “When we’re looking at this virus, we should be aware that it’s not likely just a single protein we’re dealing with — we need to look at all the viral proteins as a whole. By using machine learning and artificial intelligence methods to screen for drugs across multiple target proteins in the virus, we may have a better pathway to an antiviral drug.” In an effort unique among COVID-19 research projects, Argonne is also conducting a baseline simulation that predicts the outcomes of going about business as usual. This simulation would provide insight into the uncontrolled variables that may be affecting the spread of the virus in everyday life.
- University of Bristol’s Advanced Computing Research Centre is providing access to Blue Crystal, Catalyst, and Isambard, with research software engineers (RSEs) and technical staff giving their time to support the University's COVID-19 Emergency Research Group (UNCOVER). This group is focused on virus stability and survival, spread and immunity modeling, symptom development and effects, drug testing, and how to manufacture a safe and effective vaccine. One of the many projects running on HPC is an in silico screen for approved-drug repurposing, which uses computer-based methods to identify existing drugs that may interact with the virus and block its multiplication in infected cells. In another collaboration, UNCOVER is supporting an Oxford study to assess whether a new vaccine can protect healthy people from COVID-19. The vaccine combines a weakened version of a common cold virus (an adenovirus) from chimpanzees, genetically modified so that it cannot grow in humans, with a gene that makes a protein from the COVID-19 virus called spike glycoprotein, which plays an essential role in the infection pathway of the SARS-CoV-2 virus.
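The multi-target strategy Ramanathan describes, screening drugs across all of the viral proteins rather than one, can be sketched as a simple ranking problem: given a predicted binding score for each compound against each target, favor compounds whose *worst* score across targets is still good. The example below is a hypothetical illustration of that aggregation step, not any lab's actual method; the compound and target names are made up.

```python
def rank_multi_target(scores, top_n=2):
    """Rank compounds by their worst-case score across all targets.

    `scores` maps compound name -> {target name: predicted binding score,
    higher is better}. Sorting by the minimum score rewards compounds
    that bind reasonably well to *every* viral protein, rather than
    excelling against one target and failing the rest.
    """
    ranked = sorted(scores.items(),
                    key=lambda kv: min(kv[1].values()),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]

# Hypothetical predicted scores for three candidate compounds
# against two illustrative viral protein targets.
candidates = {
    "drug-A": {"spike": 0.9, "protease": 0.2},  # strong on one target only
    "drug-B": {"spike": 0.7, "protease": 0.6},  # solid on both
    "drug-C": {"spike": 0.4, "protease": 0.5},
}
print(rank_multi_target(candidates))  # drug-B ranks first: best worst-case binding
```

Real screening pipelines use far more sophisticated aggregation (and machine-learned scoring functions), but the design choice is the same: a compound's value is judged across the whole set of viral proteins, not a single one.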
In addition to these organizations, many more across the globe are working on COVID-19 research on HPC systems or committing time on supercomputers to aid in relief. While not a complete list, the following offer a cross section of those working toward a cure:
- HLRS – High Performance Computing Center, Stuttgart
- KAUST – King Abdullah University of Science and Technology
- LANL – Los Alamos National Laboratory
- LLNL – Lawrence Livermore National Laboratory
- NCSA – National Center for Supercomputing Applications
- NIAID – National Institute of Allergy and Infectious Diseases
- PSC – Pittsburgh Supercomputing Center
Leveraging compute capabilities to enable scientific collaboration across states and countries is not a new tactic for research. The donation of supercomputing time for collaborative research on this scale, however, is. With systems that push the boundaries of science and the researchers and scientists behind them racing to find a solution, we can take comfort in the hope of a more normal tomorrow.
Learn more about how HPE technology and teams are helping customers and communities combat COVID-19.
HPE Fellow and CTO
VP, High Performance Computing and Mission Critical Solutions
Hewlett Packard Enterprise
Mike Woodacre is an HPE Fellow and CTO and VP of the company’s High Performance Computing and Mission Critical Solutions (HPC/MCS) business unit. He came to HPE via SGI, where he was Chief Engineer for scalable systems. Mike is responsible for driving the long-term high performance computing product roadmap and architecture, and he leads Memory-Driven Computing collaboration across HPE.