Tech has been taking some heavy losses from the coronavirus pandemic. Global supply chains have been disrupted, virtually every major tech conference scheduled for the coming months has been canceled, and supercomputer facilities have even begun preemptively restricting visitor access. But tech is striking back, and hard: day by day, more and more organizations are dedicating supercomputing power to the effort to diagnose, understand and fight back against COVID-19.
Testing for COVID-19
Before supercomputers began spinning up to find a cure, researchers were scrambling to simply diagnose the disease as cases in China’s Hubei province spun out of control.
With limited (and rapidly iterated) test kits available, Chinese researchers turned to AI and supercomputing for answers. They trained an AI model on China’s first petascale supercomputer, Tianhe-1, with the aim of distinguishing between the CT scans of pneumonic patients with COVID-19 and patients with non-COVID-19 pneumonia.
In a paper, the researchers reported nearly 80% accuracy when testing this method against external datasets, dramatically outperforming early test kits as well as human radiologists.
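For readers curious what that kind of classifier looks like in practice, below is a minimal sketch of a binary CT-scan classifier in PyTorch. The tiny network, the dummy data and every hyperparameter are illustrative assumptions made for the sake of a runnable example, not the architecture or training setup the Tianhe-1 team actually used.

```python
# Minimal sketch of a binary CT-scan classifier (COVID-19 pneumonia vs. other
# pneumonia). The small CNN, dummy batch and hyperparameters are illustrative
# assumptions, not the model described in the paper.
import torch
import torch.nn as nn

class SmallCTClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 64), nn.ReLU(),
            nn.Linear(64, 2),  # two classes: COVID-19 vs. non-COVID-19 pneumonia
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallCTClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: 8 single-channel 224x224 slices with random labels.
# In practice these would be preprocessed CT slices with radiologist labels.
images = torch.randn(8, 1, 224, 224)
labels = torch.randint(0, 2, (8,))

optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"training loss on dummy batch: {loss.item():.3f}")
```

Published COVID-19 CT models generally started from much deeper, pretrained networks and far larger labeled datasets than this toy example.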
The big gun was brought out early
One of the first systems to join the fight was the world’s most powerful publicly-ranked supercomputer: Summit. Oak Ridge National Laboratory (ORNL) pitted Summit’s 148 Linpack petaflops of performance against a crucial “spike” protein on the coronavirus that researchers believe is key to its ability to infect host cells, making it a promising target for compounds that could disable the virus. Testing how various compounds interact with key virus components can be an extremely time-consuming task, so the researchers – a team from the UT/ORNL Center for Molecular Biophysics (CMB) – were granted a discretionary time allocation on Summit, which allowed them to cycle through 8,000 compounds within a few days.
Using Summit, the research team identified 77 compounds that may be promising candidates for testing by medical researchers. “Summit was needed to rapidly get the simulation results we needed. It took us a day or two whereas it would have taken months on a normal computer,” said Jeremy Smith, director of the UT/ORNL CMB and principal researcher for the study. The researchers are preparing to repeat the study using a new, higher-quality model of the spike protein recently made available.
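The screening workload itself is conceptually simple: score each compound against the protein model independently, then rank the results and keep the strongest predicted binders. The sketch below illustrates that pattern in Python; `dock_score` is a hypothetical stand-in for a real docking or simulation engine (it just returns a synthetic score), so only the parallel scoring-and-ranking structure, not the chemistry, reflects what ran on Summit.

```python
# Hedged sketch of a high-throughput compound screen: score many candidates
# against a receptor model and keep the best binders. dock_score is a
# placeholder that returns a synthetic score so the example runs end to end.
import random
from concurrent.futures import ProcessPoolExecutor

def dock_score(compound_id: str) -> float:
    """Placeholder score: lower (more negative) = stronger predicted binding."""
    random.seed(compound_id)          # deterministic per compound for the demo
    return random.uniform(-12.0, -2.0)

def screen(compounds, top_n=77, workers=8):
    # Each compound is scored independently, which is why this kind of
    # workload parallelizes so well across a supercomputer's nodes.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(dock_score, compounds))
    ranked = sorted(zip(compounds, scores), key=lambda pair: pair[1])
    return ranked[:top_n]

if __name__ == "__main__":
    library = [f"compound_{i:05d}" for i in range(8000)]  # 8,000-compound library
    for name, score in screen(library, top_n=5):
        print(f"{name}: predicted binding score {score:.2f}")
```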
Major organizations have opened their doors – and wallets – for coronavirus computing proposals
Last week, the National Science Foundation (NSF) issued a Dear Colleague Letter expressing interest in proposals for “non-medical, non-clinical-care research that can be used immediately to explore how to model and understand the spread of COVID-19; to inform and educate about the science of virus transmission and prevention; and to encourage the development of processes and actions to address this global challenge.” Two days later, it issued another Dear Colleague Letter specifically inviting rapid response research proposals for COVID-19 computing activities through its Office of Advanced Cyberinfrastructure. As a complement to existing funding opportunities, the NSF also invited requests for supplemental funding.
Even with its quick response, though, the NSF wasn’t the first to open its pocketbook. In January, the European Commission announced a €10 million call for expressions of interest for projects that fight COVID-19 through vaccine development, treatment and diagnostics. Then, on the same day as the latest NSF Dear Colleague Letter, the Commission announced an additional €37.5 million in funding.
€3 million of this funding has already been allocated to the Exscalate4CoV (E4C) program in Italy – one of the hardest-hit countries. E4C is operating through Exscalate, a supercomputing platform that uses a chemical library of over 500 billion molecules to conduct pathogen research.
Specifically, E4C is aiming to identify candidate molecules for drugs, help design a biochemical and cellular screening test, identify key genomic regions in COVID-19 and more.
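As a rough illustration of what “identifying key genomic regions” can mean computationally, here is a small sketch that scans a multiple sequence alignment of viral genomes for highly conserved windows, which are often attractive targets for assays and drugs. It is not E4C’s pipeline; the input file name, window size and threshold are assumptions, and the alignment is presumed to have been produced beforehand with a standard tool such as MAFFT.

```python
# Illustrative conservation scan over a precomputed genome alignment
# (aligned FASTA). Not E4C's actual method; parameters are assumptions.
from Bio import AlignIO  # pip install biopython

def conserved_windows(alignment_path, window=30, threshold=0.99):
    alignment = AlignIO.read(alignment_path, "fasta")
    n_seqs = len(alignment)
    length = alignment.get_alignment_length()

    # Fraction of sequences sharing the most common base at each column.
    column_scores = []
    for col in range(length):
        bases = alignment[:, col].upper()
        most_common = max(set(bases), key=bases.count)
        column_scores.append(bases.count(most_common) / n_seqs)

    # Report windows whose average conservation clears the threshold.
    hits = []
    for start in range(0, length - window + 1):
        avg = sum(column_scores[start:start + window]) / window
        if avg >= threshold:
            hits.append((start, start + window, avg))
    return hits

# Hypothetical input file of aligned SARS-CoV-2 genomes.
for start, end, score in conserved_windows("sars_cov_2_aligned.fasta"):
    print(f"conserved region {start}-{end}: mean identity {score:.3f}")
```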
Beyond E4C, the EU also highlighted “on-demand, large-scale virtual screening” of potential drugs and antibodies at the HPC Centre of Excellence for Computational Biomolecular Research, as well as “prioritized and immediate access” to supercomputers operated by the EuroHPC Joint Undertaking.
Presumably, as the NSF and European Commission funding opportunities are leveraged, high-performance computing will play an increasingly large role in the fight against the coronavirus.
Post by Jai Krishna Ponnappan