Last month, the White House announced the launch of the Covid-19 High Performance Computing Consortium. The initiative aims to give researchers around the world access to some of the most powerful computing resources available and support their efforts to combat the ongoing coronavirus pandemic. The supercomputer alliance was formed between public and private agencies and institutions including the US Department of Energy National Laboratories, the National Science Foundation, NASA, IBM, Google, Microsoft, and the Massachusetts Institute of Technology (MIT).
Researchers involved in Covid-19 related studies are encouraged to submit proposals to the consortium so they can be matched with compatible computing resources provided by one of the partners. A panel of scientists and computing experts works alongside proposers to allocate computing assets and evaluate each project's public health benefits. Above all, the supercomputers provided by the consortium will facilitate the analysis of vast amounts of data and shorten the time scientists need to answer complex questions.
What can supercomputers do to fight coronavirus?
In general, supercomputers have thousands, if not tens of thousands, of processors that work in concert to execute massive calculations, and they often run artificial intelligence (AI) algorithms to handle huge amounts of information. The consortium has gathered 30 such systems, which together deliver over 400 petaflops of computing capacity. One petaflop of computing power, which typically costs around $2-3 million, corresponds to 1,000 trillion operations every second.
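The figures above are easy to sanity-check with some back-of-envelope arithmetic. A minimal sketch, using only the numbers quoted in the article (the per-petaflop cost is the article's rough range, not a precise price):

```python
# 1 petaflop = 10^15 floating-point operations per second.
PETAFLOP_OPS_PER_SEC = 10**15

consortium_petaflops = 400        # combined capacity across the ~30 systems
cost_per_petaflop_usd = (2e6, 3e6)  # the article's rough $2-3 million range

# Combined throughput of the consortium's pooled systems.
total_ops_per_sec = consortium_petaflops * PETAFLOP_OPS_PER_SEC
print(f"Combined throughput: {total_ops_per_sec:.1e} ops/second")

# Implied hardware value at the quoted cost range.
low = consortium_petaflops * cost_per_petaflop_usd[0]
high = consortium_petaflops * cost_per_petaflop_usd[1]
print(f"Implied value: ${low/1e6:.0f}M to ${high/1e6:.0f}M")
```

At 400 petaflops, that works out to 4x10^17 operations per second, and roughly $800M to $1,200M of hardware at the quoted per-petaflop cost.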
More specifically, according to IBM, its most powerful supercomputer, Summit, allowed researchers at the Oak Ridge National Laboratory and the University of Tennessee to examine 8,000 compounds in search of the ones most likely to bind to the coronavirus's proteins and neutralize its infectiousness. Summit's computational power sped up the discovery of 77 promising small-molecule drug compounds that can now be tested experimentally to confirm their viability. Summit can deliver such findings within days, whereas an ordinary computer, if it could handle the job at all, might take months.
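At its core, this kind of screen is a ranking problem: score how well each candidate compound binds to a viral protein, then keep only the top scorers for lab testing. A toy sketch of that filtering step, with hypothetical random scores standing in for the docking simulations a real supercomputer would run (the compound names, score range, and scoring function here are all invented for illustration):

```python
import random

random.seed(42)  # reproducible toy data

# Hypothetical: assign each of 8,000 compounds a mock "binding score"
# (lower = tighter binding). A real screen computes these with
# molecular docking simulations, not random numbers.
compounds = {f"compound_{i:04d}": random.uniform(-12.0, 0.0)
             for i in range(8000)}

# Keep the 77 best-scoring candidates for experimental follow-up,
# mirroring the counts reported for the Summit screen.
top_hits = sorted(compounds, key=compounds.get)[:77]

print(f"{len(top_hits)} of {len(compounds)} compounds advance to lab testing")
```

The expensive part in practice is computing the scores, which is why thousands of processors working in parallel compress months of work into days; the final ranking and cut-off is comparatively trivial.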
Meanwhile, MIT’s supercomputing systems, Satori and Supercloud, not only have huge processing power but also come with extra-large memory. The institution believes they will be useful for analyzing images from cryo-electron microscopy, a technique that uses an electron microscope to examine materials at extremely low temperatures. At such temperatures, atoms move more slowly, making the images clearer.
What can we expect from the consortium?
The MIT team, for instance, is developing a “decoy” receptor that could be taken as medicine against Covid-19. The coronavirus makes people ill by binding to ACE2 receptors in the body; the decoy would lure the virus into binding with the fake protein rather than the real one, preventing infection. The challenge in developing such a biologic drug is ensuring the decoy does not bind to other proteins in the body, which could cause adverse side effects. This is where supercomputers come in: relying on lab methods alone, the drug would probably take months to realize, but the MIT research team now expects to test a decoy in mice by June.
Thus far, the consortium has received more than 50 research proposals from regions ranging from Europe and South Africa to Saudi Arabia and India. Of these, 18 research teams have been given free access to supercomputing resources and are now working on their projects. One of them is the German startup Innoplexus, which focuses on AI and drug development. The company is running permutations on five molecules that show potential to be turned into medication targeting Covid-19. Hopefully, a low-dose but highly effective treatment can be created as soon as possible.