Exascale supercomputers will revolutionise industry

The fastest supercomputers today can perform a quadrillion calculations per second. For the uninitiated, a quadrillion is a one followed by 15 zeros and carries the metric prefix 'peta.' The next step up from 'peta' is 'exa,' a one followed by 18 zeros, meaning an exascale machine would be a thousand times faster than today's fastest supercomputers.
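To make that scale concrete, here is a minimal back-of-the-envelope sketch. The workload size is an invented illustration, not a figure from this article:

```python
# Back-of-the-envelope comparison of petascale vs exascale speeds.
PETA = 10 ** 15  # one quadrillion operations per second (petascale)
EXA = 10 ** 18   # one quintillion operations per second (exascale)

# Hypothetical workload of 10^21 operations (illustrative only).
workload_ops = 10 ** 21

print(f"Petascale machine: {workload_ops / PETA / 3600:.1f} hours")
print(f"Exascale machine:  {workload_ops / EXA / 3600:.2f} hours")
# The exascale system is 1,000x faster, so the same job finishes
# in one-thousandth of the time: ~278 hours shrinks to ~17 minutes.
```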

But do we need supercomputers?
The question is whether our civilization really needs a computer that fast. And if we do build such a super-supercomputer, what kinds of world problems will it solve?

A safer, healthier world
For instance, in a typical cancer study today, millions of measurements are taken from the biopsy of a single tumor. The result is an enormous volume of data, and today's computers are simply not geared up to analyze and decipher it. An exascale supercomputer could make sense of this information and potentially bring cures within reach.

Or, for that matter, take earthquake prediction, where mathematicians are introducing finer mesh refinement into simulations of seismic wave propagation. Current earthquake assessments run on petascale supercomputers and are still far from predicting the timing of such events accurately. This is a problem of exascale proportions, requiring a billion billion calculations per second, and solving it could save millions of lives from earthquakes and tsunamis.
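To see why finer meshes push such simulations toward exascale, consider a rough scaling sketch. It assumes explicit time stepping, where stability ties the time step to the grid spacing; the numbers are illustrative, not from any specific seismic code:

```python
# Rough cost scaling for refining a 3-D seismic wave simulation.
# Assumes the time step must shrink in proportion to the grid
# spacing (a common stability requirement); illustrative model only.

def relative_cost(refinement: float) -> float:
    """Compute-cost multiplier when grid spacing shrinks by `refinement`."""
    cells = refinement ** 3   # 3 spatial dimensions => 8x cells per halving
    steps = refinement        # smaller time step to stay numerically stable
    return cells * steps

for r in (2, 4, 8):
    print(f"{r}x finer mesh -> ~{relative_cost(r):,.0f}x more compute")
# 2x finer -> ~16x, 4x -> ~256x, 8x -> ~4,096x: resolution gains
# quickly outrun petascale machines, making this an exascale problem.
```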

Along the same lines, scientists believe we face intractable problems in climate science, renewable energy, genomics, and geophysics. Some ideas include models that couple fuel chemistry with combustion engines to help reduce pollution, or drastically improved weather-prediction models whose spatial granularity increases to one square kilometer per grid cell, compared with the roughly 400 square kilometers we have today, as the sketch below illustrates.
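The weather-model numbers in the article are enough for a quick estimate of the extra compute that jump demands. This simplified sketch ignores vertical levels and changes in model physics:

```python
# How much more compute does a 1 km^2 weather grid need versus
# today's 400 km^2 cells? (Simplified illustrative estimate.)
import math

coarse_cell_km2 = 400.0  # current cell area cited in the article
fine_cell_km2 = 1.0      # target cell area

columns = coarse_cell_km2 / fine_cell_km2  # 400x more grid columns
linear = math.sqrt(columns)                # 20x finer in each direction
timesteps = linear                         # time step shrinks with spacing

print(f"~{columns * timesteps:,.0f}x more compute")  # ~8,000x
```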

Who are the big players?
A global race is on between the US, China, the European Union and Japan over who will get the first exascale supercomputer out. It is undoubtedly more benign than the nuclear arms race of the last century. If anything, it is beneficial, because these monster machines will enable all kinds of advanced scientific research for the benefit of all humankind.

What are the challenges?
But reaching the exascale regime is a tremendous technological challenge. Aggressive changes to supercomputer components are needed to keep increasing computing speed. And speed gains alone will not make scientific applications excel; the underlying software must become much more efficient as well.
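One classic way to see why software matters as much as raw speed is Amdahl's law: whatever fraction of a program runs serially caps the speedup that more cores can deliver. A minimal sketch, where the 5 per cent serial fraction is an assumed figure for illustration:

```python
# Amdahl's law: speedup is limited by the serial fraction of a program.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (1_000, 1_000_000):
    s = amdahl_speedup(0.05, cores)  # assume 5% of the work is serial
    print(f"{cores:>9,} cores -> {s:,.1f}x speedup")
# Even with a million cores, 5% serial work caps the speedup near 20x,
# which is why exascale software must be far more parallel-efficient.
```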

The latest developments in the computing industry are guiding the design of these supercomputers. They lean heavily on parallel computing, using banks of chips to create machines with millions of processing units called 'cores.' However, an estimated 90 per cent of the power supplied goes to data management, with only the remaining 10 per cent used for computing, making these computers gigantic power-sucking machines. Power consumption is therefore the foremost problem in exascale computing: experts estimate the power requirement for such machines at between 180 and 425 megawatts, and the cost of keeping one running at over $100 million per year.
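That $100-million figure is easy to sanity-check against the stated power range. The electricity rate below is an assumed industrial price, not a figure from the article:

```python
# Sanity check on the annual power bill of an exascale machine.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.06  # assumed industrial rate in USD (illustrative)

for megawatts in (180, 425):
    kwh = megawatts * 1_000 * HOURS_PER_YEAR
    cost = kwh * PRICE_PER_KWH
    print(f"{megawatts} MW -> ${cost / 1e6:,.0f} million per year")
# 180 MW -> ~$95M, 425 MW -> ~$223M: consistent with the article's
# estimate of over $100 million per year to keep one running.
```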

When can we expect them?
America is far behind China in this supercomputing race. China has operated the world's most powerful computing machine since 2013, and last year it unveiled the world's fastest supercomputer, the Sunway TaihuLight, at about 93 petaflops. The government has set its sights on completing the world's first prototype exascale computer in 2018.

China and possibly Japan are still likely to reach the exascale promised land first. But if completed on schedule, the A21 exascale supercomputer could put the US back on the map. Intel and Cray, a US-based supercomputer maker, are building A21 together for the US, with an expected launch date of 2021. All of these nations are looking at timelines between 2020 and 2022 to put these supercomputing giants to work.

All in all, we are about to enter an era of tackling the next level of problems across various walks of life. I watched a video from HP called 'Eighteen Zeros,' whose commentary ends with a great analogy: on foot we can explore our neighborhood; with a bicycle we can ride across our city and understand it better; with a car or a motorbike we can drive across our country and explore our nation; and with an airplane we can examine the entire world. The jump from petascale to exascale computing is similar. We are ready to explore the universe now.

By: Suresh Reddy
(The author is Chairman and Managing Director of Hyderabad-based Lycos Internet Ltd)
