The march of progress?
This is an early draft
Supercomputing is a decades-old tool for creating new knowledge. What makes it relevant to today’s world? In particular, is it really worthwhile for the public to fund?
Many phenomena are difficult or impossible to test experimentally. We want to understand what happens when a disease outbreak occurs, or whether a new design for a fusion reactor will withstand the pressures expected of it. Computer simulation lets us try things out without having to build prototypes through trial and error. It’s pretty difficult to replay the Big Bang in real life, but why not attempt it in a computer?
In principle, any computer will do. The code to replay the universe is free and open source. You can run it on your laptop today.
In practice, any given computer will simply be too slow. To finish the report this month, we need data next week. Supercomputing makes this practical. It allows computer simulations to be of sufficiently high resolution quickly enough for the data to be meaningfully interpreted.
Why is high resolution important? Consider trying to identify faces in a blocky, pixelated image. There is no “zoom and enhance” for digital images: detail that was never captured cannot be recovered afterwards. The same applies to computer simulations.
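The point can be made concrete with a one-dimensional stand-in for a simulation grid. The sketch below (my own illustration, not from any particular simulation code) samples a fast-oscillating signal at two resolutions: on the coarse grid the fast signal is literally indistinguishable from a much slower one, so no amount of post-processing could tell them apart.

```python
import math

def sample(freq_hz, n_points):
    """Sample sin(2*pi*freq*t) at n_points evenly spaced over one second."""
    return [math.sin(2 * math.pi * freq_hz * i / n_points) for i in range(n_points)]

# At 8 samples per second, a 10 Hz signal aliases: its samples are
# identical to those of a 2 Hz signal, so the detail is simply gone.
coarse_10hz = sample(10, 8)
coarse_2hz = sample(2, 8)
aliased = all(abs(a - b) < 1e-9 for a, b in zip(coarse_10hz, coarse_2hz))

# At 256 samples per second, the two signals are clearly different.
fine_10hz = sample(10, 256)
fine_2hz = sample(2, 256)
distinct = max(abs(a - b) for a, b in zip(fine_10hz, fine_2hz)) > 0.5
```

The same logic applies to a weather model or a turbulence simulation: features smaller than the grid spacing do not blur, they vanish, and that is why resolution (and hence compute) matters.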
Large scientific instruments cluster researchers together. For countries to maximise their collective investments in R&D, the actors in that research ecosystem need to talk to each other.
In ten years’ time, every programmer will need to be able to write applications that use dozens of cooperating CPUs. The underlying technology to build these applications will emerge from the areas of computing that face that pressure today, namely scientific computing.
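What “cooperating CPUs” means in practice can be sketched in a few lines. This is a minimal, illustrative example (the function names and the choice of four workers are mine): a sum is split into chunks, each chunk is computed on a separate CPU core, and the partial results are combined.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of squares over a half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into chunks and sum each chunk on a separate CPU."""
    step = n // workers
    chunks = [(w * step, (w + 1) * step if w < workers - 1 else n)
              for w in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    assert parallel_sum_of_squares(10_000) == sum(i * i for i in range(10_000))
```

The hard parts of parallel programming (deciding how to split the work, and how to combine the pieces) are exactly what scientific computing has been wrestling with for decades, just at a much larger scale.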
The CPUs that live within high-performance computing clusters are the same as those in everyday servers and laptops. The compilers are the same.
…more to come