“You don’t need a computer, let alone one with 75,000 processor cores, to think about the parts of a problem”
If ever there was a bigger computing scandal than Public Health England’s handling of the pandemic statistics in late 2020, I’ve not heard of it. If you’ve been held in solitary confinement and missed the story, British health researchers undercounted the number of cases, leading to policymaker confusion and headlines about “extra deaths”.
The problem, it turned out, was Excel. To be more precise, the incoming data looked perfectly complete when opened in Excel but had, in reality, been silently cut short. This came as quite a shock to me as I followed the story in autumn 2020, because over the past few years I’ve witnessed several exemplary supercomputer projects in the medical arena, all devoted to improving understanding and efficiency in the treatment of diseases and disorders.
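To make the failure mode concrete, here is a minimal sketch of how that kind of silent loss happens. It is my own illustration, not PHE’s actual pipeline, and to_legacy_sheet is a hypothetical helper: the point is simply that the legacy .xls format caps a worksheet at 65,536 rows, so anything a feed supplies beyond that never reaches the saved file, while the spreadsheet that does open looks perfectly healthy.

```python
# A minimal sketch (not the actual PHE pipeline) of how a silent row cap
# loses data: the legacy .xls format tops out at 65,536 rows per worksheet,
# so anything beyond that simply never makes it into the saved file.

XLS_MAX_ROWS = 65_536  # hard row limit of the legacy .xls worksheet format


def to_legacy_sheet(rows):
    """Return only the rows that fit in an .xls worksheet, dropping the rest."""
    kept = rows[:XLS_MAX_ROWS]
    dropped = len(rows) - len(kept)
    return kept, dropped


# Hypothetical feed of incoming test results, one record per row.
incoming = [f"case-{i}" for i in range(80_000)]

sheet, lost = to_legacy_sheet(incoming)
print(f"Rows saved: {len(sheet):,}")  # 65,536 -- the sheet opens and looks complete
print(f"Rows lost:  {lost:,}")        # 14,464 -- silently missing, no warning given
```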
All these projects have featured very large piles of hardware indeed. The Data & Analytics Facility for National Infrastructure (DAFNI) has 75,000 cores spread across many thousands of servers, all shoehorned into a hot little brick building sited across the road from the more glamorous and famous Diamond Light Source experiment.
Then there’s ARCHER, which gave academics across the UK access to a Cray supercomputer for six successful years, starting in late 2013, before being replaced by ARCHER2 in the summer of 2020. The idea is the same as the others: throw a ton of compute at a problem and you should be able to improve on the range of answers.
And if you want to see an example of collaboration across NHS sites, consider