There has been a lot of excitement around the first picture of a black hole (and rightfully so!), and I’ve been trying to nail down the specifics of the compute infrastructure that was used (e.g., HPC? What machines? How much memory?). There was a mention of the amount of data collected (in the petabytes, I believe), but I couldn’t find any spec for how it was processed, and where. I can imagine that, given all the interactive GUI and image processing involved, maybe they did it on local machines and waited it out. However, I can also imagine there was a lot of computation needed to generate the image, which would do well on a supercomputer.
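For a sense of scale (and why I suspect at least the early stages weren’t a laptop job), here’s a rough back-of-the-envelope sketch. Every number in it is my own assumption for illustration, not a figure from the EHT papers:

```python
# Rough feasibility check: raw recordings vs. the reduced data products.
# All numbers below are assumptions for illustration, not EHT specs.

RAW_DATA_PB = 5        # assumed raw recording volume, petabytes (widely reported ballpark)
READ_MB_PER_S = 200    # assumed sustained disk throughput on one workstation
REDUCED_GB = 10        # assumed size of the correlated/averaged visibility data

raw_bytes = RAW_DATA_PB * 1e15
read_seconds = raw_bytes / (READ_MB_PER_S * 1e6)

print(f"One pass over {RAW_DATA_PB} PB at {READ_MB_PER_S} MB/s: "
      f"~{read_seconds / 86400:.0f} days")   # roughly 290 days just to read it once
print(f"Reduced visibilities (~{REDUCED_GB} GB) fit comfortably in workstation RAM")
```

If that arithmetic is anywhere close, the raw-data stages pretty much have to be cluster-scale, while the final imaging could plausibly run locally, which is exactly the ambiguity I’m trying to resolve.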
I did find this post on Reddit with links to all the papers: https://www.reddit.com/r/HPC/comments/bcndgt/imaging_a_blackhole/, and notably, the third link covers the data pipelines.
If anyone has a hint, or knows someone out there who could answer this question, the nerdlings of AskCI want to know!