When I see comparisons like this, my first thought is not the benchmarks, but rather what the most “heroic” real-world calculation of the day would have been on something like the Cray-1, and how to replicate those calculations today on something like an RPi. Weather/climate models? Rad-hydro?
The fidelity would almost certainly be super low compared to modern FEA software, but it would be a fun exercise to try.
"The staff in the T-5 group included recruited women who had degrees in mathematics or physics, as well as wives of scientists and other workers at Los Alamos. According to Their Day in the Sun: Women of the Manhattan Project, some of the human computers were Mary Frankel, Josephine Elliot, Beatrice “Bea” Langer, Augusta “Mici” Teller, Jean Bacher, and Kay Manley. While some of the computers worked full time, others, especially those who had young children, only worked part time.
General Leslie R. Groves, the Director of the Manhattan Project, pressured the wives of Los Alamos to work because he felt that it was a waste of resources to accommodate civilians. As told by Kay Manley, the wife of Los Alamos physicist John Manley, the recruitment of wives can also be traced to a desire to limit the housing of “any more people than was absolutely necessary.” This reason makes sense given the secretive nature of Los Alamos and the Manhattan Project. SEDs, a group of drafted men who were to serve domestically using their scientific and engineering backgrounds, also worked in the T division."
These are incredibly expensive even on today’s hardware. If you look through some of the unclassified ASCI reports from the early 2000s, 3D calculations of this equation set were implied to be leadership-class computations. At the time of the Cray, it must’ve been coarse-grid 1D as the standard, with 2D as the dream.
I've always been interested in this, I wonder how optimised the code was and if they used LUTs (Look Up Tables) as they did in the 80s for 3D calculations on the home computers.
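For the curious, a LUT-based approach on those home computers might have looked something like the sketch below (the table size, fixed-point scale, and function names are my own illustrative choices, not from any particular historical code):

```python
import math

# Precomputed sine lookup table, in the spirit of 80s home-computer 3D demos:
# trig is computed once up front, never at runtime. 256 entries cover a full
# circle, scaled to integers in -256..256 (8.8-style fixed point).
TABLE_SIZE = 256
SIN_LUT = [round(math.sin(2 * math.pi * i / TABLE_SIZE) * 256)
           for i in range(TABLE_SIZE)]

def lut_sin(angle):
    """Fixed-point sine; angle runs 0..255 for a full turn."""
    return SIN_LUT[angle & (TABLE_SIZE - 1)]

def lut_cos(angle):
    # cos(x) = sin(x + 90 degrees), and 90 degrees = 64 table steps
    return SIN_LUT[(angle + 64) & (TABLE_SIZE - 1)]

def rotate2d(x, y, angle):
    """Rotate an integer point with table lookups and shifts only."""
    c, s = lut_cos(angle), lut_sin(angle)
    # >> 8 undoes the fixed-point scaling of the table entries
    return ((x * c - y * s) >> 8, (x * s + y * c) >> 8)
```

A quarter turn is angle 64, so `rotate2d(256, 0, 64)` returns `(0, 256)`; on real hardware the same trick was applied per axis to build 3D rotations without a single multiply-heavy trig call.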
Oh cool, they got CrayOS working.
But still, 1 MB of RAM. I remember getting the slow-RAM 512 KB upgrade for my Amiga 500 in the early 90s.
One of the early customers was the European Centre for Medium-Range Weather Forecasts, so, wild guess, they probably used it for medium-range weather forecasts.
FWIW, Australia used a CDC Cyber 205 for occasional weather modelling and other mathematical work in the early 1980s.
(There was a separate dedicated weather computer; this one was used for 'other' jobs like speculative weather modelling, monster group algebraic fun, et al.)
In 1980, the successor to the Cyber 203, the Cyber 205, was announced. The UK Meteorological Office at Bracknell, England was the first customer, and they received their Cyber 205 in 1981.
Numerically, I’m curious what this would have looked like. I’m talking about the governing equation set, discretization methods, data, etc. It would be a fun project to try and implement a toy model like that.
> It would be a fun project to try and implement a toy model like that.
If you really want a challenge, do it using pen, paper and a slide rule, like in the old days[1]. Just make sure to apply appropriate smoothing of the input data first[2].
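Short of pen and paper, the smallest possible version of such a toy model might be 1D linear advection with upwind differencing. This is a sketch under my own assumptions (grid size, CFL number, initial bump), not anything a historical weather code actually used:

```python
import math

# A minimal "toy weather dynamics" sketch: 1D linear advection
#   du/dt + c du/dx = 0
# on a periodic domain, discretized with first-order upwind differences.
N = 100                      # grid points
c = 1.0                      # constant advection speed
dx = 1.0 / N
dt = 0.5 * dx / c            # CFL = 0.5, safely inside the stability limit

# initial condition: a smooth Gaussian bump
u0 = [math.exp(-200.0 * (i * dx - 0.25) ** 2) for i in range(N)]

def step(u):
    # upwind update for c > 0: u_i <- u_i - (c dt/dx)(u_i - u_{i-1}),
    # with periodic wraparound via u[i-1] (Python's negative indexing)
    r = c * dt / dx
    return [u[i] - r * (u[i] - u[i - 1]) for i in range(N)]

u = u0
for _ in range(200):         # c * 200 * dt = 1.0, one full trip around the domain
    u = step(u)
```

First-order upwind is very diffusive, so after one lap the bump comes back noticeably smeared; that smearing is part of why real codes moved to higher-order schemes and spectral methods. The scheme does conserve the total of u exactly on the periodic grid, which is an easy sanity check.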
I toured an NCAR (National Center for Atmospheric Research) facility in Boulder around 1979; got to sit on a seat on their Cray-1. So yes, weather and climate calculations.
3-D rendering? We had a supercomputing club in early 90s high school. I remember creating wireframe images, uploading them to a Cray X-MP at Lawrence Livermore for the computation, and then downloading the finished results.