I have a slightly newer InfiniteReality R10000 sitting beside my desk here in Phnom Penh. A professor at Berkeley gave it to us, and somehow we managed to get it to this side of the world. It uses enough power to run an apartment block, and yes, it is noisy. I will eventually gut it and install a six-node Linux cluster inside.
I used these boxes in Hong Kong in the early '90s, then later in Japan in the late '90s, for a number of projects, and I was still using an Indy as my main desktop until 2001.
SGI created hardware that almost took your breath away; you knew you were seeing a future that not many people had the privilege to see in person back then. To me, having the box sitting next to me every day, with the "InfiniteReality" label on top, reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...
>To me, having the box sitting next to me every day, with the "InfiniteReality" label on top, reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...
In the last 3-5 years, there has been a clear revival of the cyberpunk subculture online. Many related hobbyist websites have appeared, new cyberpunk-inspired independent art, music, and games are being made, new communities have formed, etc.
Themes include a general nostalgia for the '80s, especially vintage computers, and for the early '90s, pre-Web-1.0 Internet.
The reason? We can see it clearly: the lost future that never came...
>The future is already here — it's just not very evenly distributed.
Nobody is denying it; indeed, we see many related developments.
However, it's not only about "the lost future that never came"; I think it's also that people are increasingly alienated by the current state of computing. Primarily, it's about the lost mentality of the future.
Cyberpunk promised a future in which computing is the disruptive technology. Around 2006, the ever-increasing clock speeds came to a halt. Since 2013, the general performance of Intel processors has barely improved. PC sales keep declining. No major breakthrough in practical operating systems has been made beyond Unix (http://herpolhode.com/rob/utah2000.pdf).
Cyberpunk promised a future where megacorps and governments become increasingly oppressive toward the population through technology; that has occurred as predicted. Cyberpunk also promised that "cyberspace" would be the electronic frontier where technologists reclaimed liberty... Today, the damn web browser that runs all the crap written in Electron and JavaScript is hardly the "frontier", nor are Facebook, Twitter, and the endless timeline of buzz, which make up 90% of Internet activity. Also, from 2000-2013, decentralization was killed, not built. More importantly, today we don't even know how to waste time on the Internet anymore (https://news.ycombinator.com/item?id=17068138). Still, that frontier has arguably emerged to some extent, but progress has been slow; even basic tech like HTTPS was only widely deployed after Snowden.
A future where anything seemed possible and all of it was magical is definitely gone. But the new generation of developers, armed with decentralization, P2P, cryptography, and trustless systems, whether or not they succeed, may yet bring the Internet back to its cyberpunk ideals, revive the dream, and move history forward.
The recent game EXAPUNKS is definitely in this vein, and worth playing, although it's built around assembly-language programming puzzles, so it might be a bit of a busman's holiday for a lot of HN readers.
I don't think most software can scale to 10 GPUs out of the box. AFAIK, it would be hard even to find a motherboard that could fit them. However, a company could conceivably buy a workstation with 256GB of RAM, a 32-core Threadripper CPU, and four Nvidia 2080 Tis. That would definitely put you a few years ahead of the average "consumer PC" or next-generation console.
Sidenote: I've read that John Carmack and id Software liked to develop on workstations that were "ahead of the curve" in that way. It gave them an edge, in that they were able to develop future games for hardware that didn't yet exist, knowing that consumer PCs would eventually catch up.
I think what made these SGI computers really amazing is that there was no such thing as accelerated 3D graphics in the consumer market at the time (or much real-time 3D, for that matter). They also had a cool Unix operating system with a UI that was way ahead of anything you could get on a consumer PC. I can also imagine it was a much more comfortable development environment than, say, MS-DOS, which didn't even have multitasking.
That's a good point, but it undersells it. My favorite thing about them was the single system image, which removed bottlenecks and complexity for the developer at the same time. The PCs were using slow or redundant buses to connect high-speed components. SGI removed the redundancies, used fast interconnects (GB/s), made them low-latency (microseconds vs. milliseconds), and NUMA-enabled them. The last part meant that sending data to other nodes didn't take middleware like MPI: you might just do a load or store instruction as on a single node. The hardware took care of communication with cache coherency. You did have to design for good locality to minimize moving data across nodes, though.
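For anyone who hasn't programmed one of these, here's a minimal sketch of what "just do a load and store" means in practice. It uses Linux's libnuma on an ordinary multi-socket box (compile with -lnuma); the node choice and buffer size are purely illustrative assumptions, nothing SGI- or NUMAlink-specific:

```c
/* Minimal sketch: memory physically placed on a remote NUMA node,
 * accessed with ordinary loads and stores. Link with -lnuma.
 * Node choice and size are illustrative assumptions. */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA not available on this system\n");
        return 1;
    }

    size_t n = 1 << 20;               /* ~1M doubles, arbitrary size */
    int remote = numa_max_node();     /* pick the highest-numbered node */

    /* Allocate the buffer on that (possibly remote) node... */
    double *data = numa_alloc_onnode(n * sizeof *data, remote);
    if (!data) {
        fprintf(stderr, "numa_alloc_onnode failed\n");
        return 1;
    }

    /* ...but access it like any other memory: plain loads and stores.
     * The cache-coherent hardware moves the cache lines; there is no
     * MPI_Send/MPI_Recv and no explicit copy. Locality still matters
     * for speed, which is the "design for good locality" caveat above. */
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        data[i] = (double)i;   /* store to remote memory */
        sum += data[i];        /* load from remote memory */
    }
    printf("sum = %f\n", sum);

    numa_free(data, n * sizeof *data);
    return 0;
}
```

The point is that the communication is invisible in the code; the interconnect does it a cache line at a time, which is exactly why locality still matters even though no explicit messaging is written.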
Add the other features like reliability (esp. hot-swapping), serviceability, and security (Trusted IRIX) and you had some incredible machines. I always wanted inexpensive hardware with hot-swap, RAID, and something like NUMAlink connecting it. Never quite got that. One company did make a NUMA system for AMD and Intel:
I guess that's the difference between a workstation that's designed for performance and versatility before cost, and a PC, which is made to be affordable first. When the PC industry started, it was very much about repurposing whatever low-cost CPUs and off-the-shelf components were available, and finding ways of packaging this into a semi-usable machine for less than $1000. Things have changed quite a bit since, but many of the do-it-cheap, rushed-to-market compromises are still with us.
Sure, the PCs that started the PC industry — things like the Apple I and the MITS Altair — were indeed "about repurposing whatever low-cost CPUs and off-the-shelf components were available, and finding ways of packaging this into a semi-usable machine for less than $1000." But long before 1993, most CPUs and components used in PCs were being produced specifically for PCs, with uses in things like lab equipment, industrial control, and workstations forming a rather smaller secondary market.
The company I work for builds machines like that for film post-production. I've been writing software for one with 8 Titan Xp GPUs, and with its disk arrays and GPU expansion chassis it's about the size of an Onyx, and draws considerably more power :-)
Unfortunately I wouldn’t say it feels like the future, more like a normal CentOS Linux desktop.
You’ll struggle to get a PC whose BIOS can handle much more than that too.
We used to build clusters for the same thing in the past, but that was largely standard supercomputing stuff, very similar to how the InfiniteReality machines were used. I believe our software once ran on Onyx machines in the dim and distant past.
So in short I wouldn’t say having loads of GPUs is enough to make it feel futuristic.
1) find some graphics problems which people say are not possible on any near-term hardware
2) study the algorithms and identify low level calculations which, if you could do orders of magnitude more of them, would allow you to solve the problem.
3) get a bunch of FPGAs and try to design a machine which can (very slowly) run that architecture
4) once you’ve got it working, slowly replace the FPGAs with ASICs
5) build a box with 16-64 of everything.
I would avoid polygons, since the current architectures are all extremely good at filling polygons. SDFs and raytracing are where you may find the “not on current gen” problems.
Take a look at the game Claybook. Fully raytraced visuals at 60fps against a dynamically updated SDF scene representation, complete with fluid sim for the main game mechanics. All on current hardware, including stuff most people consider “slow” these days (Xbox One).
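For anyone curious what the core loop of that kind of renderer looks like, here's a minimal sphere-tracing sketch against a toy SDF. Everything in it (the single-sphere scene, the camera, the step limits) is made up for illustration; a real SDF renderer like Claybook's is obviously far more involved:

```c
/* Minimal sphere-tracing (raymarching) sketch against a signed distance
 * field, printed as ASCII. The single-sphere scene, camera, and constants
 * are made up for illustration only. Compile with -lm. */
#include <math.h>
#include <stdio.h>

/* SDF of a unit sphere centered 3 units in front of the camera. */
static float scene_sdf(float x, float y, float z) {
    float dz = z - 3.0f;
    return sqrtf(x * x + y * y + dz * dz) - 1.0f;
}

int main(void) {
    const int W = 64, H = 32;
    for (int j = 0; j < H; j++) {
        for (int i = 0; i < W; i++) {
            /* Ray from the origin through pixel (i, j). */
            float u = (i + 0.5f) / W * 2.0f - 1.0f;
            float v = 1.0f - (j + 0.5f) / H * 2.0f;
            float len = sqrtf(u * u + v * v + 1.0f);
            float dx = u / len, dy = v / len, dz = 1.0f / len;

            /* March: each step advances by the distance the SDF
             * guarantees is empty, until we hit or give up. */
            float t = 0.0f;
            int hit = 0;
            for (int s = 0; s < 64 && t < 10.0f; s++) {
                float d = scene_sdf(t * dx, t * dy, t * dz);
                if (d < 1e-3f) { hit = 1; break; }
                t += d;
            }
            putchar(hit ? '#' : '.');
        }
        putchar('\n');
    }
    return 0;
}
```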
An easy one would be: have each GPU raytrace a (say) 320x240 scene, each offset by fractions-of-a-pixel[0] or multiples-of-a-screen from each other, then have a final GPU stitch them together into a full-res video.
0: If you do this with 60x1080 resolution, you might be able to replace the final GPU with a dumb hardware multiplexer, though that would make compositing painful at best.
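Here's a minimal sketch of that stitch step, under the assumption of four GPUs each rendering 320x240 half a pixel apart, so the final frame is twice the resolution in each axis (the buffer layout and the 2x2 factor are my own illustrative choices, not anything from a real pipeline):

```c
/* Minimal sketch of the stitch step: four low-res frames, each rendered
 * half a pixel apart, interleave into one frame at twice the resolution
 * in each axis. Buffer layout and the 2x2 factor are assumptions. */
#include <stdint.h>

#define LOW_W 320
#define LOW_H 240
#define FULL_W (LOW_W * 2)
#define FULL_H (LOW_H * 2)

/* tiles[k] holds GPU k's LOW_W x LOW_H image, k = oy * 2 + ox, where
 * (ox, oy) in {0,1} means that GPU rendered with a sub-pixel offset of
 * (ox * 0.5, oy * 0.5) low-res pixels. */
void stitch(const uint32_t *tiles[4], uint32_t *full) {
    for (int y = 0; y < FULL_H; y++) {
        for (int x = 0; x < FULL_W; x++) {
            int ox = x & 1, oy = y & 1;      /* which GPU covered this full-res pixel */
            const uint32_t *src = tiles[oy * 2 + ox];
            full[y * FULL_W + x] = src[(y / 2) * LOW_W + (x / 2)];
        }
    }
}
```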
As others have said, the whole point of these machines is that they were NOT simply creative combinations of current high-end commodity products. The whole system architecture, from the CPU, memory, I/O, to the graphics hardware, was custom designed. That's something you really don't see very much of these days.
Not in PCs, but I wonder if the iPhone would count as a largely custom designed computer. They do custom silicon for a lot of the system, and get much higher performance than commodity phones as a result. Perhaps the spiritual successor to these high end workstations in an odd way.
I think Linus mentioned something along those lines: how the work on getting Linux onto supercomputers laid the foundation for Linux on phones. It's a point Tanenbaum makes in Modern Operating Systems: what is old is new, and what is new is soon to be old; as the bottlenecks move, ideas that were abandoned get resurrected.
And adding to the magic, SGI machines were something most people would never get their hands on. A maxed-out graphics workstation today is still just an "ordinary PC", only more expensive. A (big) upgrade, but not something with magic dust.
You could, but it's unlikely to impress as much - some fundamental transitions only happen once. I think in this case the big advance was real-time 3D graphics with lighting and texture mapping. Perhaps a really fancy VR setup might have a similar effect, although to me it's less obvious that VR will be something every computer has, the way it seemed obvious in the early '90s that 3D graphics was something every computer would get much better at.
It shows up to the host computer as one really big GPU. Of course, for games you're going to get worse performance than a single Titan V, because one card can handle any game already and there's inevitably latency added by doing work over NVLink/NVSwitch. Those massive GPU products are targeted toward offline rendering or machine learning applications, not so much real-time simulation.