
I'm sure there were offline rendering and 3D graphics workstation people saying the same about the comparatively crude work he was doing in the early 90s...

Obviously both Carmack and the rest of the world have changed since then, but it seems to me his main strength has always been in doing more with less (early id/Oculus, AA). When he's working in bigger orgs and/or with more established tech, his output seems to suffer, at least in my view (possibly in his as well, since he quit both Bethesda-id and Meta).

I don't know Carmack and can't claim to be anywhere close to his level, but as someone also mainly interested in realtime stuff I can imagine he also feels a slight disdain for the throw-more-compute-at-it approach of the current AI boom. I'm certainly glad he's not running around asking for investor money to train an LLM.

Best case scenario he teams up with some people who complement his skillset (akin to the game designers and artists at id back in the day) and comes up with a way to help bring some of the cutting edge to the masses, like with 3D graphics.




The thing about Carmack in the 90s... There was a lot of research going on around 3d graphics. Companies like SGI and Pixar were building specialized workstations for doing vector operations for 3d rendering. 3d was a thing. Game consoles with specialized 3d hardware would launch in 1994 with the Sega Saturn and the Sony Playstation (Japan-only for its first year).

What Carmack did was basically get a 3d game running on existing COMMODITY hardware. The 386 chip that most people used for their Excel spreadsheets did not do floating point operations well, so Carmack figured out how to do everything using integers.
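
(A minimal sketch of the fixed-point trick, assuming the usual 16.16 layout: store numbers as scaled 32-bit integers so multiply and divide become integer ops plus shifts. Names are illustrative, not the actual id source.)

  #include <stdint.h>
  #include <stdio.h>

  typedef int32_t fixed_t;
  #define FRACBITS 16
  #define FRACUNIT (1 << FRACBITS)   /* 1.0 in 16.16 fixed point */

  /* widen to 64 bits so the intermediate product doesn't overflow */
  static fixed_t fix_mul(fixed_t a, fixed_t b) {
      return (fixed_t)(((int64_t)a * b) >> FRACBITS);
  }

  static fixed_t fix_div(fixed_t a, fixed_t b) {
      return (fixed_t)(((int64_t)a << FRACBITS) / b);
  }

  int main(void) {
      fixed_t r = fix_mul(3 * FRACUNIT, FRACUNIT / 2);  /* 3.0 * 0.5 */
      printf("%f\n", r / (double)FRACUNIT);             /* 1.500000 */
      fixed_t q = fix_div(r, FRACUNIT / 4);             /* 1.5 / 0.25 */
      printf("%f\n", q / (double)FRACUNIT);             /* 6.000000 */
      return 0;
  }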

May 1992 -> Wolfenstein 3d releases
December 1993 -> Doom releases
December 1994 -> Sony Playstation launches in Japan
June 1996 -> Quake releases

So Wolfenstein and Doom were actually not really 3d games, but rather 2.5d games (you can't have rooms below other rooms). The first 3d game here is actually Quake, which eventually also got hardware acceleration support.

Carmack was the master of doing the seemingly impossible on super constrained hardware on virtually impossible timelines. If DOOM released in 1994 or 1995, would we still remember it in the same way?


> If DOOM released in 1994 or 1995, would we still remember it in the same way?

Maybe. One aspect of Wolfenstein and Doom's popularity is that they were years ahead of everyone else technically on PC hardware. The other aspect is that they were genre-defining titles that set the standards for gameplay design. I think Doom Deathmatch would have caught on in 1995, as there really were very few (just Command and Conquer?) standout PC network multiplayer games released between 1993 and 1995.


I guess the thing about rapid change is... it's hard to imagine what kind of games would exist in a DOOMless world in an alternate 1995.

The first 3d console games started to come out that year, like Rayman. Star Wars Dark Forces, with its own custom 3d engine, also came out. Of course, Dark Forces was an overt clone of DOOM.

It's a bit ironic, but I think the gameplay innovation of DOOM tends to hold up more than the actual technical innovation. Things like BSP for level partitioning have slowly been phased out of game engines now that we have ample floating point compute power and hardware acceleration, but even developers of the more recent DOOM games have started to realize that they should return to the original formula of "blast zombies in the face at high speed, and keep plot as window dressing"


Sort of in the middle: id games always felt tight. The engines were immersive not only because of the graphics, but because the basic i/o was excellent.


> but even developers of the more recent DOOM games have started to realize that they should return to the original formula of "blast zombies in the face at high speed, and keep plot as window dressing"

There's still a lot of chatter breaking the continuity. In the original, the plot was entirely made up of what you were experiencing directly.


In the ending of the original game, you kill the demon spider brain robot thing and the demons kill your bunny rabbit. That's the plot


> Things like BSP for level partitioning have slowly been phased out of game engines

Hey, can you say more / do you have a link about this? I mean, for what reason are BSP trees phased out, and what are they replaced with? (quad/oct tree? AABB trees? or something entirely different?)


The pipeline bottlenecks all changed in favor of bruteforcing the things that BSP had been solving with an elegant precomputed data structure - what BSP was extremely good at was eliminating overdraw and getting to where the scene could render exactly the number of pixels that were needed and no more. It's optimized around small, low-detail scenes that carefully manage occlusion.

More memory, bandwidth and cache means that more of your solutions are per-pixel instead of per-vertex and you can tolerate overdraw if it means you get to have higher polycount models. Likewise, the environment collision that was leveraged by the BSP process reduced the number of tests against walls, but introduced edge cases and hindered general-purpose physics features. Scaling physics leads in the direction of keeping the detailed collision tests at their original, per-poly detail, but doing things with sorting or tree structures to get a broadphase that filters the majority of tests against AABB or sphere bounds.
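
(A rough sketch of that broadphase idea, with illustrative names: reject pairs with a cheap AABB overlap test before running the expensive per-poly narrowphase.)

  #include <stdbool.h>

  typedef struct { float min[3], max[3]; } aabb_t;

  /* cheap broadphase filter: disjoint on any axis means no contact,
     so the expensive per-triangle narrowphase can be skipped */
  static bool aabb_overlap(const aabb_t *a, const aabb_t *b) {
      for (int i = 0; i < 3; i++)
          if (a->max[i] < b->min[i] || b->max[i] < a->min[i])
              return false;
      return true;
  }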

On a Wii (the original Wii) 3D action game I helped ship, we just rendered the whole level at once, using only the most basic frustum culling technique; the hardware did the lifting, mostly through the z-buffer.
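
(For anyone curious what "most basic frustum culling" looks like, a sketch under the usual formulation, not our shipped code: test each object's bounding sphere against the six frustum planes and skip it if it's fully behind any one of them.)

  typedef struct { float nx, ny, nz, d; } plane_t;  /* normalized, inward-facing */

  static int sphere_in_frustum(const plane_t planes[6],
                               float cx, float cy, float cz, float r) {
      for (int i = 0; i < 6; i++) {
          float dist = planes[i].nx * cx + planes[i].ny * cy
                     + planes[i].nz * cz + planes[i].d;
          if (dist < -r)
              return 0;  /* fully behind one plane: cull */
      }
      return 1;          /* potentially visible: draw it, z-buffer handles the rest */
  }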


Adding to this, the nice thing about the BSP partitioning was you could also leverage it to make off-screen monsters go to sleep or reduce their tick rate. It was helpful for optimizing AI as well as rendering. DOOM not only had some of the first pseudo-3d rendering but also huge numbers of enemies... something that a lot of other games still cut down on.
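
(Something like this, hypothetically; the real games keyed it off the BSP/visibility data, but the throttling idea is just: skip most AI ticks for monsters the player can't see.)

  typedef struct {
      int visible;    /* set from the level's visibility data */
      int tick_skip;  /* run full AI only every N frames when unseen; >= 1 */
  } monster_t;

  void monster_think(monster_t *m, int frame) {
      if (!m->visible && (frame % m->tick_skip) != 0)
          return;  /* dormant: cheap early-out instead of full AI */
      /* ... full pathfinding / attack logic here ... */
  }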


Read the Quake PVS article linked in this thread.

Off the top of my head, as I remember it: one of the reasons for Quake's use of a BSP was to allow back-to-front rendering of the world geometry without the use of a z-buffer. This was required to get decent performance with the software rasterizer.
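
The traversal itself is tiny, which is part of the elegance. A textbook sketch (illustrative names, not Quake's actual renderer): classify the camera against each node's splitting plane and recurse into the far side first, so polygons come out in back-to-front order with no z-buffer needed.

  typedef struct bspnode {
      float nx, ny, nz, d;           /* splitting plane */
      struct bspnode *front, *back;
      /* polygons on this node's plane would be stored here */
  } bspnode_t;

  void render_bsp(const bspnode_t *node, const float cam[3]) {
      if (!node) return;
      float side = node->nx * cam[0] + node->ny * cam[1]
                 + node->nz * cam[2] + node->d;
      if (side >= 0) {                    /* camera on the front side */
          render_bsp(node->back, cam);    /* far side first */
          /* draw this node's polygons */
          render_bsp(node->front, cam);   /* near side last */
      } else {
          render_bsp(node->front, cam);
          /* draw this node's polygons */
          render_bsp(node->back, cam);
      }
  }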

I'm not 100% sure what's most commonly used these days, but for a large open world requiring data streaming, I could see the use for something like an octree and even portals.


> The first 3d console games started to come out that year, like Rayman.

Rayman was a 2D game.


I'm mixing it up with Rayman on the Sega Dreamcast. Looking at Wikipedia, I now see that there weren't many games in 1995, even on the new consoles, that really leveraged the 3d hardware. The PC actually had more such games, despite many PCs lacking hardware acceleration for 3d rendering or even significant floating point capabilities. There's Sega Rally Championship for the Saturn, Virtua Fighter, Tekken...

Perhaps 1996 is the real turning point (with Mario 64 coming out), which makes DOOM about 3 years ahead of its time. And of course id shipped Quake that year...


It is. I have seen others reference it as an early 3d title too.

Great work by that team! Seriously. The graphics are great, and for the time period, excellent enough to be lumped in with the real 3d games.


> So Wolfenstein and Doom were actually not really 3d games, but rather 2.5 games (you can't have rooms below other rooms). The first 3d game here is actually Quake

Ultima Underworld is a true 3D game from 1992. An incredibly impressive game, in more ways than one.


The world seems to have rewritten history and forgotten Ultima Underworld, which shipped prior to Doom.


A couple of "3D" games shipped before Doom. Battlezone comes to mind.

The difference is that id owned the natural progression (from Wolf3D through Doom to Quake) and laid the foundation for what we today call the FPS genre.


I think that's because it had such high system requirements that very few people could run it, unlike Wolfenstein 3D and Doom.


It was just about playable on a 25MHz 386. We used to put up with frame rates that would make your eyes bleed back in the early 90s.

https://www.youtube.com/watch?v=3VdRXgWoShM


Exactly, I played through Underworld on a 16MHz 386. Being an RPG, the lower fps was much more tolerable than in Doom, which was in fact unplayable.


I'm sorry, but that is not very playable even when it only renders to a quarter of the screen, especially compared to Wolfenstein 3D on similar hardware. It was also quite clunky in terms of interface. The guy in the video spends like 3 minutes trying to pick up a sack on the floor before just giving up.


Hardware changes a lot in the time it takes to develop a game. When I read his .plan files and interviews, I realized he seemed to spend a lot of time before developing the game thinking about what the next-gen hardware was going to bring, then designing the best game they could think of while targeting this not-yet-available hardware.


> If DOOM released in 1994 or 1995, would we still remember it in the same way?

I think so, because the thing about DOOM is, it was an insanely good game. Yes, it pioneered fullscreen real-time perspective rendering on commodity hardware, instantly realigning the direction of much of the game industry, yadda yadda yadda, but at the end of the day it was a good-enough game for people to remember and respect even without considering the tech.

Minecraft would be a similar example. Minecraft looked like total ass, and games with similar rendering technology could have been (and were) made years earlier, but Minecraft was also good. And that was enough.


But also, he didn't do the technically hardest and most impressive part, Quake, on his own. IIUC he basically relied on Michael Abrash's help to get Quake done (in any reasonable amount of time).


Realizing that he needed Abrash (and aggressively recruiting him) could easily be seen as the most impressive thing he did to make Quake happen.


I would say his multiple technical feats and phenomenal output are more impressive.


> his main strength has always been in doing more with less

Carmack builds his kingdom and then runs it well.

It makes me wonder how he would fare as an unknown Jr. developer with managers telling him "that's a neat idea, but for now we just need you to implement these Figma designs".


A key aspect of the Carmack approach (or a similar 'smart hacker' unconventional career approach) is avoiding that situation in the first place. However, this also carries substantial career, financial and lifestyle risks & trade-offs, especially if you're not both talented enough and lucky enough to hit a sufficiently fertile opportunity in the right time window on the first few tries.

Assuming one is willing to accept the risks and has the requisite high talent plus strong work drive, the Carmack-like career pattern is to devote great care to evaluating and selecting opportunities near the edges of newly emerging 'interesting things' which also: coincide with your interests/talents, are still at a point where a small team can plausibly generate meaningful traction, and have plausible potential to grow quickly and get big.

Carmack was fortunate that his strong interest in graphics and games overlapped a time period when Moore's Law was enabling quite capable CPU, RAM and GFX hardware to hit consumer prices. But we shouldn't dismiss Carmack's success as "luck". That kind of luck is an ever-present uncontrolled variable which must be factored into your approach, not ignored.

Carmack has since shown he can get very interested in a variety of things, so I assume he filtered his strong interests to pick the one with the most near-term growth potential which also matched his skills. I suspect the most fortunate "luck" Carmack had wasn't picking game graphics in the early 90s; it was that (for whatever reasons) he wasn't already employed in a more typical "well-paying job with a big, stable company, great benefits and career growth potential", so he was free to find the opportunity in the first place.

I had a similarly unconventional career path which, fortunately, turned out very well for me (although not quite at Carmack's scale :-)). The best luck I had actually looked like 'bad luck' to me and everyone else. Due to my inability to succeed in a traditional educational context (and other personal shortcomings), I didn't have a college degree or resume sufficient to get a "good job", so I had little choice but to take the high-risk road and figure out the unconventional approach as best I could - which involved teaching myself, then hiring myself (because no one else would) and then repeatedly failing my way through learning startup entrepreneurship until I got good at it. I think the reality is that few who succeed on the 'unconventional approach' consciously chose that path at the beginning over lower risk, more comfortable alternatives - we simply never had those alternatives to 'bravely' reject in pursuit of our dreams :-).


  > "makes me wonder how he would fare as an unknown Jr. developer with managers telling him (...)"
He would probably write an open letter and leave Meta. /s





