myfavoritetings's comments | Hacker News

After reading “The Making Of The Atomic Bomb” I came away with the impression that the development of nuclear bombs was 100% inevitable after 1938, when German scientists showed that splitting the uranium atom could sustain a chain reaction. All of the top physicists in the world instantly knew this could be used to create a super weapon. Every power in WWII had a nuclear program, but only one had the resources to execute on it. Being the only country with a nuke is basically a checkmate on the world order, and game theory demands such a weapon be created once it is known to be possible.


I encourage you to try sitting in a wooden chair for 8 hours a day every day and reconsider


I do! I genuinely find it much more comfortable than any office chair I ever had.


I use a wooden chair for my home office and I find it more comfortable than office chairs, although it may not be a fair comparison since I spend most of the time squatting on it.


I do have an Aeron in my office. But since I largely stopped having video calls I confess I mostly work at my kitchen/dining room table in a wooden chair.


Up until say 70 years ago, wooden chairs were probably the norm everywhere.


Have you forgotten what primary and secondary school was like?


Kind of a dramatic title. If you're so concerned about it, couldn't you just make another proton email and link them to each other as recoveries?


> couldn't you just make another proton email and link them

I thought about this, but no, that would fully compromise the data stored in both accounts. This is because the new recovery message would be intercepted by Proton, relayed to the attacker, and then it's game over, first for the first account, and then similarly for the second account. The encryption of Proton applies only to historical messages.


Factually incorrect. If Nvidia dropped TSMC, TSMC could still sell to Apple, AMD and others. If TSMC dropped Nvidia, Nvidia would literally have no other options for cutting-edge nodes.


TSMC already sells to those companies, so I don't think that addresses the scenario. I would imagine that other chip fabs would be able to pick up the slack, including many that are already purchasing from ASML and are on track to begin matching TSMC from a node perspective in a year or two.


isn't that just mocking with extra steps?


You could call the data you generate for the tests "mocks".

But they really aren't "mocks" in the sense of behavioral mocks via IoC/DI and you don't need to manipulate them via some kind of interface in order to put them into the right state for your particular tests.

There are some extra steps, but you get extremely simple and reliable tests in return.

In many(!) cases you already have a data interface, especially with HTTP/REST APIs. All you need to do is simply not bury the IO call down the stack and maybe describe the failure conditions as plain data in your signature and voila.

(This is not a replacement for higher order testing like, manual, E2E or integration tests. But it certainly beats unit testing with mocks IMO.)
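To illustrate the "plain data instead of mocks" idea, here's a minimal sketch in Python. The names (`UserResponse`, `display_name`) are made up for illustration; the point is that the IO call stays at the edge, so tests just construct data, with no mock framework or DI container needed:

```python
from dataclasses import dataclass

@dataclass
class UserResponse:
    """Plain-data representation of an HTTP response, success or failure."""
    status: int
    body: dict

def display_name(resp: UserResponse) -> str:
    """Pure core logic: the failure condition is just data in the signature."""
    if resp.status != 200:
        return "unknown user"
    return resp.body.get("name", "anonymous")

# The thin IO shell in production would do something like:
#   resp = UserResponse(r.status_code, r.json())   # r from your HTTP client
# In tests there is nothing to mock -- just build the data directly:
assert display_name(UserResponse(200, {"name": "Ada"})) == "Ada"
assert display_name(UserResponse(404, {})) == "unknown user"
```

Because `display_name` never performs IO, the tests are deterministic and need no interface manipulation to reach a particular state.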


It is simple, really: there are more suckers paying high credit card interest than there are people collecting credit card rewards. If you always pay your bill off every month, any rewards you collect are essentially paid for by the people making interest payments.


"Isn't the more parsimonious explanation that our culture is now composed of poor critical thinkers who are poorly educated, lack sophistication and nuance, and accordingly have terrible taste?"

Hasn't this always been true though? Widespread public education systems did not exist for 99.9999% of human history. How could it be that education is more present in the world than it has ever been in history yet we somehow have worse critical thinking skills? Blaming the ills of society on education doesn't make much sense when we've had societies much longer than we've had public education.


I don't read it as blaming the education system. In my mind it's an indictment of all the trends of the past 20 years in the public education system: austerity, lack of autonomy for teachers, heavy reliance on metrics and standardized testing to establish success. What we're getting now out the other end of that is a lack of critical thinking and a reduction in the traits that can't be quantified.

Basically if our education system sucks, it's because we've spent 20 years cutting corners, cheaping out, over-relying on metrics, and enforcing top-down control over teaching and curriculum. No wonder it sucks. Our public policy has been to stamp the outliers out of the system and crush it into mediocre mush.


The illiterate weren't able to communicate outside of their immediate vicinity. They simply weren't part of the conversation


The illiterate couldn't read, but that doesn't mean they lacked critical thinking skills.


And how would they tell anyone not right next to them what they thought? Word of mouth certainly isn't an answer, ever played telephone?


Critical thinking does not depend on sharing those thoughts. Not sharing a thought does not make it less critical and vice versa.

Not to mention that, by your train of thought, critical thinking must not have existed before literacy became a thing.


Why do you equate widespread public education systems with amount of critical thinking?


I didn't, the comment I responded to did "poor critical thinkers who are poorly educated"


> Blaming the ills of society on education doesn't make much sense when we've had societies much longer than we've had public education

Why? Education may easily have a negative impact. Modern education was created to teach people to read instructions on how to operate factory machinery. Critical thinking is not needed for that. In fact, a society without much critical thinking is easier to work with.


I think what they're saying is an 82 year old is more likely to make it to 83 than an 81 year old is


AI is a lot more than LLMs; just because they don't have a public LLM doesn't mean they don't have a competent AI strategy. Most of their advancements are around image and video. A lot of companies that are "leaders" in AI right now have no path to profitability for their features because they're not actually that useful.


> just because they don't have a public LLM

They have released OpenELM.

https://machinelearning.apple.com/research/openelm

https://huggingface.co/apple/OpenELM


They absolutely do not have a competent AI strategy and are many years behind the competition.


FWIW, most games before the early 2000s built all their tooling from scratch, as there weren't off-the-shelf engines to use. Unreal Engine came out in '98 and Source in 2004.


Jedi Knight 2 came out in 2002 and the original Call of Duty came out in 2003, both running heavily modified versions of John Carmack's Quake III engine



Valve originally started in 1996 by licensing the Quake engine for Half-Life.


id sold a few Doom and a bunch of Quake 1/2/3 licenses back in the day. Off the top of my head: Heretic and Hexen used the Doom engine, their sequels used the Quake and Quake 2 engines respectively. Strife was an FPS RPG that was Doom based. Half-Life started out as a HEAVILY modified Quake engine and rumor has it that there is still a bit of Quake code in Source. Duke Nukem Forever started out on Quake before moving to Unreal.


Call of Duty and Medal of Honor both used id Tech 3 (aka the Quake 3 engine), as did some of the Star Wars games, etc. Unfortunately, The Source Engine came out around the time of id Tech 4, and it really took the reins, with Unreal Engine hitting a stride thereafter. https://en.wikipedia.org/wiki/Id_Tech_3#Games_using_a_propri...


I heard an interview about the Doom port to (I think) the 3DO, and it cost them a one-time payment of $50K cash to outright buy the Doom assets + a game engine license.


There were game engines in the 90s -- Quake Engine, Build Engine, Unreal Engine.. to name some of the popular ones.

You also had GoldSrc, a modified Quake Engine for Half-Life 1.

Many companies created their own engines that were not available outside the business. The Dark Engine, used in the Thief: The Dark Project games, is an example. Think of companies like Nintendo having their own engine which powered games like Mario 64 and, I believe, Ocarina of Time as a modified version of that.

The "off the shelf" engines available at that time (like Unreal or Quake engine) might have been decent for certain type of games.

Today, you could say many games could be done with Unity, Godot, or Unreal. There are still companies today using their own.


Wasn’t Quake 1 reasonably off the shelf for its time? Released in ‘96, it did have a map editor and a number of games were built with it.

Source itself being built from Quake 1…


idTech 2 was specifically built for Quake 1, and only later was it licensed to other developers as well. So it was not an off-the-shelf solution id could simply take for building Quake. It was tailored for that game.


They specifically built Quake to sell the engine. They knew it would be difficult to actually sell an engine without a successful title.


Quake 3 was probably the first time id really built a game to demonstrate an engine for the sake of engine sales. And then again with Rage.


Interesting, I’ve never heard about that. Do you have a source?


I don't think they specifically understood the idea of a "game engine" as the core product at the time. But there are plenty of references if you Google a bit that Quake was designed for modders due to the popularity of DOOM mods - so developer experience was absolutely taken into account from the start.

They had already done licensing deals for the DOOM engine at that point, including the greatest game of all time "Chex Quest."

https://en.wikipedia.org/wiki/Chex_Quest


Yeah, this is why Quake's logic for a lot of game things - monsters, weapons, moving platforms - is written in a byte-code interpreted language (QuakeC). The idea was to separate it from the engine code so modders could easily make new games without needing access to the full engine source. (And QuakeC was supposed to be simpler as a language than C, which it... is, but at the cost of weird compromises like a single number type (float) which is also used to store bitfields by directly manipulating power of two values. Which works, of course, until your power of 2 is big enough to force the precision to drop below 1...)
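The float-bitfield pitfall described above is easy to demonstrate. Here's a sketch in Python (not QuakeC), using the standard `struct` module to emulate 32-bit float storage; the helper `to_f32` is made up for illustration:

```python
import struct

def to_f32(x: float) -> float:
    """Round-trip a number through a 32-bit float (QuakeC's only number type)."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Small bitfields survive fine: a float32 has 24 bits of integer precision.
flags = (1 << 0) | (1 << 3) | (1 << 10)
assert to_f32(flags) == float(flags)

# But past bit 24 the spacing between representable floats exceeds 1,
# so setting a low bit alongside a high one silently loses it:
big = to_f32(1 << 25)
assert to_f32(big + 1) == big  # the low "flag" just disappears
```

This is exactly the "precision drops below 1" failure mode: integers up to 2^24 are exact in a float32, and beyond that, adjacent representable values are 2 or more apart.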


Keep in mind that they had developed a good relationship with Raven Software and made a good chunk of money off of their use of idTech 1 for Heretic.


Won't really work on consoles like PS1 and PS2.


I think some of the early FPS engines were re-used, with Rise of the Triad using the Wolfenstein 3d one.


Engines that come to mind: XnGine (Daggerfall, Terminator, ...), Dark Engine (Thief, System Shock 2), Build (Duke 3D, Blood...). Yes, they existed before the 2000s, but the difference to today is that there were many engines being reused for a handful of games at most. Today it's few engines running most games.


Battlefield, GTA, Alan Wake, MS Flight Sim, Cyberpunk, Hogwarts Legacy (Unreal): all top games with different engines. Agree that Unreal Engine has many games, but there are plenty of alternatives.


That's a pretty small selection of well-known AAA games. Those few examples really don't change the general skew towards using 3rd party engines these days vs. few games doing something like that in the 90s. And in fact, most of these engines have also been reused between games (with heavy modifications of course - e.g. Remedy's Northlight engine has been evolving since Alan Wake 1).


The Build engine from Duke Nukem 3D was also used in many other games, like Blood. Same for the Quake engine. Even the Doom engine was used in games like Hexen, and Doom itself was an evolution of the Wolfenstein engine. Quake 1 through the later Quake engines were all evolutions and were used in a lot of other games. These are all 3D engines. On the NES, SNES and Sega machines, the same platform engine was reused in 1000s of games. Same for sound engines, physics engines, etc. My point is, I don't think there is a lot of difference. Innovation is still happening today. Not everything is Unreal.


> On the nes, snes and sega machines the same platform engine was reused in 1000s of games.

No, this isn't true. Almost all games were bespoke back in that era - the machines simply weren't powerful enough to allow for competitive, flexible game engines. An individual development house might have a code library or base that they'd iterate on, but there was little sharing between different companies (much less reuse by the thousands).

By the 16-bit era sound engines did tend to be widely reused, though (e.g. GEMS).


Honestly it seems like it always has: there are a handful of dev houses using their own engine for a spread of games (e.g. EA with Frostbite, Ubi with Anvil, Rockstar with RAGE, Bungie with whatever they call the Halo/Destiny engine these days), then UE or Unity are out there mass licensed for a whole bunch of stuff, then the few less widely licensed engines like Source.


That's not how it always has been. Licensing engines was virtually unknown 30 years ago (and when it happened at all it was within a very narrow range for making games in the same genre, more like asset swaps and level packs than outright new games), and new and exciting in the '00s.


Yeah sorry I meant from, like, 2000-ish and on.


I wanted to mention the odd adventure game Normality as using the Doom engine, but I can't find a source


I'm pretty sure Normality ran on the in-house Gremlin Interactive engine that powered the excellent Realms of the Haunting.


+1 for the Daggerfall mention. I haven't thought about that game in 1000 years and loved it.


You’ve reminded me of Return to Castle Wolfenstein. It was very limited in terms of maps and possibly multiplayer only. It was fantastic.


That reminds me of how Torque [0] debuted in 2001 using Tribes 2.

[0] https://en.wikipedia.org/wiki/Torque_(game_engine)


Torque was pretty instrumental in kick-starting my career. Glad it's still being maintained to this day (despite it arguably getting steamrolled by Unity and mismanagement).


That's not quite true. There were plenty of 90s game engines. They just weren't as general purpose. So you would end up rewriting a good chunk anyways to get what you wanted.


They managed to do this to Quake.

https://m.youtube.com/watch?v=SGknUiihpkI


id Tech engines, RenderWare and a few other big ones were for sure available and used at the time. Earlier, during the 90s, it was a different situation, however.


The Quake 3 engine was incredibly popular in the early 2000s.

