2) I thought the issue was compilation from TS -> JS. So what's the issue? I remember live coding in JS like 15 years ago. Have dev environments just gotten weird and convoluted?
No idea. My working theory is that the gotta-go-fast folks at V8 don't care about live hot reload because it doesn't give them ad revenue, so they just wontfixed the whole thing, and there's no alternative on the backend if you want TypeScript.
This doesn't make sense, because you can definitely do live coding in Chrome devtools (which uses V8, of course) and in Node.js (--watch reloads code changes automatically, and there are ways to keep application state across code changes).
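To be clear, node --watch just restarts the whole process, so in-memory state is lost on each change; keeping state takes a bit more work, e.g. a long-lived process that re-imports only the changed module. A rough TypeScript sketch, assuming Node 18+ with ES modules and a hypothetical handlers.js whose default export takes the state object:

  // The long-lived process owns the state; only the handler module is reloaded.
  import { watch } from "node:fs";
  import { pathToFileURL } from "node:url";

  const state = { counter: 0 };                        // survives reloads
  let handler: (s: typeof state) => void = () => {};

  async function load() {
    // cache-busting query forces a fresh copy of the module on each import
    const url = pathToFileURL("./handlers.js").href + "?v=" + Date.now();
    handler = (await import(url)).default;
  }

  await load();
  watch("./handlers.js", () => {                       // may fire more than once per save
    load().then(() => console.log("reloaded handlers, state kept:", state));
  });

  setInterval(() => handler(state), 1000);

Bundler HMR (Vite, webpack's module.hot) is a fancier version of the same idea; the primitive is just swapping modules while the process keeps running.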
This is supposed to be used in place of CUDA, HIP, Metal, Vulkan, OpenGL, etc... It targets the hardware directly, so it doesn't need to be supported as such.
The site also seems to clearly state it's a work in progress. It's just an interesting blog post...
In many languages "Easter" is a variation of Passover: "Pâques" in French, for example, Pascha in Greek and Russian, Pascua in most Romance languages. In others (some Slavic languages) it's a variation of "Great Day" ("Velikonoce" in Czech, for example). And so on. English and German are the only languages I know with a weird word for Easter.
Edit - also, Passover is Pesach in Hebrew. So Pesach -> Pascha in Greek, then onwards to other languages. Or just "Great Day" (which caps off Great and Holy Week, which is what the week leading up to Easter is called in most languages).
There's a lot of good stuff on the internet. More information than we had access to as kids. Better to raise them to be responsible, well-adjusted humans than to shield them from reality.
Do they need access to that information at the age of 4 though? Almost certainly not. They don't even have basic reading proficiency until around age 7 or 8. Kindergarten is still mostly focused on phonics.
This is completely wrong. Kids can easily learn to read at age 5. A child who is working on "basic reading proficiency" at 8 is very behind and has not been well-served by the people responsible for raising them.
AMD is appropriately valued IMO, Intel is undervalued, and Nvidia is wildly overvalued. We're hitting a wall with LLMs, and Nvidia was at one point valued higher than Apple, which is insane.
Also, CUDA doesn't matter that much. Nvidia was powered by intense AGI FOMO, but I think that frenzy is more or less done.
Nvidia is valuable precisely because of the software, which is also why AMD is not so valuable. CUDA matters a lot (though that might become less true soon). And Nvidia's CUDA/software forward thinking most certainly predated AGI FOMO and is the CAUSE of them doing so well with this "AI boom".
It's also not wildly overvalued, purely on a forward PE basis.*
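(Forward P/E = share price ÷ expected earnings per share over the next ~12 months, as opposed to trailing earnings.)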
I do wonder about the LLM focus, specifically whether we're designing hardware too much for LLMs at the cost of other ML/scientific computing workflows, especially with the focus on low-precision ops.
But..
1) I don't know how a company like Nvidia could feasibly not focus on designing for LLMs in the midst of this craziness without being sued by shareholders for negligence or something
2) they're able to roll out new architectures with great improvements, especially in memory, on a 2-year cycle! I obviously don't know the counterfactual, but I think that without the LLM craze, this generation of GPU/compute chips would be behind where it is now.
I think it's possible AMD is undervalued. I've been hoping forever that they'd somehow catch up on software. They do very well in the server business, and if Intel continues fucking up as much as they have been, AMD will own CPU/servers. I also think what DeepSeek has done may convince people it's worth programming closer to the hardware, somewhat weakening Nvidia's software moat.
*Of course, it's possible I'm not discounting enough for the geopolitical risk.
> It's also not wildly overvalued, purely on a forward PE basis.*
Once you start approaching a critical mass of sales, it's very difficult to keep growing them. Nvidia is being valued as though they'll reach a trillion dollars' worth of sales per year. So nearly 10x growth.
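Rough arithmetic on that (the current-revenue figure is a ballpark assumption on my part, not an exact number):

  ~$1T implied annual sales ÷ ~$100-130B annual sales today ≈ 8-10x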
You need to make a lot of assumptions to explain how they'll reach that, versus a ton of risk.
Risk #1: the arbitrage principle, a.k.a. wherever there's profit to be made, other players will move in. AMD has AI chips that are doing quite well, Amazon and Google both have their own AI chips, Apple has their own AI chips... IMO it's far more likely that we'll see commodification of AI chips than that the whole industry will do nothing and pay Nvidia's markup. Especially since TSMC is the one making the chips, not Nvidia.
Risk #2: AI is hitting a wall. VCs claim it isn't so, but it's pretty obvious that it is. We went from "AGI in 2025" to AI companies essentially adding traditional AI elements to LLMs to make them useful. LLMs will never reach AGI; we need another technological breakthrough. Companies won't be willing to keep buying every generation of Nvidia chip for ever-diminishing returns.
Risk #3: Geopolitical, as you mentioned. Tariffs, China, etc...
Risk #4: CUDA isn't a moat. It was when no one else had the incentive to create an alternative, and it gave everyone on Nvidia a head start. But everything runs on AMD now too. Google and Amazon have obviously figured out something for their own accelerators.
The only way Nvidia reaches enough revenue to justify their market cap is if Jensen Huang's wild futuristic predictions become reality AND the Googles, Amazons, Apples, AMDs, Qualcomms, MediaTeks, and every other chip company all fail to catch up.
What I see right now is AI hitting a wall and the commodification of chip production.
I've used Linux exclusively for 15 years, so that's probably why my experience is so positive. Both Intel and AMD are pretty much flawless on Linux; drivers for both are in the kernel nowadays, and AMD just wins slightly with their iGPUs.
Yet my AMD APU was never properly supported for hardware video decoding, and could only do up to OpenGL 3.3, while the Windows 10 driver could go up to OpenGL 4.1.
Of course. Most of the time I'm searching for a physical place, a company's website, a product, or news. ChatGPT is terrible at giving any of those answers. It's rare that I want to know some sort of random fact. ChatGPT also doesn't give sources like, say, Wikipedia.