
fonts.google.com has 1816 different font families that are all open-source. So no, I wouldn't steal a font when there's so many available for free.


There are many movies available for free


Some of them even use those free fonts https://www.youtube.com/watch?v=jVhlJNJopOQ


You could always use regular JS and have that...


1) over my dead body and 2) it doesn’t make sense for it to be possible in JS and not possible in TS.


2) I thought the issue was compilation from TS -> JS. So what's the issue? I remember live coding in JS like 15 years ago; have dev environments just gotten weird and convoluted?


No idea. My working theory is the gotta go fast folks at V8 don’t care about live hot reload because that doesn’t give them ad revenue, so they just wontfixed the whole thing and there is no alternative on the backend if you want TypeScript.


This doesn't make sense because you can definitely do live coding in Chrome devtools (which uses V8, of course) and in Node.js (--watch will reload code changes automatically, and there are ways to keep application state through code changes).

Edit - can also apparently do it with TS directly with Deno (also V8), here's an example: https://dev.to/craigmorten/how-to-code-live-browser-refresh-...
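
For anyone who wants to try it, here's a minimal sketch of the Node.js side (the filename and port are just placeholders; --watch landed in Node 18.11):

  // server.mjs -- run with: node --watch server.mjs
  // --watch restarts the process whenever a watched file changes, so
  // in-memory state is lost on reload unless you persist it elsewhere.
  import { createServer } from "node:http";

  createServer((req, res) => {
    res.end("hello\n"); // edit this string and save; Node restarts automatically
  }).listen(3000, () => console.log("listening on :3000"));

The Deno equivalent is deno run --watch --allow-net server.ts, which takes TypeScript directly; the linked post covers layering live browser refresh on top.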


I’ll be very happy to be wrong here! Looking at these, thanks


I don't quite get this comment.

This is supposed to be used in place of CUDA, HIP, Metal, Vulkan, OpenGL, etc... It's targeting the hardware directly, so it doesn't need to be supported as such.

The site also seems to clearly state it's a work in progress. It's just an interesting blog post...


In many languages "Easter" is a variation of Passover: "Pâques" in French for example, Pascha in Greek and Russian, Pascua in most Latin languages. In others (some Slavic languages) it's a variation of "Great Day" ("Velikonoce" in Czech, for example). And so on. English and German are the only languages I know with a weird word for Easter.

Edit - also Passover is Pesach in Hebrew. So Pesach -> Pascha in Greek, then onwards to other languages. Or just "Great Day" (which finishes off Great and Holy Week, which is what the week leading up to Easter is called in most languages).


In Croatian, Easter is called Uskrs, meaning Resurrection, but Good Friday is called Veliki Petak, meaning Great Friday.


My bad, went by Google for that one (they listed Croatian as similar to Czech, which I know a bit).


Same in Russian - Velikaya Pyatnica, Great Friday.


In Norwegian and Danish it is påske and in Swedish it is påsk.


> unable to listen even 4-minute pop hits from start to finish

Maybe the kids have realised that those pop hits are repetitive and uninteresting...


Or their attention span is simply destroyed by social media and YouTube.

Plenty of pop music is excellent. But if you only give it 30 seconds, you will never find out whether that's true.


I can't decide if username checks out or not...


To be fair, 30 seconds of Bohemian Rhapsody is about the maximum I've ever been able to handle!


Even when watching Wayne's World?


I had no clue what that was. Had to Google it; I thought it was a video game. So, no :)


There's a lot of good stuff on the internet. More information than we had access to as kids. Better to raise them to be responsible, well adjusted humans than to shield them from reality.


Do they need access to that information at the age of 4, though? Almost certainly not. They don't even have basic reading proficiency until around age 7 or 8. Kindergarten is still mostly focused on phonics.


> They don't even have basic reading proficiency until around age 7 or 8.

Some develop advanced reading proficiency well before that, and the more they read the more likely that is to become the case.

Every child is different, but the more you assume they can't do certain things, the more you'll find that to be self-fulfilling.

(That said, that makes an argument for extensive access to reading materials, not videos.)


This is completely wrong. Kids can easily learn to read at age 5. A child who is working on "basic reading proficiency" at 8 is very behind and has not been well-served by the people responsible for raising them.


Yes, but still I would argue that books are the more appropriate reading material :)


Yeah, 100% with you on that.


Did you time travel from 2015 or something? Haven't heard of anyone having AMD issues in a very long time...


I’ve been consistently impressed with AMD for a while now. They’re constantly undervalued for no reason other than CUDA from what I can tell.


AMD is appropriately valued IMO, Intel is undervalued, and Nvidia is wildly overvalued. We're hitting a wall with LLMs, and Nvidia was at one point valued higher than Apple, which is insane.

Also CUDA doesn't matter that much, Nvidia was powered by intense AGI FOMO but I think that frenzy is more or less done.


What?!

Nvidia is valuable precisely because of the software, which is also why AMD is not so valuable. CUDA matters a lot (though that might become less true soon). And Nvidia's CUDA/software forward thinking most certainly predated AGI FOMO; it is the CAUSE of them doing so well with this "AI boom".

It's also not wildly overvalued, purely on a forward PE basis.*

I do wonder about the LLM focus, specifically whether we're designing hardware too much for LLM at the cost of other ML/scientific computing workflows, especially the focus on low precision ops.

But: 1) I don't know how a company like Nvidia could feasibly not focus on designing for LLMs in the midst of this craziness and not be sued by shareholders for negligence or something, and 2) they're able to roll out new architectures with great improvements, especially in memory, on a 2-year cycle! I obviously don't know the counterfactual, but I think without the LLM craze, this generation of GPU/compute chips would be behind where it is now.

I think it's possible AMD is undervalued. I've been hoping forever they'd somehow catch up on software. They do very well in the server business, and if Intel continues fucking up as much as they have been, AMD will own CPUs/servers. I also think what DeepSeek has done may convince people it's worth programming closer to the hardware, somewhat weakening Nvidia's software moat.

*Of course, it's possible I'm not discounting enough for the geopolitical risk.


> It's also not wildly overvalued, purely on a forward PE basis.*

Once you start approaching a critical mass of sales, it's very difficult to keep growing it. Nvidia is being valued as though they'll reach a trillion dollars worth of sales per year. So nearly 10x growth.

You need to make a lot of assumptions to explain how they'll reach that, versus a ton of risk.

Risk #1: the arbitrage principle, aka wherever there's profit to be made, other players will move in. AMD has AI chips that are doing quite well, Amazon and Google both have their own AI chips, Apple has their own AI chips... IMO it's far more likely that we'll see commodification of AI chips than that the whole industry will do nothing and pay Nvidia's markup. Especially since TSMC is the one making the chips, not Nvidia.

Risk #2: AI is hitting a wall. VCs claim it isn't so, but it's pretty obvious that it is. We went from "AGI in 2025" to AI companies essentially adding traditional AI elements to LLMs to make them useful. LLMs will never reach AGI; we need another technological breakthrough. Companies won't be willing to keep buying every generation of Nvidia chips for ever-diminishing returns.

Risk #3: Geopolitical, as you mentioned. Tariffs, China, etc...

Risk #4: CUDA isn't a moat. It was when no one else had the incentive to create an alternative, and it gave everyone on Nvidia a head start. But now everything runs on AMD too. Google and Amazon have obviously figured out something for their own accelerators.

The only way Nvidia reaches enough revenue to justify their market cap is if Jensen Huang's wild futuristic predictions become reality AND the Googles, Amazons, Apples, AMDs, Qualcomms, Mediateks and every other chip company all fail to catch up.

What I see right now is AI hitting a wall and the commodification of chip production.


Not really. I don't want to just re-paste everything, but basically this: https://news.ycombinator.com/item?id=43688088 where I also sort of address your 2015 mention here.


Ah, Windows OEM nonsense...

I've used Linux exclusively for 15 years, which is probably why my experience has been so positive. Both Intel and AMD are pretty much flawless on Linux; drivers for both are in the kernel nowadays. AMD just wins slightly with their iGPUs.


Yet my AMD APU was never properly supported for hardware video decoding, and could only do up to OpenGL 3.3, while the Windows 10 driver could go up to OpenGL 4.1.


Weird. Was it pre-Zen?

I had a Ryzen 2700U that was fully supported, latest OpenGL and Vulkan from day 1, hardware decoding, etc... but on Linux.


Nice to see, considering Crystal 1.0 didn't really have anything in the way of parallelism...


Of course. Most of the time I'm searching for a physical place, a company's website, a product, or news. ChatGPT is terrible at giving any of those answers. It's rare that I want to know some sort of random fact. ChatGPT also doesn't give sources like, say, Wikipedia.


Not sure about HTMX specifically, but I've used it with Rails/Hotwire/Stimulus (similar conceptually) and it works great.
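
If it helps to see the conceptual similarity, here's a minimal Stimulus controller sketch in TypeScript (the "hello" controller and its names are made up for illustration; assumes @hotwired/stimulus is installed). Like HTMX's hx-* attributes, behavior is wired up declaratively through data-* attributes in the HTML:

  // hello_controller.ts -- a minimal sketch, not production code.
  // The matching HTML (attribute-driven, much like HTMX):
  //
  //   <div data-controller="hello">
  //     <button data-action="click->hello#greet">Greet</button>
  //     <span data-hello-target="output"></span>
  //   </div>
  import { Controller } from "@hotwired/stimulus";

  export default class extends Controller {
    static targets = ["output"];
    declare readonly outputTarget: HTMLElement; // typed accessor generated by Stimulus

    greet() {
      this.outputTarget.textContent = "Hello from Stimulus";
    }
  }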

