Hacker News | dismalaf's comments

This doesn't implement all of Ruby. It's easy to make a language that looks like Ruby run fast. It's hard to make a CRuby-compatible Ruby fast (all the dynamic features add a ton of overhead).
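For a sense of why: Ruby's open classes mean any method can be redefined at runtime, so an implementation can't just statically inline call sites. A minimal sketch of the same problem, written in Python since it shares that open-class dynamism (the Point class here is purely hypothetical):

    # Hypothetical illustration: why open-class dynamism defeats naive
    # ahead-of-time optimization. Any inlined copy of `norm` can be
    # invalidated at runtime, which is why fast CRuby-compatible
    # implementations need guards and deoptimization paths.
    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

        def norm(self):
            return (self.x ** 2 + self.y ** 2) ** 0.5

    p = Point(3, 4)
    print(p.norm())  # 5.0 -- the "obvious" target a compiler might inline

    # Monkey-patch the class: every existing instance now dispatches to
    # the new method, so the inlined old body would silently be wrong.
    Point.norm = lambda self: abs(self.x) + abs(self.y)
    print(p.norm())  # 7 -- dynamic lookup is mandatory without guards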

I've got a De'Longhi Dedica Duo, which makes much better coffee than it has any right to... And it's very cheap.

You can even adjust the temperature and shot water volume...


Acetaminophen and ibuprofen are taken for different things and can be taken together as well; it's not a versus...

Ibuprofen is mostly for inflammation and acetaminophen for fever and pain. There's overlap in that both work on headaches and some other kinds of pain, but the main use case for each is different.


AMD's Ryzen AI 400 is built on TSMC 4nm and Panther Lake is on Intel's 18A, so Intel is literally a generation ahead for this product cycle and wins hands down...

But is the GPU good for anything? I'm used to Intel being completely crap and AMD actually being able to run games if you can live without 8K and 400 fps.

Also how's Linux support for either?


Intel GPUs have never been that bad... In the past they were simply small because the expectation was they'd be paired with a dGPU.

The Core series has GPUs on par with or even slightly bigger than most of the Ryzen AI series (look up benchmarks and articles).


> they were simply small

But that's bad in my book. I'm happy with 720p on low detail once in a while if it's a laptop. What I remember is Intel GPUs being unable to do even that.

If they caught up with AMD iGPUs, that's great. I don't do desktop replacement laptops; I prefer the ones I can hold in one hand. The dGPU is in my desktop.


> What I remember is Intel GPUs being unable to do even that.

Maybe in the days of HD Graphics... I've got a Tiger Lake chip, and the Xe iGPU outperforms the laptop's 3050 dGPU in actual games (due to having access to waaaaaay more RAM).


Well, it's hard to erase a crap reputation.

During the previous chip crisis, when COVID and crypto were waning, I needed a new desktop. Damned if I was going to buy a discrete video card at those prices, so I went with AMD integrated graphics. Didn't even stop to look at Intel.

(For the record, said desktop has a discrete GPU now, but I bought it like 2 years after I built the desktop.)


Dunno, AMD (and ATI) had a poor reputation for years and only rehabbed it when they released the Zen architecture in 2017. Even then, it took a few generations to be seen as better than Intel.

Intel has had a positive reputation for the vast majority of the time from 1990 up to now, with only the last few years being bad.


> Intel has had a positive reputation

For integrated GPUs? Both sides were crap until Zen. I just didn't notice that Intel had caught up.

> from 1990 up to now

For CPUs they're fortunately still playing catch-up with each other. You remember when AMD released Zen, but do you remember when Intel released the Core stuff, and how crap Pentium 4 / NetBurst was right before that?


I'm just saying that people buy the best thing and memories are short; no one really cares what brand their CPU is...

Forget fractional scaling; just keep the scale at 100% and increase font and icon sizes.
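For example, on GNOME (an assumption on my part; the suggestion above names no desktop environment), a couple of standard gsettings keys do this while the display scale stays at 100%. A rough sketch:

    # Hedged sketch, assuming GNOME: scale fonts and the cursor via
    # standard org.gnome.desktop.interface keys, leaving the display
    # scale itself untouched at 100%.
    import subprocess

    def gset(schema: str, key: str, value: str) -> None:
        # Shell out to the gsettings CLI (ships with GNOME).
        subprocess.run(["gsettings", "set", schema, key, value], check=True)

    # Scale all fonts by 1.25x without touching the display scale.
    gset("org.gnome.desktop.interface", "text-scaling-factor", "1.25")
    # Bump the cursor so it isn't tiny next to the larger text.
    gset("org.gnome.desktop.interface", "cursor-size", "32")

Icon sizes tend to be per-application, so those usually need adjusting individually.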

Which is something like two-thirds successful in my experience (I use this daily), and requires tons of fiddling to get things looking even mostly reasonable (lots of misalignments and funky padding otherwise). And lots of applications don't respect it, so you're stuck with too-small controls when it fails. Which makes it a noticeably worse success rate than fractional scaling, afaict.

I still use it because the end result on some of my most-used applications is nicer, and it seems to be slightly-noticeably better performing (on a high framerate screen). So it's good enough for my tastes. But it really isn't anything I'd call "successful".


> hamstring this thing with Intel.

Have you missed all the recent Intel news or something?


What's the news recently?

Well, Intel was kind of in the dumps because their process fell behind. They didn't bet on EUV and got leapfrogged by TSMC and Samsung, who did use ASML's EUV technology.

They eventually got on the EUV train and were the first customer to receive ASML's current state-of-the-art machine, which ASML calls high-NA EUV. Intel's 18A process is the first to use this machine as part of manufacturing, and Panther Lake uses this process, so now they're right back to being SOTA.

All the news about them (stock price movements, theories about them going bankrupt, Panther Lake, etc.) for the last 2 years has essentially been people betting on whether or not they can successfully incorporate SOTA ASML machines into their manufacturing.


Gotta be honest, I have. I'm still living in a world where AMD is superior, but that may not be the case today?

For laptops, Lunar Lake and Panther Lake addressed many issues and brought x86 power consumption to Apple Silicon levels.

Is that true? Many other comments in this thread are saying Apple Silicon has something like 30% better efficiency/battery life.

Apple Silicon had a process node advantage over the Core 100 and 200 series due to having a better allocation with TSMC.

Now Intel's process node is also SOTA and on par with TSMC 2nm, so they should be more or less equivalent, with the remaining differences down to what set of compromises each company makes in the design of its chips.


I don't see why it wouldn't. I have a 16" MSI laptop with an 11th-gen Intel processor (known for horrible battery life); I use Arch/Hyprland, and it gets 5-6 hours on a battery degraded to 68% capacity. That's still in the ballpark of what most users said they got on Windows when this model was new.

Linux battery life is fine, and on par with (or possibly better than) Windows these days, if you don't do anything silly (I'm sure some distros and DEs consume silly amounts of power just because, but it doesn't have to be that way).

Based on reports about Panther Lake, the new process, plus a 13" screen and large-ish battery, I believe the battery life claims.


No, because most of the estimates are wonky as hell. For one, calories from silage don't translate directly to calories humans can make use of. Second, most estimates are worst-case only and ignore the fact that most animals are pastured for some or all of the year on marginal land. Some animals can survive entirely on foraging and agricultural waste (pigs are a great example).

On the other side, not all climates can produce all the plants required for a balanced vegan diet. Here in Canada, nothing grows for 6 months of the year, and what does grow is relatively limited.

The lowest-energy system would likely include a reduction in animal products but not a complete elimination, while keeping transportation to a minimum.

Also, just like with energy generation, there's the game theory aspect. If you reduce emissions, will everyone cooperate? What if you suffer only to have someone else increase their emissions anyway? We see this here... We limit our fisheries to try to preserve ocean fish, only for Chinese vessels to sit just outside our waters hoovering up all the aquatic life...


Fn keys usually double as media keys, so I use them a lot, as do most laptop users I know.

It'll increase the size of the case by a small amount, but a battery cell is a battery cell... Rip open an old device and you'll see.
