See, the funny thing is, even with all of this stuff I hear about Intel (and agree with as reported), I also committed a cardinal sin just recently.
I'm old, i.e. "never buy ATI" is something I've stuck to since the very early Nvidia days: I switched from Matrox and Voodoo to Nvidia while commiserating with and witnessing friends' and colleagues' ATI woes for years.
My high-end gaming days are long gone; there was even a stretch of laptops where 3D graphics was of no concern whatsoever, and I happened to have Intel chips with integrated graphics. I could even start up some games I'd missed over the years, or replay old favourites, just fine, since even a business laptop's Intel integrated graphics chip was fine for it.
And then I bought an AMD-based laptop with integrated Radeon graphics: with all the negative stuff you hear about Intel, and AMD itself being fine, sometimes even better, I thought it was fair to give it a try.
Oh my, was that a mistake. AMD Radeon graphics is still the old ATI in full-blown problem glory. I guess it's going to be another 25 years until I might make that mistake again.
It's a bummer you've had poor experiences with ATI and later AMD, especially on a new system. I have an AMD laptop with a Ryzen 7 7840U, which includes a Radeon 780M for integrated graphics, and it's been rock solid. I've tested many old and new titles on it, albeit at medium-ish settings.
Built a PC with a top-of-the-line AMD CPU, and it's great. AMD APUs are great in dedicated gaming devices like the Xbox One, PS4 and PS5, and the Steam Deck.
On the other hand, I still think of the Intel integrated GPU as "that thing that screws up your web browser chrome if you have a laptop with dedicated graphics".
AMD basically stopped supporting (including updating drivers for) pre-RDNA GPUs (in particular GCN), while such GPUs were still part of AMD's Zen 3 APU offerings.
Well, back when, literally 25 years ago, when it was all ATI, there were constant driver issues with ATI. I think it's a pretty well-known thing. At least it was back then.
I did think that, given ATI was bought out by AMD and AMD itself is fine, it should be OK. AMD always was. I've had systems with AMD CPUs and Nvidia GPUs back when it was an actual desktop tower gaming system I was building/upgrading myself. Heck, my basement server is still an AMD CPU system with zero issues whatsoever. Of course, it's got zero graphics duties.
On the laptop side, for a time I'd buy something with discrete Nvidia cards when I was still gaming more actively. But then life happened, so graphics was no longer important and I do keep my systems for a long time / buy non-latest gen. So by chance I've been with Intel for a long time and gaming came up again, casually. The Intel HD graphics were of course totally inadequate for any "real" current gaming. But I found that replaying some old favs and even "newer" games I had missed out on (new as in, playing a 2013 game for the very first time in 2023 type thing) was totally fine on an Intel iGPU.
So when I got to newer titles and the Intel HD graphics no longer cut it, but I'm still not a "gamer" again, I looked at a more recent system and thought I'd be totally fine trying an AMD one. Exactly like another poster said: "post-2015 should be fine, right?! And then there's all this recent bad news about Intel, this is the time to switch!"
Still iGPU. I'm not going to shell out thousands of dollars here.
And then I get the system, I get into Windows and ... everything just looks way too bright, washed out, hard to look at. I doctored around, installed the latest AMD Adrenalin driver, played around with brightness, contrast, HDR, color balance, tried to disable the Vari-Brightness I read was supposed to be the culprit, etc. It gets worse once you get into a game. Like, you're in Windows and it's bearable. Then you start a game, Alt-Tab back to do something, and everything is awfully, weirdly bright, and it doesn't go away when you shut down the game either.
I've stuck with it and kept doctoring for over six months now.
I've had enough. I bought a new laptop, two generations behind, with an Intel Iris Xe, for the same amount of money as the ATI system. I open Windows and ... everything is entirely, totally, 150% fine, no need to adjust anything. It's comfortable, colors are fine, brightness and contrast are fine. And the performance is entirely adequate, about the same as the AMD system. Again, still an iGPU, and that's fine and expected. It's the quality I'm concerned with, not the performance I'm paying for. I expect proper-quality software and hardware even if I pay for less performance than gamer-kid me was willing to back then.
It's a Lenovo. FWIW, one thing I really didn't like either was finding out that AMD really tries to hide which actual GPU is in there.
Everything just reports it as "with Radeon graphics", including benchmarking software, so it's almost impossible to find anything about it online.
The only thing I found that helped was GPU-Z. Maybe it's just one of the known bad ones and everything else is fine, and "I bought the one lemon from a prime steak company", but that doesn't change that my first experience with the lemon-company-turned-prime-steak-company is ... another lemon ;)
It's a Lucienne C2 apparently. And again, performance-wise, exactly what I expected. Graphics quality and AMD software? Unfortunately exactly what I expected from ATI :(
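For what it's worth, if anyone else hits the same "with Radeon graphics" wall, here's a minimal sketch of how you could pull the PCI vendor/device IDs without installing GPU-Z. It assumes Windows with PowerShell on the PATH and uses the standard Win32_VideoController WMI class; treat it as illustrative, not a polished tool:

    # Hypothetical helper: ask WMI for each display adapter's PNP device ID and
    # pull out the PCI vendor/device pair, which pins down the actual silicon
    # even when the marketing name is just "AMD Radeon(TM) Graphics".
    import re
    import subprocess

    ps = subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         "Get-CimInstance Win32_VideoController | "
         "Select-Object -ExpandProperty PNPDeviceID"],
        capture_output=True, text=True, check=True,
    )

    for line in ps.stdout.splitlines():
        m = re.search(r"VEN_([0-9A-F]{4})&DEV_([0-9A-F]{4})", line, re.I)
        if m:
            vendor, device = m.groups()
            # Look the pair up in a PCI ID database to get the real chip name.
            print(f"vendor=0x{vendor} device=0x{device}")

Searching for that device ID (it's the same one GPU-Z shows) tends to turn up far more useful results online than searching for "Radeon Graphics".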
And I'm not alone: when I look online, what you find is not just Lenovo, so I do doubt it's that. All, and I mean all, of the laptops I'm talking about here were Lenovos, including back when they were called IBM ThinkPads and just built by Lenovo ;)
Laptops have really gone to hell in the past few years. IMO the only sane laptop choices remaining are Framework and Apple. Every other vendor is a mess, especially when it comes to properly sleeping when closing the lid.
I bought an AMD Ryzen Thinkpad late last year, and I had the same issue with bright/saturated colours. I fixed it by running X-Rite Color Assistant which was bundled with the laptop, and setting the profile to sRGB. I then turned up the brightness a little.
I think this is a consequence of the laptop having HDR colour, and the vendor wanting to make it obvious. It's the blinding blue LED of the current day.
Yeah, I read HDR might be the issue. I didn't know about X-Rite, and it did not come with my laptop, but I did play with disabling / trying to adjust HDR, making sure sRGB was set, etc. Did not help. I also ran all the calibrations I could find for gamma, brightness and contrast many, many times to try to find something that was better.
What I settled on for quite some time was manually adjusted color balance and contrast and turning the brightness down. That made it bearable, but especially right next to another system it's just "off" and still washed out.
If this is HDR and one can't get rid of it, then yeah, agreed, it's just bad. I'm actually surprised you'd turn the brightness up; having the brightness too high was one of the worst things for me. Felt like it was burning my eyes.
If the diagnosis is that AMD GPUs can't do HDR properly then yes. There was not a single setting anywhere in Windows itself nor the Adrenalin driver software that allowed me to configure the screen to a comfortable setting. Even when specifically trying to disable anything HDR related.
My work MacBook, on the other hand, has zero issues with HDR and its display.
To be fair, you can still blame the OEM of course but as a user I have no way to distinguish that, especially in my specific situation.
I think I found X-Rite by just searching for color with the start menu.
Before I used that tool, I tried a few of the built-in colour profiles under the display settings, and that didn't help.
I had to turn the brightness up because when the display is in sRGB it gets dimmer. Everything is much more dim and muted, like a conventional laptop screen. But if I change it back to say, one of the DICOM profiles, then yeah, torch mode. (And if I turn the brightness down in that mode, bright colours are fine but dim colours are too dim and everything is still too saturated).
AMD is appropriately valued IMO, Intel is undervalued, and Nvidia is wildly overvalued. We're hitting a wall with LLMs, and Nvidia was at one point valued higher than Apple, which is insane.
Also, CUDA doesn't matter that much; Nvidia was powered by intense AGI FOMO, but I think that frenzy is more or less done.
Nvidia is valuable precisely because of the software, which is also why AMD is not so valuable. CUDA matters a lot (though that might become less true soon). And Nvidia's CUDA/software forward thinking most certainly predated AGI FOMO; it is the CAUSE of them doing so well with this "AI boom".
It's also not wildly overvalued, purely on a forward PE basis.*
I do wonder about the LLM focus, specifically whether we're designing hardware too much for LLMs at the cost of other ML/scientific computing workflows, especially the focus on low-precision ops.
But..
1) I don't know how a company like Nvidia could feasibly not focus on designing for LLM in the midst of this craziness and not be sued by shareholders for negligence or something
2) they're able to roll out new architectures with great improvements, especially in memory, on a 2-year cycle! I obviously don't know the counterfactual, but I think without the LLM craze, the current generation of GPU/compute chips would be behind where it is now.
I think it's possible AMD is undervalued. I've been hoping forever they'd somehow catch up on software. They do very well in the server business, and if Intel continues fucking up as much as they have been, AMD will own CPUs/servers. I also think what DeepSeek has done may convince people it's worth programming closer to the hardware, somewhat weakening Nvidia's software moat.
*Of course, it's possible I'm not discounting enough for the geopolitical risk.
> It's also not wildly overvalued, purely on a forward PE basis.*
Once you start approaching a critical mass of sales, it's very difficult to keep growing. Nvidia is being valued as though they'll reach a trillion dollars' worth of sales per year, so nearly 10x growth.
You need to make a lot of assumptions to explain how they'll reach that, versus a ton of risk.
Risk #1: the arbitrage principle, aka wherever there's profit to be made, other players will move in. AMD has AI chips that are doing quite well, Amazon and Google both have their own AI chips, Apple has their own AI chips... IMO it's far more likely that we'll see commodification of AI chips than that the whole industry will do nothing and pay Nvidia's markup. Especially since TSMC is the one making the chips, not Nvidia.
Risk #2: AI is hitting a wall. VCs claim it isn't so, but it's pretty obvious that it is. We went from "AGI in 2025" to AI companies essentially adding traditional AI elements to LLMs to make them useful. LLMs will never reach AGI; we need another technological breakthrough. Companies won't be willing to keep buying every generation of Nvidia chips for ever-diminishing returns.
Risk #3: Geopolitical, as you mentioned. Tariffs, China, etc...
Risk #4: CUDA isn't a moat. It was when no one else had the incentive to create an alternative, and it gave everyone on Nvidia a head start. But everything runs on AMD now too. Google and Amazon have obviously figured out something for their own accelerators.
The only way Nvidia reaches enough revenue to justify their market cap is if Jensen Huang's wild futuristic predictions become reality AND the Googles, Amazons, Apples, AMDs, Qualcomms, Mediateks and every other chip company all fail to catch up.
What I see right now is AI hitting a wall and the commodification of chip production.
I've used Linux exclusively for 15 years, which is probably why my experience is so positive. Both Intel and AMD are pretty much flawless on Linux, drivers for both are in the kernel nowadays, and AMD just wins slightly with their iGPUs.
Yet my AMD APU was never properly supported for hardware video decoding, and could only do up to OpenGL 3.3, while the Windows 10 driver could go up to OpenGL 4.1.
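In case it helps anyone compare notes, here's a minimal sketch of how you could dump what the open-source stack actually exposes for a given APU. It assumes a Linux box with mesa-utils (glxinfo) and libva-utils (vainfo) installed; purely illustrative:

    # Sketch: print the OpenGL version/renderer Mesa reports and the VA-API
    # profiles available for hardware video decode.
    import subprocess

    def run(cmd):
        try:
            return subprocess.run(cmd, capture_output=True, text=True,
                                  check=True).stdout
        except (OSError, subprocess.CalledProcessError) as exc:
            return f"<{cmd[0]} not available: {exc}>"

    for line in run(["glxinfo", "-B"]).splitlines():   # -B = brief summary
        if ("OpenGL renderer string" in line
                or "OpenGL version string" in line
                or "core profile version string" in line):
            print(line.strip())

    for line in run(["vainfo"]).splitlines():          # VA-API decode/encode profiles
        if "VAProfile" in line or "Driver version" in line:
            print(line.strip())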
That's actually something I have not tried at all again yet.
Back in the day, w/ an AMD CPU and Nvidia GPU, I was gaming on Linux a lot. ATI was basically unusable on Linux, while Nvidia (not with the nouveau driver, of course), if you looked past the whole kernel driver controversy with the GPL hardliners, was excellent in quality and performance. It just worked and it performed.
I was playing World of Warcraft back in the mid 2000s via Wine on Linux and the experience was actually better than in Windows. And other titles like say Counter Strike 1.5, 1.6 and Q3 of course.
I have not tried that in a long time. I did hear exactly what you're saying here. Then again I heard the same about AMD buying ATI and things being OK now. My other reply(ies) elaborate on what exactly the experience has been if you're interested.
I wish I had an AMD card. Instead our work laptops are X1 Extremes with discrete Nvidia cards, and they are absolutely infuriating. The external outputs are all routed through the Nvidia card, so one frequently ends up with the fan blowing at full blast when plugged into a monitor. Moreover, when unplugging, the laptop often fails to shut down the discrete graphics card, so suddenly the battery is empty (because the discrete card uses twice the power). The Intel card, on the other hand, seems to prevent S3 sleep when on battery, i.e. the laptop starts sleeping and immediately wakes up again (I chased it down to the Intel driver but couldn't get further).
And I'm not even talking about the hassle of the nvidia drivers on Linux (which admittedly has become quite a bit better).
All that just for some negligible graphics power that I'm never using on the laptop.
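If it's any use for the debugging: here's a minimal sketch (Linux, reads standard sysfs attributes; the exact PCI addresses will differ per machine) to check whether the discrete card actually runtime-suspends after you unplug the external monitor:

    # Sketch: list every PCI display controller and its runtime PM state.
    # "suspended" means the card has powered down; "active" means it has not.
    from pathlib import Path

    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        pci_class = (dev / "class").read_text().strip()
        if not pci_class.startswith("0x03"):        # 0x03xxxx = display controller
            continue
        vendor = (dev / "vendor").read_text().strip()   # 0x10de = Nvidia, 0x8086 = Intel
        status_file = dev / "power" / "runtime_status"
        status = status_file.read_text().strip() if status_file.exists() else "unknown"
        print(f"{dev.name}  vendor={vendor}  runtime_status={status}")

If the Nvidia entry stays "active" after unplugging, that would line up with the battery drain you're seeing.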