Which buyers exactly? Enthusiasts seem to be switching overwhelmingly to AMD: e.g. Mindfactory now has AMD at an 81% share. [0] Anecdotally, there seems to be a lot of resentment towards Intel for their shenanigans; I don't personally know anyone who's even considering Intel for their own builds.
It's a different matter for laptops where consumers have much less say on the topic, of course.
Now that a large segment has swapped to AMD for performance, they're also going to see the light about not getting screwed by the CPU socket changing on Intel's platforms.
Google is planning to use semi-custom AMD designs for its new gaming platform. AWS is already ordering and using tons of AMD server chips, and Google doesn't seem too far behind.
Google is also looking at AMD for its servers, as AMD is beating Intel so badly on every cost metric with the current generation that using Intel is wasting money.
Google has been "looking at" AMD for decades, including rolling out an enormous fleet of AMD machines spanning several generations. They have a big platforms group and plenty of engineers to dedicate to trying to make the TCO pencil for any platform, even POWER and ARM. The existence of a new AMD part that could actually work is not a new development in the relationship between Google and Intel.
Even if AMD is used just as a leveraging stick against Intel, that's a win for customers.
There's the old story of talking to an Oracle salesman while drinking from an IBM coffee mug - to signal that there are other viable options and get a better deal.
The real problem is that AMD doesn't invest enough in the software.
Nvidia's Linux support absolutely puts AMD to shame.
The biggest lesson in hardware is that "it's the software support, stupid". Too many hardware manufacturers think they are selling hardware alone. They are selling hardware AND the software that makes it work.
A computer is useless without software support. Witness the vast array of essentially useless ARM devices out there because none of them have usable software drivers/Linux support.
Nvidia's Linux support? AMD actually has open source drivers these days, just like Intel; Nvidia just gives you a binary module, which means that in order to make use of the hardware you have to taint your kernel and aren't free to update it if the in-kernel ABI changes. It's basically on par with the vendors of all those "useless ARM devices out there", none of them with "usable Linux support".
AMD is finally approaching parity in the past year or two, but Nvidia is still ahead in terms of day-to-day usability. It sucks that they're closed, but if you just want to install a driver and have your system work, Nvidia is ahead.
Look at this mess: https://wiki.archlinux.org/index.php/Xorg#AMD Do you want AMDGPU, AMDGPU PRO, ATI, or Catalyst? How do you choose? Let's assume you want open source and recent, so probably AMDGPU. OK that's relatively straightforward, but now let's say you want Vulkan support. That's over here: https://wiki.archlinux.org/index.php/Vulkan#Installation and golly wouldn't you know it there are three choices, vulkan-radeon, amdvlk, and vulkan-amdgpu-pro. How do those relate to the radeon and amdgpu drivers mentioned earlier? Arrrrrrrrrrgh.
Meanwhile on nvidia, you install "nvidia" and you're done.
I've been working in gaming on Linux professionally for ten years. If I were buying a new GPU today, I would pick Nvidia.
I think you might be blinded by prior experiences (yes, that sounds illogical). The choice you have with AMD now is not bad for users: they get a good, working, free driver with their distro out of the box. And people like you, who work in gaming under Linux and might use a more custom distro, can figure out which one is the best driver in your specific situation (it's AMDGPU always anyway).
I'd never buy a Nvidia card for Linux now. I wouldn't have done so three years ago, but now? That'd be a huge step backwards.
It's possible. The last time I bought an AMD GPU for my personal use was in 2009 or 2010. But I really do work with this stuff every day. In our QA lab, the AMD machines have far more problems than the Nvidia machines do. Valve's VR software works much less consistently; installing modern versions of the drivers on Ubuntu is a huge pain (you have to use 3rd party repos maintained by some random guy); and missing features and driver bugs are much more common on AMD (though not unheard of on Nvidia either).
AMD is definitely getting better, and being open source is an enormous point in its favor, but it's still just less usable day-to-day for an end user. I hope this changes.
Unfortunately very few Linux people even see a point in actually using their GPU. As long as it can composite terminal emulators, they think everything is great. I've seen such reasoning in discussions about sway, where people are encouraged to throw out their $500+ GPU in favor of the on-die Intel GPU...
...Vulkan? Proton? In terms of 3D acceleration libraries and support, it's never been better on Linux. I actually just switched to Manjaro full-time because everything I wanted to play now works on Linux, too.
Mesa drivers are fine.
The one with horrid performance is their proprietary AMDGPU-PRO, but there's no reason to use it (especially since Mesa now supports the GL compatibility profile).
Mesa drivers are far from fine; they only got rid of the braindead "core only" schism last year (IIRC), and that was thanks to pressure from AMD to get games working. Even then, a lot of Mesa's code paths are downright awful.
Nvidia has had OpenGL as a major focus for decades (it was even the first API to get new hardware features exposed via extensions - though nowadays they also seem to do the same with Vulkan), and the quality of their OpenGL implementation shows it.
Yes, Mesa is not perfect. (A particularly annoying thing is that it's possible to lock up the entire GPU with invalid memory accesses in a shader, infinite loops don't always recover cleanly and can lock up the GPU too, and these bugs are reachable through WebGL, etc.)
But I prefer an occasional crash in badly behaved software to the abysmal performance of their proprietary implementation (which crashes too, usually on software using a legacy pre-GL3 context).
And as to the compatibility context, IMO OpenGL shouldn't have defined it at all. It is rather weird to have rendering code that uses both the legacy fixed pipeline and modern shaders. In my experience it just causes problems everywhere, and I have encountered unexpectedly bad performance and glitches on Nvidia drivers too (though maybe less often than on Intel/AMD).
My experience with NV on Linux is out of date, from 2003-ish (?) when the binary driver was new, to 2010, but I remember a few drawbacks I couldn't overcome and happily made the switch to ATI/AMD:
1) Closed driver
2) Weird, incompatible multi-monitor support called TwinView(?), where windows would maximize to stretch across all your screens
3) Abusive of the PCI subsystem, so that my sound hardware would experience underruns, both spurious and reproducible (such as during workspace switches)
4) Lockups
5) Corruption/lockups switching between FB terminals and the X terminal
I switched to ATI and the open source driver and all these problems went away.
> which means in order to make use of that hardware you have to taint your kernel and aren't free to update it if the in-kernel ABI changes.
I have difficulty placing all the blame on Nvidia here, since it is the kernel developers' long-held disdain for stable driver ABIs that Nvidia feels forces their hand. There are those who argue otherwise, but that argument is basically "all software should be open source". Even if we go with the more reasonable "all drivers should be open source", that still introduces problems because of bonkers patent law nonsense.
AMD doesn’t support 8K monitors under Linux. I agree, NVIDIA has far better usability for most users. The fact that it’s a binary blob means nothing to anyone who isn’t an OSS activist.
> aren't free to update it if the in-kernel ABI changes.
How often is that actually an issue? Last time I ran into that was when I was still running Debian Sid and NVIDIA was far from the only thing that ended up breaking.
I'm sorry but NVidia Linux support is really bad. I administer a few dozen machines and am really looking forward to tossing the NVidia cards into the dumpster where they belong.
NVidia Linux support has always been bad; we just didn't realize it because AMD's used to be worse. Luckily this has changed over the past few years, and now we can spec in AMD or perhaps Intel's rumored Xe graphics card.
> I'm sorry but NVidia Linux support is really bad. I administer a few dozen machines and am really looking forward to tossing the NVidia cards into the dumpster where they belong.
I assume you're joking, but if not please don't junk them! I'm sure lots of people on HN would gladly pay for (at least) shipping.
Yes, it was hyperbole. The "dumpster" is an area in the office where employees can grab surplus hardware and supplies -- it'll be picked clean pretty quick if video cards show up there...
>Nvidia's Linux support absolutely puts AMD to shame.
Huh? AMD "just works". NVIDIA requires a dance of (Fedora) DKMS kernel modules and binary blobs. What am I missing? I buy AMD just because it's been frictionless to get working, I don't mind if theres a small performance penalty I pay for it.
This is repeated frequently, but I was never able to get my AMD 560 to work on Linux at 4k with MST (or turn down the fan speed). With Nvidia I installed the binary blob and 4k over MST “just worked”. I tried the amdgpu and Pro versions of the drivers.
Neither company is perfect with Linux support. As with most things, there are pros and cons to both sides. But for anyone else reading here, just because you pick AMD does not mean everything is magically trouble free on Linux.
Does MST even work for 4k@60 over DP? 4k@60 requires so much bandwidth that it uses both MST channels (it is internally handled as two monitors); if you want to use both channels separately, you are stuck with 4k@30 at most.
Yes, MST absolutely can do 4k at 60 Hz. This is what the first Dell 32" 4k monitor used.
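Back-of-the-envelope, treating the exact numbers as approximate:

    3840 x 2160 px x 60 Hz x 24 bit  ~ 11.9 Gbit/s of pixel data
    + blanking (CVT-R2 timing)       ~ 12.5 Gbit/s on the wire
    DP 1.2 (HBR2, 4 lanes)           = 21.6 Gbit/s raw, ~17.3 Gbit/s after 8b/10b encoding

So the DP 1.2 link itself has enough bandwidth for a single 4k@60 stream; those first-generation monitors used MST to split the image into two 1920x2160@60 tiles (roughly 6.3 Gbit/s each) because their scalers couldn't accept one 4k@60 stream, not because the cable couldn't carry it.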
My rMBP has been able to do this since February, 2014. I was trying to get my RX 560 in mid 2018 to do this and it just was not possible via any means with Ubuntu 18.04. I ended up using an Nvidia 1040 with the binary drivers without issue. The nouveau driver for the 1040 was able to do it with a newer kernel than Ubuntu 18.04 had (I was using Arch for that test).
To get two 4k@60 outputs, I have to run two cables (it has two TB2 ports). With daisy-chaining, I got mirroring at best (I didn't check refresh rate at the time, though).
I've never tried MST on AMD so I cannot comment. But of course it goes without saying that picking NVIDIA won't be magical and trouble-free for everyone either.
AMD's Linux support is terrific now. Nvidia is much more of a hassle, and is also impeding progress by insisting on their own window-system-to-driver interface for Wayland.
I'm pretty sure neither manufacturer is really that worried about their Linux support. And, as far as my experience has gone with them both on Linux, they have both had their good years and their bad.
This is exactly true, especially on the CUDA and deep learning front. Basically, the only option is NVIDIA, as the competition has an almost nonexistent software framework (maybe OpenCL).
CUDA is a vendor lock-in scheme. Use OpenCL or Vulkan instead (yes, Vulkan includes support for compute, not just graphics!). AMD supports both, in addition to tools like HIP to help you port legacy CUDA code.
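To make the HIP point concrete, here's a rough sketch (my own toy example, not anything AMD ships): a plain CUDA vector add. The point is that hipify-perl can mechanically rewrite the cuda* runtime calls to their hip* equivalents and hipcc then builds the same kernel for an AMD GPU; it's the CUDA-only libraries (cuBLAS, cuDNN, ...) that don't port this way.

    // add.cu - toy CUDA vector add; hipify-perl rewrites the cuda* calls
    // (cudaMallocManaged -> hipMallocManaged, cudaFree -> hipFree, ...)
    // and HIP keeps the <<<...>>> launch syntax as-is.
    #include <cstdio>

    __global__ void add(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one element per thread
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        float *a, *b, *c;
        cudaMallocManaged(&a, n * sizeof(float));        // unified memory, visible to CPU and GPU
        cudaMallocManaged(&b, n * sizeof(float));
        cudaMallocManaged(&c, n * sizeof(float));
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        add<<<(n + 255) / 256, 256>>>(a, b, c, n);       // 256 threads per block
        cudaDeviceSynchronize();                         // wait for the kernel to finish

        printf("c[0] = %f\n", c[0]);                     // expect 3.000000
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }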
CUDA is the result of being the first mover in an emerging field over a decade ago. It's a lock-in scheme the way Excel is a lock-in scheme for Microsoft: a product that provides value, which is used to generate revenue.
Using Vulkan(?) or OpenCL for deep learning is fine if your goal is helping get the word out that it's at least possible to do so.
It's not a great idea if you simply want to get your work done with a minimum of issues and up-to-date support for all the prominent packages.
Both are valid objectives.
HIP won’t help you one bit if you use one of the many CUDA libraries.
"The prominent packages" for deep learning are all open source. If they don't support OpenCL or Vulkan yet, you can work on it and contribute the feature upstream. It's a matter of getting the ball rolling to start with, nothing more than that.
> If they don't support OpenCL or Vulkan yet, you can work on it and contribute the feature upstream. It's a matter of getting the ball rolling to start with, nothing more than that.
And that’s exactly what I mean with the choice that you have to make.
You could do that, and win brownie points with some crowd, but let’s not pretend that doing so wouldn’t have an impact on your work as an AI researcher.
"Nothing more than that" is more than a little bit disingenuous when you take into account the actual work that's involved, don't you think?
It's something a company like AMD could (and does?) sponsor by throwing paid employees at it. Or big companies like Alibaba, who buy enough HW to trade off HW cost against R&D spending.
But if your goal is to be a deep learning specialist (it’s where the money is today), spending time on these kind of infrastructure issues is a complete waste.