
GeForce GPUs aren't "allowed" to be used in servers so you "have to" buy the 10x more expensive ones.


Yeah, true, some configs are certified for servers and GeForce cards aren’t; Xeon is in the same boat. So it does depend on your goals and requirements. Using the V100 is still cherry-picking in a price/perf argument, though, since you don’t have to spend 10x; there are cheaper options that are server qualified, right?


Yeah, but I assume all the server GPUs are 10x worse than consumer.


Huh. Okay. Why? What do you mean by ‘worse’?


If a $10K server GPU is equivalent to a $1K consumer GPU, I assume the $2K server GPU is equivalent to a $200 consumer one. If the price/performance sucks, picking a different model won't help.


I see. Well, picking a different model actually does help, a lot. The main thing to check, when asking whether your assumptions are valid, is whether the $10k GPU and the $1k GPU are equivalent (they’re not), and what you’re paying for, because it’s not primarily flops. Take the two models of GV100, for example: they have exactly the same perf, yet one is half the price of the one @majke picked as the example. In that case, picking a different model helps price by 2x; the difference is memory size. Other non-perf differences that affect price include thermal properties, support level, and GPU generation. These come down to your goals and requirements.

Maybe @majke didn’t check recently, but you can buy GPUs newer than the GV100 that have even more memory, higher perf, are server certified, and cost about half as much, so even the half-price, smaller GV100 would be cherry-picking in my book. And if we’re talking about consumer hobbyist needs rather than server-farm needs, that’s when you can get a lot of compute for your money.
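To make the "you're not paying primarily for flops" point concrete, here is a rough back-of-the-envelope comparison. The prices, TFLOPS, and memory figures below are illustrative placeholders, not quotes or real specs; the shape of the result is what matters:

    # Illustrative, made-up figures -- swap in real quotes before drawing conclusions.
    gpus = [
        # name,                 price_usd, fp32_tflops, mem_gb, server_certified
        ("consumer card",           1_500,        30.0,     12, False),
        ("GV100-class, 16 GB",      4_500,        14.0,     16, True),
        ("GV100-class, 32 GB",      9_000,        14.0,     32, True),   # same perf, ~2x price
        ("newer server card",       4_500,        30.0,     48, True),
    ]

    for name, price, tflops, mem, certified in gpus:
        print(f"{name:22s} {price / tflops:7.0f} $/TFLOP  {mem:3d} GB  certified={certified}")

    # Two same-perf GV100-class variants differ ~2x in price purely on memory size,
    # so $/TFLOP alone doesn't capture what the server SKUs are selling.

The exact ratios will vary by generation and vendor; the point is only that memory, certification, and support move the price independently of flops.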


Thanks @wmf @dahart for the discussion.

You are both right:

- I can't just buy a 3080 and stuff it into my servers, due to legal restrictions.

- I can't just buy a 3080 and stuff it into my servers, due to availability.

- Often (as in the example I gave) the price-to-performance of a GPU is not worth the cost of porting software.

- Often (as in the example I gave) the price-to-performance of a GPU is not super competitive with a CPU.

- Sometimes you can make the math work: by picking a problem GPUs excel at (memory bandwidth, single precision, etc.), by picking a consumer-grade GPU, or by having access to cheap/used datacenter-grade GPUs.

- In the example I gave, even with a cheap 3080 and, say, a 20-30x better perf/dollar ratio for the GPU... is it still worth it? It's not like my servers spend 100% of their CPU time calculating euclidean distance. The workload is diverse: nginx, DNS, database, javascript. There is a bit of heavy computation, but it's not 100% of the workload. For GPGPU to pay for itself it would need to take over a large portion of the load, which in the general case is not possible (see the rough sketch below). So I would take a GPU into consideration if it were 200x-1000x better per dollar than a CPU; then I could make a strong financial argument.
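A rough sketch of that break-even reasoning, Amdahl-style. The prices, speedup, and offloadable fraction below are assumptions chosen to illustrate the shape of the argument, not measurements from my servers:

    # Back-of-the-envelope: is a GPU worth it when only part of the workload can use it?
    # All numbers below are illustrative assumptions.
    cpu_server_cost = 5_000    # USD, baseline server
    gpu_card_cost   = 1_500    # USD, e.g. a consumer-class card
    gpu_speedup     = 25       # assumed speedup of the offloadable slice on the GPU
    offloadable     = 0.05     # assumed fraction of total CPU time the GPU could absorb

    # Amdahl-style: overall speedup if the offloadable fraction runs gpu_speedup times faster.
    overall_speedup = 1 / ((1 - offloadable) + offloadable / gpu_speedup)

    # Equivalent capacity bought per dollar, with and without the card.
    baseline = 1 / cpu_server_cost
    with_gpu = overall_speedup / (cpu_server_cost + gpu_card_cost)

    print(f"overall speedup: {overall_speedup:.3f}x")
    print(f"capacity per dollar, CPU only : {baseline:.2e}")
    print(f"capacity per dollar, CPU + GPU: {with_gpu:.2e}")
    # With only ~5% of the load offloadable, adding the card makes the box *less*
    # cost-effective despite a 25x win on its slice -- which is the point above.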

The point I was trying to make is that GPUs are a good fit for only a small fraction of compute workloads. For them to make sense:

- more problems would need to fit on them

- or the performance/dollar would need to improve further by orders of magnitude



