
"Having a few hundred consumer GPUs or a few dozen "datacenter" GPUs should be within the reach of any University department"

That was funny, but not even close to reality. I have to work on a GTX 1080 (not the Ti)...




A month ago Nvidia had a grant program running to get rid of refurbished^w^w^w^w donate 1-4 Titan Vs based on a 1-2 page application [1]. When my university started offering a CUDA course, we got ~15 top-of-the-line GTX cards sponsored by Nvidia. Buying 100 GTX 1080 Ti cards (11 GB each) plus supporting hardware is in the range of 100 thousand Euro/USD, before applying education discounts and asking for sponsorships. Not money you spend on a whim, but not outrageously expensive either (the article mentions OpenAI spending millions on cloud GPU resources; compared to that, spending 100k on something you get to keep is nothing).

[1] https://developer.nvidia.com/academic_gpu_seeding
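
Back-of-the-envelope for that 100k figure (a rough sketch only; the per-card price and per-host cost below are my own assumptions, not numbers from the comment above):

    # Hypothetical hardware budget for ~100 GTX 1080 Ti cards.
    # Per-card price and per-host cost are assumed for illustration only.
    num_gpus = 100
    price_per_gpu = 700        # EUR/USD, rough street price (assumed)
    gpus_per_host = 4
    price_per_host = 1200      # CPU, RAM, PSU, chassis per 4-GPU box (assumed)

    hosts = num_gpus // gpus_per_host
    total = num_gpus * price_per_gpu + hosts * price_per_host
    print(f"{hosts} host machines, total = {total:,} EUR/USD")
    # -> 25 host machines, total = 100,000 EUR/USD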


Why can't they afford it? I remember when I worked at a physics lab at university, each team had many pieces of equipment, each costing more than $100k. You'd get quite a lot of compute for that kind of money, especially since AI research doesn't need any other equipment.


I get that you can’t do all your work on them, but 24 hours on 16 GPUs is $215, using the newest instance type on-demand.

It’s within the reach of many grants to afford a few scaled runs of a technology as a demonstration of behavior at scale.
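
For what it's worth, that works out to roughly $0.56 per GPU-hour; a quick sanity check (the $215, 16-GPU, 24-hour figures are taken from the comment above, nothing else is assumed):

    # Check the parent comment's figure: 16 GPUs for 24 hours at $215 total.
    total_cost = 215.0          # USD, from the comment above
    gpus, hours = 16, 24
    per_gpu_hour = total_cost / (gpus * hours)
    print(f"${per_gpu_hour:.2f} per GPU-hour")   # -> $0.56 per GPU-hour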



