> The other obstacle is that any optimization is considered evil anyway

What? By whom? With all due respect to whoever told you that: no, no, and definitely no. Are you referring to Knuth’s famous quote, or something else? Knuth was advocating for optimization, not against it. He did tangentially and jokingly warn against premature optimization when writing software, but either way, neither using a GPU nor making architectural decisions counts as premature optimization unless you do it without thinking about your problem at all (and in that case, using a CPU is premature optimization too). Using a GPU is no different from using a faster CPU as a hardware solution to a compute problem, and if you’re doing SIMD-compatible work, the GPU will be far cheaper per flop. Plus, using the GPU users already have is loads cheaper for them than buying a faster CPU.
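
To make “SIMD-compatible work” concrete, here’s a minimal CUDA SAXPY sketch (the kernel, sizes, and constants are my own illustration, nothing from this thread). Every element is independent, so the same arithmetic runs across thousands of threads at once; a serial CPU loop does the identical work one element at a time.

  #include <cstdio>
  #include <cuda_runtime.h>

  // y[i] = a * x[i] + y[i], one thread per element: the classic
  // embarrassingly parallel shape a GPU is built for.
  __global__ void saxpy(int n, float a, const float *x, float *y) {
      int i = blockIdx.x * blockDim.x + threadIdx.x;
      if (i < n) y[i] = a * x[i] + y[i];
  }

  int main() {
      const int n = 1 << 20;
      float *x, *y;
      // Unified memory keeps the sketch short; production code
      // might manage host/device transfers explicitly.
      cudaMallocManaged(&x, n * sizeof(float));
      cudaMallocManaged(&y, n * sizeof(float));
      for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

      saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
      cudaDeviceSynchronize();

      printf("y[0] = %f\n", y[0]);  // expect 5.0
      cudaFree(x);
      cudaFree(y);
      return 0;
  }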

> because they would be more or less infeasible, not just slower, with just CPU.

That’s a strange way to frame it. It’s infeasible because it’s slower, so much slower people can’t tolerate it. This is precisely why people should and do look to GPGPU for math and compute intensive applications.




I'm not expecting anyone to know what Knuth actually wrote, but the comment about premature optimization is quoted a lot out of context, usually with "premature" left out. Whether it was a joke or not, it's now taken as gospel.

I find it really odd that you have never run into this attitude. I've experienced it in almost every job I've ever had. I'm fairly sure Scrum also considers thinking about your problem at all to be wrong, because it's not a user story.

> Plus using the GPU they already have is loads cheaper for the users than buying a faster CPU.

Actual quote from a former CTO: "The client likes that our software has such high hardware requirements, because it makes them think it does something really difficult."

> It’s infeasible because it’s slower, so much slower people can’t tolerate it.

Yes, this is exactly what I meant.


> I find it really odd that you have never run into this attitude.

I didn’t say that, don’t make assumptions. I’ve run into people misquoting Knuth a lot, and above it seemed like you were one of them. If you know it’s a misquote, please refrain from spreading misinformation. That a lot of people misuse or misunderstand the quote is in no way a compelling reason to believe or spread the wrong version.

I haven’t heard a lot of people saying any and all optimization is bad, even after profiling. That’s just completely and utterly silly, and now I know you agree. Our job as programmers is partly to help other programmers, managers, and customers see their choices more clearly, not to amplify their misconceptions, right?

Your CTO quote about how to trick customers isn’t relevant to what we were discussing, and on top of that it effectively supports the idea of using GPUs, albeit for the wrong reasons.


> I didn’t say that, don’t make assumptions.

In that case I don't really understand why you had to ask who says that.

> I haven’t heard a lot of people saying any and all optimization is bad, even after profiling. That’s just completely and utterly silly and now I know you agree and know it too.

After profiling is too soon. The right time is when you're in deep trouble and can't explain your way out any more. Again, not really my opinion, but I've encountered it a few too many times.


> The right time is when you’re in deep trouble

I mean, I don’t agree with that, and neither does Knuth, and it sounds like neither do you, but hey, it’s not up to me to tell anyone outside my own team how to write code or run a business. There will always be dev jobs that just don’t care about performance until they’re in trouble and wasting money and time. If what you’re saying is that some dev shops won’t consider GPU programming because they don’t care about performance at all, and that the prevailing attitudes there make it hard to propose CUDA as a solution, then yeah, I agree that’s an obstacle to learning CUDA. But that’s not really CUDA’s fault or a general obstacle to using GPUs; it’s just the wrong place and time. Better to find a job at a company that cares about performance, right? There are lots of them.

FWIW, your phrasing makes it sound like you do hold this opinion, which is why I asked who believes it. Both here and above, you state it first as though it’s a general fact, and only later qualify it as someone else’s belief, slightly distancing yourself. I still can’t tell where you really land, but hopefully we’re more violently agreeing than disagreeing. All I’m saying is that it would do both your peers and HN a service to challenge misinterpretations of what Knuth was trying to get across: that performance matters (and that efficient use of your time matters too).


> There are lots of them.

I need names. Especially ones that don't require a long history of GPU programming or a PhD in a related field just to get an interview. Bonus points if they're not a startup that's about to fail and desperate to hire anyone willing to work on cool stuff for free while it lasts. Even better if their business model isn't cryptocurrency, HFT, or just hoping to get acquired.

Yes, I'm more than a bit disillusioned with the field. We could do much better if some people hadn't made a lot of money on the "move fast and break things" and "nobody got fired for buying X" attitudes. I was trying to present those attitudes as commonly accepted but not actually true. I think I failed. Sarcasm never works on the Internet.


I see, I hear you. Well, all the FAANG companies have high-performance dev groups that dabble in GPU work. Game companies all do GPU, as does any team doing graphics. Neural-network jobs at AI companies exist but are more likely to require the history and PhD, and tend to land in the startup category. The first place I did any GPU work for a job was WebGL at LucidChart (a web-app competitor to Visio). AMD, Apple, Sony, Nvidia, and anyone else making GPUs or SIMD units have tons of jobs, and you can likely start with something close to what you currently do and transition toward more high-performance programming. I dunno if that’s at all helpful, but sorry about the disillusionment; I know that can be difficult to deal with.



