
Jeremy from fast.ai here. I think it's tough to fully substantiate this claim in a mainstream tech publication without getting into details that are likely to be rather dull for much of the audience. However, I do think the claim is a reasonable one, and there are more details substantiating it in this article I wrote: http://www.fast.ai/2018/04/30/dawnbench-fastai/

Training small-ish datasets quickly is definitely not a toy problem. Most folks I know using neural nets in practice are using datasets of ~100MB. Some people do have giant datasets, but it's not the norm. And being able to train them quickly and cheaply is great for running experiments, as well as making the field more accessible to people with fewer resources.
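(To make "quick, cheap experiments on a small-ish dataset" concrete, here's a minimal sketch using the current fastai vision API, which post-dates this thread; Imagenette, a small ImageNet subset, and a pretrained ResNet-18 are illustrative choices here, not the DAWNBench setup.)

    from fastai.vision.all import *

    # Imagenette: a 10-class subset of ImageNet, small enough to iterate on quickly
    path = untar_data(URLs.IMAGENETTE_160)
    dls = ImageDataLoaders.from_folder(path, valid='val', item_tfms=Resize(160), bs=64)

    # Fine-tune a pretrained ResNet-18; fine_tune uses the one-cycle schedule internally
    learn = vision_learner(dls, resnet18, metrics=accuracy)
    learn.fine_tune(5)

A run like this finishes in minutes on a single commodity GPU, which is the kind of turnaround that makes experimentation practical for people with fewer resources.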

If you're only interested in training larger datasets, this competition showed that the fastest Imagenet training on a single machine, and the fastest on publicly available infrastructure, was also done by fast.ai. The only better results were on TPU Pods, which are clusters of TPU machines that are not publicly available. (It wouldn't be terribly hard to show similar results with a cluster of GPU machines, although it isn't something we have spent time on ourselves yet.)

I strongly disagree with your claim that small groups can only compete with tech giants on stuff that tech giants aren't working on. There is a large amount of empirical evidence, both recent and historical, that this isn't correct, and you haven't shown anything to substantiate the claim. (I heard similar claims when Google first appeared: "a small group of Stanford researchers can't beat Yahoo.")



Well said, Jeremy!



