Hacker News

Distributed training is indeed a thing! A few random arXiv pulls:

https://arxiv.org/abs/2007.03970 https://arxiv.org/abs/1802.09941
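The core of most of these schemes is data-parallel gradient averaging: each node computes a gradient on its own shard of data, and a coordinator averages the results into one update. A minimal sketch (toy squared-error loss, illustrative names, not the method of either linked paper):

```python
# Toy sketch of synchronous data-parallel SGD: volunteer nodes hold
# local data shards, compute gradients independently, and a coordinator
# averages them. All names here are illustrative.

def local_gradient(w, data):
    # Gradient of mean squared error for the model y = w * x,
    # computed only on this node's local (x, y) pairs.
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

def train_step(w, shards, lr=0.01):
    # One synchronous round: every node reports its gradient,
    # the coordinator averages and applies a single update.
    grads = [local_gradient(w, shard) for shard in shards]
    avg = sum(grads) / len(grads)
    return w - lr * avg

# Toy data drawn from y = 3 * x, split across three "home" machines.
shards = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0)],
    [(4.0, 12.0), (5.0, 15.0)],
]
w = 0.0
for _ in range(200):
    w = train_step(w, shards)
# w converges toward 3.0
```

Real systems replace the single-float model with millions of parameters and the coordinator with an all-reduce or parameter server, but the communication cost of that averaging step is exactly why efficiency over home internet links is in question.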

Efficiency and resource cost are the big question though. You don't pay for the electricity or part wear that you don't use, and home computers or workstations may not be as efficient at a training run as a task-specific setup. AI@home might end up costing more, and increasing the model's footprint more, than doing it all in one place.

Part of the magic really needed here is finding simpler ways to achieve the same level of model robustness.



> electricity or part wear

Sure, part wear is relevant, but I feel that most parts worldwide get chucked way before they are worn out. Electrical efficiency is probably quite a bit worse, though. Still, you could possibly find an opportunity in shifting load to regions that have surplus energy and/or renewable sources.



