Just reasoning about this from first principles, but intuitively, the more dimensions you have, the more likely you are to find a descent direction along at least one of them. In an N-dimensional space, a local minimum has to be a minimum along all N axes simultaneously, right? Otherwise the algorithm will keep moving down whatever slope remains. (Not an expert on this stuff.) If each dimension independently has, say, a coin-flip chance of curving upward at a critical point, the odds that all N do is roughly 2^-N, so the more dimensions there are, the more likely it seems that a downhill path exists from any given point toward some deeper minimum.
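A toy experiment (my own sketch, not anything rigorous) makes this concrete: treat the Hessian at a random critical point as a random symmetric matrix, and check how often every eigenvalue is positive, i.e. how often the point would be a true local minimum rather than a saddle. The function name and sampling choice here are mine, just for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    def fraction_of_minima(n, trials=2000):
        # Sample random symmetric matrices as stand-ins for Hessians
        # at critical points, and count how often all eigenvalues are
        # positive (the condition for a true local minimum).
        count = 0
        for _ in range(trials):
            a = rng.standard_normal((n, n))
            h = (a + a.T) / 2  # symmetrize to get a valid Hessian candidate
            if np.all(np.linalg.eigvalsh(h) > 0):
                count += 1
        return count / trials

    for n in (1, 2, 3, 5, 8):
        print(f"dim={n}: fraction with all-positive curvature ~ {fraction_of_minima(n):.4f}")

Under this toy model the fraction of all-positive-curvature points collapses quickly as the dimension grows, which matches the intuition above: in high dimensions almost every critical point is a saddle with at least one direction leading further down.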

