
This is a very interesting article.

A minor quibble - it is very difficult to find a regime of (w, h) in which the running time of the proposed algorithm differs from simply choosing the better of the obvious algorithms.

Line at a time takes O(max(w,h)). Binary searching each row and column separately takes O(min(w,h) log(max(w,h))). Therefore, implementing both basic algorithms and using the faster one at run-time will run in time O(min(max(w,h), min(w,h) log(max(w,h)))).

That is very close to the proposed run time; so close, in fact, that it is very hard to find a sequence of (w,h) for which the proposed algorithm works asymptotically better than simply doing "the obvious thing".
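For what it's worth, here's a quick numeric check (a Python sketch of my own; the "+ 1" inside the proposed bound is my convention to keep the expression nonzero when w = h) showing how closely the two expressions track each other across shapes:

```python
import math

def switching(w, h):
    # Better of the two obvious algorithms:
    # line-at-a-time O(max) vs. per-line binary search O(min * log(max)).
    return min(max(w, h), min(w, h) * math.log2(max(w, h)))

def proposed(w, h):
    # The proposed bound, min(w,h) * log(max/min); the "+ 1" is my own
    # convention so the expression stays nonzero when w == h.
    return min(w, h) * (1 + math.log2(max(w, h) / min(w, h)))

for w, h in [(2**20, 2**20), (2**10, 2**20), (1, 2**20)]:
    print(w, h, proposed(w, h) / switching(w, h))
```

The ratio stays within a small constant factor for squares, thin rectangles, and single rows alike, which is exactly why the two are hard to separate.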

But it is possible! And the lower bounds are nice!


I (the author) actually did lose a lot of time trying to prove those weren't the same bound! Eventually I found a variable assignment that made it obvious:

    let w = n/log(n)
    let h = n

    (Note: I use ↑ and ↓ as binary operators for max and min.)
    (Note: @= means asymptotically equal; @< means asymptotically less than.)
     
    'switching'
    @= (w ↑ h) ↓ ((w ↓ h) log(w ↑ h))
    @= (n/log(n) ↑ n) ↓ ((n/log(n) ↓ n) log(n/log(n) ↑ n))
    @= n ↓ (n/log(n) log(n))
    @= n

    'lower'    
    @= (w ↓ h) log((w ↑ h)/(w ↓ h))
    @= (n/log(n)) log(n/(n/log(n)))
    @= n log(log(n)) / log(n)
    @< n
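The same assignment can be checked numerically (a small Python sketch, function names mine): with w = n/log(n) and h = n, the ratio 'lower'/'switching' behaves like log(log(n))/log(n) and keeps shrinking as n grows:

```python
import math

def switching(w, h):
    # (w ↑ h) ↓ ((w ↓ h) log(w ↑ h))
    return min(max(w, h), min(w, h) * math.log2(max(w, h)))

def lower(w, h):
    # (w ↓ h) log((w ↑ h)/(w ↓ h))
    return min(w, h) * math.log2(max(w, h) / min(w, h))

for k in (10, 20, 40, 80):
    n = 2.0 ** k             # so log2(n) == k exactly
    w, h = n / k, n          # the assignment w = n/log(n), h = n
    print(k, lower(w, h) / switching(w, h))   # ~ log2(k)/k, shrinking
```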


Very nice! I'm glad that you worried about this and I have to admit that your asymptotic example is nicer than mine.

