Hacker News

Doesn't this suggest that computing 2^i is just as hard?

You can compute g^(2^i) by just multiplying g's, in the same way that you can compute 2^i by adding 1s, and multiplication is as associative as addition is. Specifying that you want g^(2^i) rather than g^k for some arbitrary k looks like a way of ensuring that a partial product computed in parallel (say, g^3) is of nearly no value, because every partial product that helps compute the total is already produced by the single agent doing repeated squarings.
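A minimal sketch of that single-agent computation, with illustrative values of `N`, `g`, and `i` (not from the thread); in the time-lock setting `N` would be an RSA modulus:

```python
# Computing g^(2^i) mod N by i sequential squarings.
N = 2**61 - 1   # illustrative modulus
g = 3           # illustrative base
i = 10          # illustrative exponent of the exponent

x = g % N
for _ in range(i):   # each squaring depends on the previous result,
    x = (x * x) % N  # so the i steps run strictly one after another

assert x == pow(g, 2**i, N)
```

Every intermediate value here is g^(2^k) for some k, which is exactly the point: a helper computing g^3 on the side produces nothing this chain needs.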

So why compute g^(2^i) rather than just 2^i? Is 2^i in fact hard to compute? (In binary representation, it certainly isn't, but maybe 3^i is?)



Using the doubling trick you can compute `2^i` in `log i` multiplications, so you can compute `2^(2^i)` in `log 2^i = i` multiplications. Let `i` be big enough, and you've got a "difficult" problem.
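The doubling trick in miniature, with an illustrative small `i`: starting from 2 and squaring `i` times yields 2^(2^i), versus the 2^i - 1 doublings the naive route would take.

```python
# Computing 2^(2^i) with i squarings.
i = 6  # illustrative

x = 2
for _ in range(i):
    x = x * x  # each squaring doubles the exponent: 2 -> 2^2 -> 2^4 -> ...

assert x == 2**(2**i)
```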



