
With probabilistic algorithms, you still have the fundamental limitation that there's a lower bound on how fast an algorithm can run. You can't keep getting faster indefinitely.


You could get faster in expected value. What if the expected running time on an input of length n were 100 + 1/n?

That isn't possible, but it might not quite be obvious why that isn't possible.


Having a constant runtime means the runtime is bounded above by a constant, not that it's exactly a constant value. 100 + 1/n would still be constant time, since it's bounded above and below by constants.

To achieve sub-constant time, the runtime would have to go to 0 as n increases; otherwise we could bound it below by a constant. That isn't possible, since a computer takes at least 1 time unit on every input, so the expected time is always at least 1.
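The sandwich argument can be checked numerically. This is only an illustrative sketch: `expected_runtime` is a made-up name for the hypothetical T(n) = 100 + 1/n from the comment above, not any real algorithm's cost.

```python
def expected_runtime(n: int) -> float:
    """Hypothetical expected running time T(n) = 100 + 1/n."""
    return 100 + 1 / n

# T(n) decreases with n, but for every n >= 1 it satisfies
# 100 < T(n) <= 101, i.e. it is bounded above AND below by
# constants -- so it is Theta(1): constant time, not sub-constant.
for n in (1, 10, 1_000, 1_000_000):
    t = expected_runtime(n)
    assert 100 < t <= 101, (n, t)
```

The same bounds hold in the limit: T(n) → 100 as n → ∞, so the runtime never approaches 0, which is what sub-constant time would require.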




