
> it is trivial to construct an algorithm with this property

Actually, it is impossible to construct an algorithm with this property, at least under any usual computational model.

Every computational model has discrete time units; therefore the amount of time taken cannot be vanishingly small.

(This is assuming the usual notion of complexity, where only the asymptotic behavior matters. It doesn't matter if it gets increasingly faster over the first million input sizes...it only matters what the behavior is as n -> infinity.)
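A toy cost model (the function and constants here are made up for illustration) shows why early behavior is irrelevant: a cost that shrinks over the first million input sizes can still be Θ(n) asymptotically.

```python
# Hypothetical cost model: a decaying startup term plus a linear term.
# The cost decreases over small n, but the linear term dominates as n grows.

def cost(n):
    return 10**6 / n + n

# Decreasing at first...
assert cost(10) > cost(100) > cost(1000)
# ...but eventually growing without bound: only the asymptotics matter.
assert cost(10**9) > cost(10**6)
```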

A program cannot be increasingly faster forever.



Consider a hashmap. Hashmaps give expected constant access time, and amortised constant access time, on the condition that their size grows at a sufficient rate and that the hash function is drawn from a universal family. These guarantees are parametric in both choices.

Assume that you use a linked list for collisions. Then for some growth rates your cost of access is not constant, and is at most O(n).

The point being that a universal hashing function has a 1/m probability of collision between any two keys. If your growth rate is bad, then m << n, which turns the expected access time from constant to O(n), because m grows slower than n.
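A small simulation (illustrative sketch, not the commenter's code) of the argument above: with m buckets and n keys hashed uniformly, the expected chain length a lookup hits is about 1 + n/m, so if m is held fixed (or grows slower than n) the cost stops being constant.

```python
import random

def expected_chain_length(n, m, trials=5):
    """Average chain length seen by a lookup of a random stored key,
    using random bucket choice as a stand-in for a universal hash."""
    total = 0.0
    for _ in range(trials):
        buckets = [0] * m
        for _ in range(n):
            buckets[random.randrange(m)] += 1
        # A random stored key lands in bucket b with probability b/n,
        # and then scans a chain of length b: expected cost = sum(b^2)/n.
        total += sum(b * b for b in buckets) / n
    return total / trials

# Fixed m while n grows: average chain length grows roughly like n/m.
short = expected_chain_length(1_000, m=256)
long_ = expected_chain_length(10_000, m=256)
assert long_ > short
```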


"Faster as the input grows" is perfectly compatible with a minimum number of time units. If your definition insists the minimum number of time units must be 0, with infinitesimal time getting involved, then your definition sucks.


For a function to strictly decrease forever, at least one of two statements is true:

1. It decreases by vanishingly small amounts.

2. It decreases to negative infinity.

So...which is it? Infinitesimal time units, or negative time?


Strictly speaking it either tends to negative infinity, or tends to some real number X, which could be any number, positive or negative.


In that latter case, statement 1 is true.


3. Nobody said "strictly".


They said decreases, not stays constant


Plenty of O(n), O(n log n), O(n^2) algorithms don't strictly increase.

Rounding the decrease to the nearest time unit some of the time is hardly a disqualifier for "decreasing". (Though specifically they said "faster".)


Yes, but...

The mathematically rigorous form of the posed question is whether there is an algorithm with run time f(N) for which there exists strictly decreasing g(N) such that f(N) = O(g(N)). That is, the run time is bounded by a forever decreasing function.

f(N) itself doesn't need to be strictly decreasing, but there needs to be a g(N) that is.

Bottom line, no algorithm can have a forever decreasing run time as N increases. An algorithm can exhibit that behavior over a limited range of N, but that's not how complexity works.


> The mathematically rigorous form of the posed question is whether there is an algorithm with run time f(N) for which there exists strictly decreasing g(N) such that f(N) = O(g(N)). That is, the run time is bounded by a forever decreasing function.

Let's say my minimum runtime is 7, and the strictly decreasing bound on my runtime is g(N) = 1e15/N + 7

Doesn't that fit your description fine?
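A quick numeric check of this counterexample (a sketch, using the numbers from the comment above): g(N) = 1e15/N + 7 is strictly decreasing, yet it stays above the constant runtime f(N) = 7 for every N, so f(N) = O(g(N)) with a strictly decreasing g.

```python
# g(N) from the comment: a strictly decreasing upper bound on a
# constant runtime of 7 time units.

def g(n):
    return 1e15 / n + 7

prev = g(1)
for n in range(2, 10_000):
    assert g(n) < prev   # strictly decreasing
    assert g(n) > 7      # still above the constant runtime 7
    prev = g(n)
```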



