> it is trivial to construct an algorithm with this property
Actually, it is impossible to construct an algorithm with this property, at least under any usual computational model.
Every computational model has discrete time units; therefore the amount of time taken cannot be vanishingly small.
(This is assuming the usual notion of complexity, where only the asymptotic behavior matters. It doesn't matter if it gets increasingly faster over the first million input sizes; all that matters is the behavior as n -> infinity.)
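To make that parenthetical concrete, here is a toy sketch (the cost function is invented purely for illustration): a step count that shrinks over the first million input sizes but then flattens out is still effectively constant (Theta(1)), so asymptotically it never "gets faster forever".

```python
def steps(n: int) -> int:
    """Hypothetical cost: decreases up to n = 1_000_000, then stays at 1.

    One discrete time unit is the floor, so asymptotically this is
    Theta(1) -- the early speed-up is invisible as n -> infinity.
    """
    return max(1, 1_000_000 - n)

for n in (1, 1_000, 1_000_000, 10**9):
    print(n, steps(n))  # 999999, 999000, 1, 1 -- flat from here on
```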
Consider a hashmap. Hashmaps give expected constant access time, and amortised constant insertion time, on the condition that the table grows at a sufficient rate and that the hash function is drawn from a universal family. Both guarantees are parametric in those conditions.
Assume you use a linked list for collisions: then for some growth rates, your cost of access is not constant, though it is certainly at most O(n).
The point is that a universal hash function gives a 1/m probability of collision between any two keys, where m is the number of buckets. If your growth rate is bad, then m << n, and the expected access time degrades from constant towards O(n), because m grows slower than n and the expected chain length n/m keeps growing.
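A toy simulation of that effect, assuming separate chaining with a random stand-in for a universal hash (the bucket and key counts below are made up for illustration): when the bucket count m is held fixed while n grows, the average cost of a successful lookup grows roughly like n/m; when m grows in proportion to n, it stays flat.

```python
import random

def mean_lookup_cost(n_keys: int, n_buckets: int) -> float:
    """Chained hash table; each key lands in a uniformly random bucket
    (standing in for a universal hash). Returns the average number of
    chain nodes scanned to find a stored key, roughly 1 + n/(2m)."""
    buckets = [[] for _ in range(n_buckets)]
    for key in range(n_keys):
        buckets[random.randrange(n_buckets)].append(key)
    # A successful lookup walks the chain up to the key's position.
    total = sum(pos + 1 for bucket in buckets for pos, _ in enumerate(bucket))
    return total / n_keys

# Fixed m = 64: chains, and hence lookup cost, grow linearly with n.
for n in (1_000, 10_000, 100_000):
    print(f"m fixed at 64:  n = {n:>7}  avg lookup ~ {mean_lookup_cost(n, 64):8.1f}")

# m grown with n (m = n/8): lookup cost stays roughly constant.
for n in (1_000, 10_000, 100_000):
    print(f"m = n/8:        n = {n:>7}  avg lookup ~ {mean_lookup_cost(n, n // 8):8.1f}")
```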
"Faster as the input grows" is perfectly compatible with a minimum number of time units. If your definition insists the minimum of time units must be 0, with infinitesimal time getting involved, then your definition sucks.
The mathematically rigorous form of the posed question is whether there is an algorithm with run time f(N) for which there exists a strictly decreasing g(N) such that f(N) = O(g(N)). That is, the run time is bounded by a forever decreasing function.
f(N) itself doesn't need to be strictly decreasing, but there needs to be a g(N) that is.
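Spelled out with the textbook definition of big-O (nothing here beyond the standard definition), the condition reads:

```latex
% Standard definition of big-O:
\[
  f(N) = O\bigl(g(N)\bigr)
  \;\iff\;
  \exists\, C > 0,\; \exists\, N_0:\;
  \forall N \ge N_0,\quad f(N) \le C \cdot g(N)
\]
% The question is whether such a bound can additionally be strictly
% decreasing, i.e. g(N+1) < g(N) for every N.
```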
Bottom line, no algorithm can have a forever decreasing run time as N increases. An algorithm can exhibit that behavior over a limited range of N, but that's not how complexity works.
> The mathematically rigorous form of the posed question is whether there is an algorithm with run time f(N) for which there exists a strictly decreasing g(N) such that f(N) = O(g(N)). That is, the run time is bounded by a forever decreasing function.
Let's say my minimum runtime is 7, and the strictly decreasing bound on my runtime is g(N) = 1e15/N + 7
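A quick numeric check of that counterexample (a throwaway sketch; the 7 and the 1e15 are just the numbers above): g(N) is strictly decreasing, yet it never drops below 7, so a constant run time of 7 satisfies f(N) <= g(N) for every N, and the "strictly decreasing bound" criterion is met without the algorithm ever actually getting faster.

```python
def g(n: float) -> float:
    """Strictly decreasing bound from above: g(N) = 1e15/N + 7."""
    return 1e15 / n + 7

f = 7  # constant run time, never improves

for exp in (0, 6, 12, 18, 24):
    n = 10 ** exp
    # g(N) keeps shrinking towards 7 but never crosses it, so f(N) = 7
    # is <= g(N) for all N, i.e. f(N) = O(g(N)) with C = 1.
    print(f"N = 1e{exp:<2}  g(N) = {g(n):>26.9f}  f(N) = {f}")
```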
A program cannot keep getting faster forever.