Even with probabilistic algorithms, you still face the fundamental limitation that there is a lower bound on how long an algorithm must take to run. You can't keep getting faster indefinitely.
Having a constant runtime means the runtime is bounded by a constant, not that it is exactly some constant value. A runtime of 100 + 1/n would still count as constant, since it is bounded both above and below by constants.
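To spell that example out, writing $T(n)$ for the runtime:

```latex
\text{For all } n \ge 1:\quad 100 < 100 + \tfrac{1}{n} \le 101,
```

so $T(n) = 100 + \tfrac{1}{n}$ is sandwiched between two constants and hence $T(n) = \Theta(1)$, even though its exact value varies with $n$.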
To have sub-constant time, the runtime would have to tend to 0 as n increases; otherwise it could be bounded below by some positive constant, making it constant time after all. That isn't possible, because a computer takes at least 1 time unit on every input, so the expected time is always at least 1.
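The argument above can be stated as a one-line lower bound, under the stated assumption that every input costs at least one time step:

```latex
T(n) \ge 1 \text{ for all } n \quad\Longrightarrow\quad T(n) = \Omega(1),
```

and $T(n) = \Omega(1)$ rules out $T(n) = o(1)$, i.e. no (expected) runtime can be sub-constant.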