
Possible Duplicate:
Are there any O(1/n) algorithms?

This just popped into my head for no particular reason, and I suppose it's a strange question. Are there any known algorithms or problems which actually get easier or faster to solve with larger input? I'm guessing that if there are, it wouldn't be for things like mutations or sorting; it would be for decision problems. Perhaps there's some problem where having a ton of input makes it easy to decide something, but I can't imagine what.

If there is no such thing as negative complexity, is there a proof that there cannot be? Or is it just that no one has found it yet?

Tesserex
  • This wouldn't be "negative", it would be exponential (or otherwise) decay. O(n^-1) for instance. – Jamie Wong Jul 09 '10 at 02:02
  • It is possible (though not likely IMO) that some strange quantum entanglement effect could lead to a computer that outputs its results before the input is received. Would that qualify as negative complexity? – Brian Gideon Jul 09 '10 at 02:47
  • They should give us a new badge for this thread. Something like "Time-Traveler" would be nice...lol. =] – mverardo Jul 09 '10 at 03:46
  • It would just be O(1). Vote for opening a funniest question list. – Mau Jul 09 '10 at 11:09
  • Thanks for the correction to inverse complexity. I was being very dumb when I said negative. Am I being made fun of here? – Tesserex Jul 09 '10 at 11:18
  • @Tesserex: I actually thought it was an interesting question. By the way, I am not seeing how this is a duplicate of the cited question. Anyone want to take a stab at claiming that O(1/n) is the same concept as -O(n), for example? – Brian Gideon Jul 09 '10 at 11:57
  • @Tesserex: Also, I hope you do not think I was poking fun when I mentioned the quantum entanglement thing...I was actually being half serious. What...the quantum world is strange! – Brian Gideon Jul 09 '10 at 12:00
  • No, I meant the funniest question comment. And I think I actually meant 1/n, not negative n. The real question was "faster with larger input". – Tesserex Jul 09 '10 at 13:41
  • @Tesserex: Please, don't get me wrong. I had "algorithm complexity" this year in college, and this was a great question from a student's point of view. =] – mverardo Jul 11 '10 at 15:17
  • This is not a duplicate question. One is a subset of the other. – JobHunter69 Dec 12 '18 at 00:10

7 Answers


No, that is not possible. Since Big-O is supposed to approximate the number of operations an algorithm performs relative to its input size, it would not make sense to describe an algorithm as using a negative number of operations.

The formal definition section of the Wikipedia article actually defines Big-O notation in terms of positive real numbers. So there is not even a proof needed, because the whole concept of Big-O has no meaning over the negative real numbers per the formal definition.

Short answer: it's not possible because the definition says so.

Brian Gideon

Update: just to make it clear, I'm answering this part of the question: "Are there any known algorithms or problems which actually get easier or faster to solve with larger input?"

As noted in the accepted answer to Are there any O(1/n) algorithms?, there are no algorithms that keep getting faster with bigger input. Even an algorithm like sleep(1/n) has to spend time reading its input, so its running time has a lower bound.

In particular, the author there refers to a relatively simple substring search algorithm:
http://en.wikipedia.org/wiki/Horspool
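The relevant property of Horspool's algorithm is that a *longer* pattern lets it skip more text per mismatch, so in practice the search can speed up as the pattern grows. A short Python sketch of the idea (an illustration, not code from the linked answer):

```python
def horspool_search(text, pattern):
    """Boyer-Moore-Horspool: return the index of the first occurrence
    of pattern in text, or -1 if absent."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    # Shift table: on a mismatch, slide the window by the distance
    # from the last occurrence of the window's final character to
    # the end of the pattern (default: the whole pattern length).
    shift = {pattern[i]: m - 1 - i for i in range(m - 1)}
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i
        i += shift.get(text[i + m - 1], m)
    return -1
```

Characters absent from the pattern allow a skip of the full pattern length m, which is why bigger patterns can mean fewer comparisons per character of text.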

P.S. But using the term 'negative complexity' for such algorithms doesn't seem reasonable to me.

Nikita Rybak
  • As the accepted answer there eventually acknowledged, there *are* no O(1/n) algorithms. The run time can decrease up to some finite n, but asymptotically, it cannot be o(1). – ShreevatsaR Jul 09 '10 at 02:14
  • But even here O(1/n) is not negative. Never mind the fact that the formal definition constrains Big-O to being positive. – Brian Gideon Jul 09 '10 at 02:15
  • @ShreevatsaR Why're you telling this to me? In my post I don't claim those algorithms to be O(anything). – Nikita Rybak Jul 09 '10 at 02:16
  • I can't find _anything_ on the wiki pages making the claim that it's better than O(n), and the answer you link to states quite clearly: "Note that these algorithms will not exhibit runtime behaviour below O(1)". – paxdiablo Jul 09 '10 at 02:28
  • I'm pointing out that there are no algorithms that consistently get faster with larger input. Up to some size of input, yes. – ShreevatsaR Jul 09 '10 at 02:28
  • @ShreevatsaR Any algorithm is subject to hardware limitations, and we don't discuss them here. And apart from that, the algorithm 'sleep(1000/n)' seems like a good enough example to me. – Nikita Rybak Jul 09 '10 at 02:47
  • @paxdiablo I updated my post to answer such questions :) (even though the original post addressed them too) – Nikita Rybak Jul 09 '10 at 02:48
  • @Nikita: Good link to the O(1/n) thing, by the way. I actually think that is more interesting than this question. Though I am not saying this question is not interesting. – Brian Gideon Jul 09 '10 at 03:02
  • @Brian On that I agree, there's not much to discuss about the question 'can some task require negative time' :) – Nikita Rybak Jul 09 '10 at 03:07
  • No! Even theoretically (nothing to do with hardware limitations), an algorithm like sleep(1/n) has to take time ε + 1/n, because it has to *read* the input at least. (In fact, at least O(log n) to read the input itself.) – ShreevatsaR Jul 09 '10 at 03:11

To think of an algorithm that executes in negative time is the same as thinking about time going backwards.

If the program starts executing at 10:30 AM and stops at 10:00 AM without passing through 11:00 AM, it has just executed with time = O(-1).

=]

Now, for the mathematical part:

If you can't come up with a sequence of actions that executes backwards in time (you never know...lol), the proof is quite simple:

positiveTime = O(-1) means:

positiveTime <= c * (-1), for some c > 0 and all n > n0 > 0

Consider the "c > 0" restriction. We can't find a positive number that, multiplied by -1, results in another positive number. Taking that into account, this is the result:

positiveTime <= negativeNumber, for all n > n0 > 0

Which proves that you can't have an algorithm with O(-1) complexity.

mverardo

Not really. O(1) is the best you can hope for.

The closest I can think of is language translation, which uses large datasets of phrases in the target language to match up smaller snippets from the source language. The larger the dataset, the better (and to a certain extent faster) the translation. But that's still not even O(1).

Joel Coehoorn
  • I also think that's the wrong "n". Wouldn't n be the number of snippets? You're translating them, not the data set, which would give you O(n) or worse. – paxdiablo Jul 09 '10 at 02:21
  • One that would really get people's heads spinning...is a noop O(1) or O(0)? – Brian Gideon Jul 09 '10 at 02:41
  • @BrianGideon I think it depends (both on the case and on your definitions). If before the noop you had a command that requires waiting one clock cycle before executing the next command (so you have to put a noop between them), then you could argue that both the time it takes to read and to execute the noop is actually part of the time of the previous (or next) command. In that case you could say it's O(0), since it doesn't add any time to the program. However, if you add an unnecessary noop, it adds time to the program and takes O(1). – Tomer Wolberg Nov 13 '20 at 13:19

Well, for many calculations of the form "given input A, return f(A)" you can "cache" calculation results (store them in an array or map), which will make the calculation faster with a larger number of values, IF some of those values repeat.

But I don't think that qualifies as "negative complexity". In this case the best-case performance will probably count as O(1), the worst-case performance will be O(N), and the average performance will be somewhere in between.
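That caching idea can be sketched with Python's functools.lru_cache; the function f below stands in for a hypothetical expensive computation:

```python
from functools import lru_cache

call_count = 0  # counts how often the expensive body actually runs

@lru_cache(maxsize=None)
def f(a):
    """Hypothetical expensive calculation; repeated inputs are
    served from the cache in O(1) after the first call."""
    global call_count
    call_count += 1
    return a * a

for x in [3, 5, 3, 3, 5]:
    f(x)
# Only two distinct inputs, so the body ran only twice.
```

More repetition in the input stream means a higher cache hit rate, which is exactly the "faster with more values, IF some repeat" effect described above.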

This is somewhat applicable to sorting algorithms: some of them have O(N) best-case complexity and O(N^2) worst-case complexity, depending on the state of the data to be sorted.

I think that to have negative complexity, an algorithm would have to return its result before it has been asked to calculate it. I.e., it would have to be connected to a time machine and be able to deal with the corresponding "grandfather paradox".

SigTerm
  • There is actually a name for this type of programming, programs that finish before they start. I wish I could remember what that name is! – Tyler Jul 09 '10 at 07:11

As with the other question about the empty algorithm, this question is a matter of definition rather than a matter of what is possible or impossible. It is certainly possible to think of a cost model for which an algorithm takes O(1/n) time. (That is not negative of course, but rather decreasing with larger input.) The algorithm can do something like sleep(1/n) as one of the other answers suggested. It is true that the cost model breaks down as n is sent to infinity, but n never is sent to infinity; every cost model breaks down eventually anyway. Saying that sleep(1/n) takes O(1/n) time could be very reasonable for an input size ranging from 1 byte to 1 gigabyte. That's a very wide range for any time complexity formula to be applicable.

On the other hand, the simplest, most standard definition of time complexity uses unit time steps. It is impossible for a positive, integer-valued function to have decreasing asymptotics; the smallest it can be is O(1).
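A toy illustration of that wall-clock cost model in Python (this is a demonstration of the sleep(1/n) idea, not a real algorithm):

```python
import time

def inverse_step(n):
    """Hypothetical 'O(1/n)' operation under a wall-clock cost
    model: sleeps for 1/n seconds."""
    time.sleep(1.0 / n)

start = time.perf_counter()
inverse_step(100)            # large input: ~0.01 s of sleeping
t_large = time.perf_counter() - start

start = time.perf_counter()
inverse_step(2)              # small input: ~0.5 s of sleeping
t_small = time.perf_counter() - start

# t_large < t_small: larger input finishes sooner, but the total
# time can never fall below the fixed overhead of handling n itself.
```

The fixed per-call overhead is the concrete form of "the cost model breaks down as n is sent to infinity": past some n, the constant term dominates the shrinking 1/n term.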

Greg Kuperberg

I don't know if this quite fits, but it reminds me of BitTorrent. The more people downloading a file, the faster it goes for all of them.

Tyler