
I found this rule for algorithm analysis:

O(max{f(n),g(n)}) = O(f(n)+g(n)) 

How can I prove this?

I know:

max{f(n),g(n)} <= f(n)+g(n) <= 2 * max{f(n),g(n)}

thus:

max{f(n),g(n)} is O(f(n)+g(n))
max{f(n),g(n)} is O(max{f(n),g(n)})
f(n)+g(n)      is O(max{f(n),g(n)})

But then

If a is O(b) and b is O(a), does it follow that O(a) = O(b)?

Bruce Zu

3 Answers


To prove that O(max{f(n),g(n)}) = O(f(n)+g(n)), we can use the formal definition of big-O:

f(x) = O(g(x)) if and only if there exists a positive real number M and a real number x0 such that
|f(x)| ≤ M|g(x)| for all x ≥ x0.

The absolute value in this definition is really only a theoretical concern: in practice, the functions used in big-O notation yield positive values for large enough x, since it makes no sense to specify a negative big-O complexity. So in the rest of this answer I will not apply the absolute value to f(n) and g(n), but assume they yield positive values.
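To make the definition concrete, here is a small numeric sanity check in Python (only a sketch, not a proof: the functions f(x) = 3x + 7 and g(x) = x and the witnesses M = 4, x0 = 7 are arbitrary choices for illustration):

# Finite check of the big-O definition for f(x) = 3x + 7 and g(x) = x:
# with witnesses M = 4 and x0 = 7, |f(x)| <= M*|g(x)| holds for all x >= x0.
f = lambda x: 3 * x + 7
g = lambda x: x
M, x0 = 4, 7
assert all(abs(f(x)) <= M * abs(g(x)) for x in range(x0, 10000))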

To better grasp the concept of big-O, it might be worth reading this short article about Big-O misconceptions.

Proof that if s(n) is O(f(n)+g(n)) then s(n) is O(max{f(n), g(n)})

Using the above definition, we can say of some function s(n):

s(n) is O(f(n)+g(n)) if and only if there is an M such that
|s(n)| ≤ M(f(n) + g(n)) for large enough n, which is equivalent to:
|s(n)| ≤ M·max{f(n), g(n)} + M·min{f(n), g(n)} for large enough n.

For that same M, the following is then also true (here we rely on the assumption that both f(n) and g(n) are positive for large n):

|s(n)| ≤ 2M·max{f(n), g(n)} for large enough n.

If we now choose P to be 2M, then we can say we have a P for which:

|s(n)| ≤ P·max{f(n), g(n)} for large enough n.

...which according to the definition of big-O means that

s(n) is O(max{f(n), g(n)})
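As a numeric sanity check of this direction (a sketch only; the sample functions f(n) = n², g(n) = n·log n, the choice s(n) = f(n) + g(n), and the constant M = 1 are arbitrary):

import math

# If |s(n)| <= M*(f(n) + g(n)), then |s(n)| <= 2M*max(f(n), g(n)).
# Finite check with f(n) = n^2, g(n) = n*log(n), s(n) = f(n) + g(n), M = 1.
f = lambda n: n * n
g = lambda n: n * math.log(n)
s = lambda n: f(n) + g(n)
M = 1
assert all(s(n) <= 2 * M * max(f(n), g(n)) for n in range(2, 10000))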

Proof that if s(n) is O(max{f(n), g(n)}) then s(n) is O(f(n)+g(n))

s(n) is O(max{f(n), g(n)}) if and only if there is a P such that
|s(n)| ≤ P·max{f(n), g(n)} for large enough n.

Because for non-negative values (cf. the assumption) max{f(n), g(n)} ≤ f(n)+g(n), the following is then certainly also true (the right-hand side of the inequality has only grown):

|s(n)| ≤ P(f(n) + g(n)) for large enough n.

...which according to the definition of big-O means that

s(n) is O(f(n)+g(n))
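The reverse direction can be checked the same way (same arbitrary sample functions as above, now with s(n) = max{f(n), g(n)} and P = 1):

import math

# If |s(n)| <= P*max(f(n), g(n)), then |s(n)| <= P*(f(n) + g(n)),
# because max(f(n), g(n)) <= f(n) + g(n) for non-negative values.
f = lambda n: n * n
g = lambda n: n * math.log(n)
s = lambda n: max(f(n), g(n))
P = 1
assert all(s(n) <= P * (f(n) + g(n)) for n in range(2, 10000))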

Conclusion

The above proves that if any function is O(f(n)+g(n)) then it must also be O(max{f(n),g(n)}), and vice versa. This is the same as saying that both big-O complexities are the same:

O(f(n)+g(n)) = O(max{f(n),g(n)})

Note that this is not about equality of functions, but equivalence of big-O expressions.

In fact, you could look at big-O expressions as sets of functions, i.e. sets of those functions that have corresponding big-O complexity. So then the above proves:

s(n) ∈ O(f(n)+g(n)) ⇔ s(n) ∈ O(max{f(n),g(n)})

and this means both O-sets are the same.

The Necessary Assumption

We need the (practical) assumption that both f(n) and g(n) are non-negative for large enough n. They may be negative and/or undefined on some subsets of ℝ, but there must be an n above which f(n) and g(n) always yield a non-negative result. If this were not the case, the premise could be proved false with a simple counterexample:

g(n) = n
f(n) = -n

The premise O(max{f(n),g(n)}) = O(f(n)+g(n)) then becomes:

O(n) = O(0)

which evidently is false. This is because f(n) violates the assumption and is always negative for large n. But again, a negative complexity makes no practical sense.
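A few sample values make the collapse visible (a quick illustration of the counterexample, nothing more):

# With g(n) = n and f(n) = -n the sum is identically 0 while the max
# grows as n, so no constant M can give max(f(n), g(n)) <= M*(f(n) + g(n)).
g = lambda n: n
f = lambda n: -n
for n in (1, 10, 100):
    print(n, f(n) + g(n), max(f(n), g(n)))  # prints: n, 0, n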

To be clear: these big-O functions may be negative or even undefined on some subsets of ℝ. As long as there is an n above which they always yield a non-negative number, they are in line with the assumption.

Examples of allowable functions that yield negative results and/or are undefined on subset(s) of ℝ:

n
log(n)
n³
sqrt(n)

Examples of functions that violate the assumption:

sin(n)
cos(n)
tan(n)
-n
trincot
  • With @trincot's help I now understand that s(n) is O(f(n)+g(n)) and s(n) is O(max{f(n), g(n)}), but I am not sure we can conclude from these that O(f(n)+g(n)) = O(max{f(n), g(n)}). E.g. log n is O(n) and log n is O(n log n), but can we say O(n) = O(n log n)? – Bruce Zu Jan 27 '16 at 18:23
  • I have extended the proof in my answer so it runs in both directions, which shows that the two big-O expressions are synonymous. The example you give does not follow the pattern of the proof: you are varying the complexity to *O(n)* and *O(nlogn)*, but the proof says nothing about those, and indeed it is possible to prove these two are not equal. You do not show how from `logn is O(n)` and `logn is O(nlogn)` you would derive `O(n) = O(nlogn)`: it cannot be done. – trincot Jan 27 '16 at 19:02

This answer assumes f(n), g(n) >= 0 for all n >= 0. I use this assumption because no algorithm (that I know of) can have negative space or time usage under any circumstances.

You have no way of knowing what f(n) and g(n) are, so you can't choose either one to be the upper bound.

Therefore, the only expression that is guaranteed to be at least as large as both f(n) and g(n) is f(n)+g(n).
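A minimal numeric sketch of that claim (the sample values are arbitrary non-negative pairs):

# For non-negative values, f(n) + g(n) dominates both f(n) and g(n),
# whichever of the two happens to be larger.
for fn, gn in [(1, 3), (5, 0), (7, 7)]:
    assert fn <= fn + gn and gn <= fn + gn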

Matthew Pope

If we assume that f(n), g(n) >= 0 for all n, then their sum is at least as large as either of them; without this assumption the relation does not hold. We can try a few examples:

max(1, 3) = 3 < 1 + 3 = 4
max(5, 0) = 5 = 5 + 0 = 5
max(0, 8) = 8 = 0 + 8 = 8

So f(n) + g(n) always upper-bounds max(f(n), g(n)), i.e.

O(max(f(n), g(n))) = O(f(n) + g(n))

We can also easily prove the fact that max(f(n), g(n)) <= f(n) + g(n) <= 2*max(f(n), g(n)), because the max always equals one of the two values. There are two cases:

Case 1: The two values are different. In this case the sum is less than twice the max.

max(4, 1) = 4 < 4 + 1 = 5 < 2*max(4, 1) = 8
max(3, 0) = 3 = 3 + 0 = 3 < 2*max(3, 0) = 6

Case 2: The two values are the same. In this case the sum is exactly twice the max.

max(1, 1) = 1 < 1 + 1 = 2 = 2*max(1, 1) = 2
max(0, 0) = 0 = 0 + 0 = 0 = 2*max(0, 0) = 0

Thus:

max(f(n), g(n)) <= f(n) + g(n) <= 2*max(f(n), g(n))
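Both cases can also be checked mechanically; here is a minimal Python sketch over arbitrary non-negative sample values:

# Verify max(f, g) <= f + g <= 2*max(f, g) on non-negative samples.
samples = [(4, 1), (3, 0), (1, 1), (0, 0), (2, 9)]
for fn, gn in samples:
    m = max(fn, gn)
    assert m <= fn + gn <= 2 * m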
chettyharish