Example:
f(n) = n
g(n) = 0.5n
Notice that with c = 1 and n0 = 1, we have n >= 1(0.5n) for all n >= n0, so the required inequality f(n) <= c(g(n)) fails for this choice of c.
However, the definition only states that "there is" a real constant c > 0, not "for all" c. So if we instead choose c = 3 and n0 = 1, then n <= 3(0.5n) holds for all n >= n0, and thus n is O(0.5n)?
If so, what is the semantic meaning of the statement "n is O(0.5n)"?
That the behaviour of n as n approaches infinity is similar to the behaviour of 0.5n?
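For concreteness, here is a small Python sketch I used to sanity-check the two choices of c above (the names f, g and the sampled range are just my own labels for this check, not part of the formal definition):

```python
# Sanity check of the two choices of c above: f(n) = n, g(n) = 0.5n, n0 = 1.

def f(n):
    return n

def g(n):
    return 0.5 * n

n0 = 1
for c in (1, 3):
    # Does f(n) <= c * g(n) hold for every sampled n >= n0?
    holds = all(f(n) <= c * g(n) for n in range(n0, 1000))
    print(f"c = {c}: f(n) <= c*g(n) for all sampled n >= {n0}? {holds}")

# Prints:
# c = 1: f(n) <= c*g(n) for all sampled n >= 1? False
# c = 3: f(n) <= c*g(n) for all sampled n >= 1? True
```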