
Possible Duplicate:
Plain English explanation of Big O

Many times when the time complexity of an algorithm is discussed, memory also comes into account. I want to know what O(1), O(n), and O(n*n) memory mean.

And how is it related to time complexity?

Eight
  • Duplicate: http://stackoverflow.com/questions/3739768/what-is-big-o-notation – Stealth Rabbi Nov 22 '11 at 14:53
  • Measures complexity, not memory. – Stealth Rabbi Nov 22 '11 at 14:54
  • I presume you are talking about the "Big Oh" (or "Big O"). This is explained rather elaborately in the following wikipedia page: http://en.wikipedia.org/wiki/Big_O_notation – Yuri Nov 22 '11 at 14:56
  • Do you mean o(n) or O(n)? There's quite a difference. – Fred Foo Nov 22 '11 at 14:56
  • @NullUserExceptionఠ_ఠ: well, the OP does explicitly ask for the relation between space and time complexity, which isn't covered there. I'm not voting to close as dupe, although I do find this question rather broad. – Fred Foo Nov 22 '11 at 15:02

5 Answers


As xmoex said:

o(1) constitutes constant memory usage, so the amount of input is inconsequential.

o(n) constitutes linear memory usage, so more input means linearly more memory.

o(n*n) constitutes quadratic memory usage, so more input means quadratically more memory (growth on the order of n^2).

This measure of memory complexity is in most cases completely independent of the measure of time complexity. For computer algorithms it is important to know how an algorithm manages both of these complexities in order to judge its quality. However, the two must be calculated separately. One may be more important than the other depending on your use case and the circumstances of the problem.

mikecline
    -1 for the "completely independent" remark. When an algorithm takes o(f(n)) space (that's little-o), it also takes o(f(n)) time. – Fred Foo Nov 22 '11 at 17:20
  • This is not always the case. Yes it is possible that there is the same complexity for memory and for time. However, it is also possible to have an algorithm that requires less time than it does space. It is possible to have space complexity be o(f(n)) and have time complexity be > o(f(n)). – mikecline Nov 22 '11 at 18:38
  • that's little-o, so it's a lower bound. – Fred Foo Nov 22 '11 at 18:39
  • Again, that is incorrect. A lower bound is Omega, not lowercase omicron. Little-o means it is guaranteed to be strictly lower, not lower than or equal to. Big-O means <=, little-o means <, big-Omega means >=, little-omega means >, Theta means both <= and >=, and ~ means for all intents and purposes =. – mikecline Nov 22 '11 at 18:42
  • You're right, I got my notation wrong. However, the "completely independent" is still not true. You can't take less time than space, at least not on a Turing machine. – Fred Foo Nov 22 '11 at 18:50
  • But it is possible to break things into "threads" with Turing machines, thus something could take less time, because things are running in parallel, than it takes space to represent the problem. – mikecline Nov 22 '11 at 19:07
  • only if you have a non-deterministic TM can you freely make "threads", and they aren't nondet "by default". – Fred Foo Nov 22 '11 at 19:19
  • Okay, yes, that is true; I am not arguing there. We do not know the context of the problem, nor do we know the capabilities of the machine(s) being used. I agree that on a deterministic TM the two are not completely independent, and I agree with all of your statements. However, if we allow for non-deterministic TMs, then do you disagree with what I have said? In this world, where the fun and challenging problems live in NP, I think non-deterministic TMs will be very useful. – mikecline Nov 22 '11 at 19:28
  • I think we're straying very far off topic. Most of the times you need a big- or little-o measure of complexity, you're working on something that's closer to a det TM than a nondet TM. – Fred Foo Nov 22 '11 at 20:31

o(1) means constant average memory use, regardless of the size of your input
o(n) means that if you are processing n elements, your average memory need grows linearly
o(n*n) means that if you are processing n elements, your average memory need grows quadratically

There's a Wikipedia article about the so-called big-O notation (covering little-o as well): http://en.wikipedia.org/wiki/Big_O_notation

xmoex

Complexity in terms of memory means how fast the required memory grows as the number of items to be processed grows. A good example is a sorting algorithm.

  • O(1) and O(log n) mean that while sorting N items, the algorithm requires less memory than the total memory allocated for the N items themselves (a.k.a. in-place sorting)
  • O(n) - memory consumption is linear, so memory consumption grows as the item count grows
  • O(n*n) means that the algorithm requires much more additional memory.
sll
  • *"O(n * n) mean that algorithm requires much more additional memory"* that is very vague. It would be better to specify that it requires a *quadratic amount* of memory. Still +1 for the easy explanation. – adelriosantiago Aug 05 '20 at 06:58

Everyone here has explained the meaning of big-O notation, so I am not going to explain that again. But I will explain it briefly.

Take any small program in which there are no loops.

{
    int a = 1;
    printf("%d", a);
}

This program takes negligible time to execute. Let the declaration and the printing each take one unit of time; its time complexity is then O(1).

Another program has one loop running n times:

{
    long i, n = 10000000;
    for (i = 0; i < n; i++)
        ;   /* doing some calculations */
}

As you can see, the declarations take negligible time, i.e. O(1). If each iteration of the loop takes one unit of time, the loop as a whole takes O(n). The overall time complexity is then

O(1) + O(n) = O(n).

Now you can see how O(n*n) arises, i.e. from 2 nested loops.

For better understanding....

Finding an item in an unsorted list = O(n)

Multiplying two n-digit numbers by the simple schoolbook algorithm, or bubble sort = O(n*n)

Finding an item in a sorted array with a binary search =O(log n)

Travelling salesman problem with brute force = O(n!)

Ravi Kumar

I am not sure if you mean big-O or little-O here, but I am going to answer more generally.

It means the same thing for memory as it does for time. If a function's memory use grows in O(1), it uses a constant amount of memory regardless of the input size. If it grows in O(n), it uses a linear amount, and in O(n*n), a quadratic amount.

Nick Garvey