
What is the complexity of T(n)=1+2+3+...+n? I know that the answer is O(n^2). What is an example of an algorithm that has runtime T(n)?

EDIT: I was not talking about computing the sum 1+2+3+...+n; that is not the objective.

Kamoukou
  • That can be done in `O(n)`, why do you think it's `O(n^2)`? – Spencer Wieczorek Mar 27 '17 at 22:52
  • I remember I did this once in calculus: the sum 1+2+3+...+n is equal to n(n+1)/2, and by performing the multiplication we can say that it is O(n^2). Please correct me if I'm wrong. – Kamoukou Mar 27 '17 at 22:54
  • `n(n+1)/2` is simply a constant which is `O(1)`. To be clear here `n != O(n)`, that is referring to the actual number `n`. – Spencer Wieczorek Mar 27 '17 at 22:56
  • @Kamoukou I heavily edited your question, as various commenters and answerers were assuming you intended to ask about the time complexity of a formula, rather than asking about its complexity. Please comment if my edit was incorrect. – Paul Hankin Mar 27 '17 at 23:09
  • @PaulHankin Thank you sir! Sorry for not being so precise! – Kamoukou Mar 27 '17 at 23:13

4 Answers


What is an example of an algorithm that has runtime T(n)?

If you have an outer loop that iterates n times and an inner loop that iterates i times, where i is the index of the outer loop, then the body of the inner loop will be executed 1 + 2 + ... + n = T(n) times in total.

An example of such a nested loop would be the following algorithm:

for i from 1 to n
    for j from 1 to i
        print "$j "
    print "\n"

This is the solution to a common homework assignment and prints a number pyramid of the following shape:

1
1 2
1 2 3
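For reference, here is a minimal runnable Python version of the same idea (the function name print_pyramid is just for illustration):

def print_pyramid(n):
    # The outer loop runs n times; on iteration i the inner loop runs i times,
    # so the inner body executes 1 + 2 + ... + n = T(n) times in total.
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            print(j, end=" ")
        print()

print_pyramid(3)   # prints the pyramid shown above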
sepp2k
  • Upvoted, but it's debatable if `print "$j "` is O(1) since it outputs ~log_10(j) characters. If you count outputting 1 character as the basic operation, then the example is O(n^2 log n). If you consider the print statement as O(1) no matter what's being printed then it's O(n^2). – Paul Hankin Mar 27 '17 at 23:58
  • @PaulHankin I think I covered my ass in that regard by just claiming that the inner loop's body is executed `T(n)` times (though admittedly that wasn't quite what was asked for) ;-) You can solve that problem by printing "* " instead of "$j ". That might actually be the more common variant of the homework assignment anyway. – sepp2k Mar 28 '17 at 00:21

While @sepp2k gives an awesome answer, here are some real-life algorithms whose running time is T(n) = 1+2+3+...+n (and thus we call such algorithms O(n^2)):


Insertion Sort in its worst case: It has an outer loop that runs n times, and the inner loop walks the sorted portion of the array to find the position at which to insert the next element. The sorted portion grows by 1 with each outer iteration, so in the worst case the inner loop runs 1+2+3+...+(n-1) times in total (a concrete sketch follows below this list).

Longest Increasing Subsequence (LIS) using the naive implementation: A direct implementation of the recurrence relation, in which for each index i we have to look through all j < i.

Interval Graph Coloring (Interval Partitioning) using the naive implementation: After sorting the intervals, for each interval x we have to find how many of the intervals before x conflict with x, and the answer to the problem is the maximum conflict count. So while the outer loop iterates over each interval i, the inner loop iterates over all j < i.

Prime generation using a naive primality test: Of course, we can generate all primes up to n by running a naive primality test on every i up to n. That is, for each i we loop over all j < i to see whether j divides i.
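To make the insertion sort case concrete, here is a minimal Python sketch (the function name insertion_sort is mine; assume a plain list as input):

def insertion_sort(a):
    # On iteration i the inner loop shifts up to i elements; for a
    # reverse-sorted input it shifts exactly i, so the total work is
    # 1 + 2 + ... + (n-1).
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a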


Indeed, there are many algorithms that contain such a structure, and most of the ones I have seen can be improved by using a smarter algorithm or a more advanced data structure:

  • Quick Sort / Merge Sort
  • LIS using binary search (see the sketch below)
  • Interval Partitioning using a priority queue
  • Any prime sieve algorithm
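As one example of such an improvement, here is a rough Python sketch of the O(n log n) LIS length computation using binary search (the function name lis_length is mine; it computes the length of a strictly increasing subsequence):

from bisect import bisect_left

def lis_length(seq):
    # tails[k] is the smallest possible tail value of an increasing
    # subsequence of length k + 1 found so far.
    tails = []
    for x in seq:
        i = bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)   # x extends the longest subsequence so far
        else:
            tails[i] = x      # x is a smaller tail for length i + 1
    return len(tails)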
shole

Insertion sort, in the worst case, requires T(n)=1+2+3+...+n bounded-time steps. Selection sort requires that many steps every time, but it's not as common.
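For contrast, a minimal Python sketch of selection sort (the function name selection_sort is mine) shows why it takes that many steps on every input: iteration i always scans all remaining elements to find the minimum.

def selection_sort(a):
    n = len(a)
    for i in range(n - 1):
        # Scan the n - 1 - i remaining elements for the minimum, so the total
        # number of comparisons is (n-1) + (n-2) + ... + 1 for any input.
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]
    return a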

Matt Timmermans

The order of growth of a function has to do with the number of operations you need to complete. An example of a function that runs in O(n) would be the slow version of the sum you mentioned in the question. In Python:

total = 0
for i in range(n + 1):
    total += i

This would be O(n) because the amount of time it takes to complete scales linearly with n.

As for the formula you gave as an example in the question, evaluating it is actually O(1) because you only need to do one operation: n(n+1)/2. The amount of time it takes to compute n(n+1)/2 will never change, barring back-end quirks, but for something like this that really shouldn't ever matter.
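If you want to see that constant-time evaluation as code, here is a one-line sketch (assuming n is a non-negative integer, so integer division is exact):

total = n * (n + 1) // 2   # closed-form sum, evaluated in O(1) arithmetic operations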

Unlocked
  • The question wasn't how long it takes to calculate the sum, it was about expressing the function T(n) asymptotically as big-O. – interjay Mar 27 '17 at 23:20
  • The question has been edited. It was about alternative examples to the (incorrect) assumption of O(n^2) to compute the result. The result of the stated example can be computed in O(1), which is a true and correct alternative. O(n) is trivial if you really want that: just loop and sum. O(n^2) is wrong because this isn't T(n) = 1 + (1+2) + (1+2+3) + ... + (1+2+3+...+n), but now we are arguing semantics instead of math, which has already been solved. – Steve Harris Mar 27 '17 at 23:25
  • @SteveHarris The intention of the question was the same before and after the edit. T(n) is not an algorithm, but the runtime of another algorithm. The question asks which algorithm has a runtime of T(n). And the question is correct in saying that T(n) is in the class of O(n^2). – interjay Mar 27 '17 at 23:28
  • @SteveHarris The question was never about how long it takes to compute `T`. `T` was always intended as a mathematical function that describes some algorithm's time complexity. That's why it's called `T` and why the OP asks (and always asked) for "an example of an algorithm that has runtime T(n)". The phrase "has runtime T(n)" makes no sense when interpreting the question any other way. The function `T(n) = 1 + ... + n` is definitely in O(n^2). That is, if you have an algorithm that takes 1 + ... + n steps for an input of size n, that algorithm has quadratic time complexity. – sepp2k Mar 27 '17 at 23:28