I am taking an online introduction-to-algorithms course and trying to understand the material while improving my JavaScript on the side. I think the following is correct, but is there any way to improve it?

function randomGenerator() {

  var x;
  var myArray = [];
  var copy = [];

  // Fill myArray with the numbers 1..10000.
  for (x = 1; x <= 10000; x += 1) {
    myArray.push(x);
  }

  var m = myArray.length;
  var i;

  // Pick a random index; if that slot has not been deleted yet, move its
  // value into copy and delete it. Otherwise try again.
  while (m) {
    i = Math.floor(Math.random() * myArray.length);
    if (i in myArray) {
      copy.push(myArray[i]);
      delete myArray[i];
      m--;
    }
  }
  return copy;
}

I think the complexity here is O(n^3) because of the first `for` loop, the following `while` loop and the final `if` statement. Am I going about this in the right way?

EDIT: Found a better way to do this thanks to @Dukeling.

function randomGenerator() {
  var x;
  var myArray = [];
  var t;

  // Fill myArray with the numbers 1..10000.
  for (x = 1; x <= 10000; x += 1) {
    myArray.push(x);
  }

  var m = myArray.length;
  var i;

  // Fisher-Yates shuffle: pick a random index i from the unshuffled part,
  // then swap that element into the last unshuffled position m.
  while (m) {
    i = Math.floor(Math.random() * m--);

    t = myArray[m];
    myArray[m] = myArray[i];
    myArray[i] = t;
  }
  return myArray;
}

The complexity now reduces to O(n) linear time. (Please correct me if wrong)

  • `if` is not a loop. – Barmar Jun 01 '17 at 22:46
  • Big-O notation relates the running time to the size of the input. Your function doesn't take any input, so there's nothing to relate to. – Barmar Jun 01 '17 at 22:48
  • This is `O(1)`. There's no `n` in the code. – Bergi Jun 01 '17 at 22:51
  • Only nested loops multiply the number of iterations. Subsequent loops just sum them. – Bergi Jun 01 '17 at 22:51
  • [How to find time complexity of an algorithm](https://stackoverflow.com/questions/11032015/how-to-find-time-complexity-of-an-algorithm) – hatchet - done with SOverflow Jun 01 '17 at 22:55
  • @Dukeling Will all questions regarding computational complexity now be closed as duplicates? IMHO the wrong approach. This one e.g. asks us to validate a given complexity of a randomized algorithm. Who benefits from the redirect to the "canonical"? – le_m Jun 02 '17 at 13:03
  • @Bergi Worst-case runtime complexity of this randomized algorithm has no upper bound. Would you still call that `O(1)`? For the average runtime complexity, I agree. – le_m Jun 02 '17 at 13:13
  • "In general, randomized algorithms are allowed to have infinite computations. In that case one measures the expected time complexity over the finite computations of A only, and calculates the probability of running an infinite computation, which is then added to the error probability (i.e., the occurrence of an infinite computation is considered to be an error). [...] The time complexity is not considered in such cases" - so the *expected* time complexity is indeed `O(1)` while the (worst-case) time complexity is not considered here. – le_m Jun 02 '17 at 13:22
  • @le_m Not all, just very basic ones that have no future value and where OP will presumably be better off just reading a canonical answer to get a basic understanding of the topic at hand first. This appeared to fall into that category at first glance, but I may have missed the randomness aspect and closed it too quickly. This question still needs a rewrite for it to have much future value in terms of searchability and making the non-trivial nature of the question more noticeable, and perhaps such a rewrite is more than what anyone other than OP should be doing, which is a decent argument for closure – Bernhard Barker Jun 02 '17 at 14:00
  • You may want to look into [shuffling algorithms](https://en.wikipedia.org/wiki/Fisher%E2%80%93Yates_shuffle). – Bernhard Barker Jun 02 '17 at 14:15

1 Answer

Your function doesn't take any input, so I'm going to assume you want to get the complexity related to the size of the array that you create in the first for loop.

You don't multiply the two loops because they're not nested, they're sequential. So the complexity is the worst complexity of the two loops.

The for loop is O(n) because it simply increments x from 1 to 10000.

The while loop is more complex. It only decrements m when the random index i is still present in the array. When the array is mostly full, most picks will land on an existing index, but as it becomes more sparse, misses become more likely. So the number of iterations it takes to delete the next element and decrement m is inversely proportional to how full the array is: on average it takes 10000/fullness iterations per decrement. At the beginning this is near 1, while at the end it takes about 10,000 iterations. Summed over the whole run, the expected total is n/n + n/(n-1) + ... + n/1 = n·(1 + 1/2 + ... + 1/n) ≈ n ln n. This makes the while loop O(n log n) on average (there is no fixed worst-case bound, since an unlucky run can keep missing indefinitely).
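
To see this concretely, here is a minimal sketch (mine, not from the original post; the helper name countIterations and the test sizes are arbitrary) that counts how many iterations the question's rejection-sampling loop actually performs and compares that count with the n·H_n estimate, which for n = 10000 comes out to roughly 98,000:

// Same "pick a random index, retry if that slot was already deleted"
// strategy as randomGenerator(), but returning the number of loop
// iterations instead of the shuffled copy.
function countIterations(n) {
  var myArray = [];
  var x;
  for (x = 1; x <= n; x += 1) {
    myArray.push(x);
  }

  var m = myArray.length;
  var iterations = 0;
  var i;

  while (m) {
    iterations += 1;
    i = Math.floor(Math.random() * myArray.length);
    if (i in myArray) {
      delete myArray[i];
      m--;
    }
  }
  return iterations;
}

// Compare the measured count with n * H_n (the harmonic-number estimate).
[1000, 10000, 100000].forEach(function (n) {
  var harmonic = 0;
  for (var k = 1; k <= n; k += 1) {
    harmonic += 1 / k;
  }
  console.log(n, countIterations(n), Math.round(n * harmonic));
});

The measured count fluctuates from run to run, but it should land in the same ballpark as the estimate.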

`if()` is not a loop; it either executes the body or not, so it's O(1).

This makes the overall complexity O(n log n) in expectation.

Barmar
  • "You don't multiply the two for loops because they're not nested, they're sequential. So the complexity is the worst complexity of the two loops." I don't understand this line...I have only one 'for loop' in my program – neutralCreep Jun 01 '17 at 23:07
  • I meant the two loops, both the `for` and `while` loops. – Barmar Jun 01 '17 at 23:07
  • Ah, I understand now. Thank you for your time! – neutralCreep Jun 01 '17 at 23:12
  • It's fairly easy to hide O(n) (or even slower) operations in a single statement (like a "contains" check that does a linear search). Based on your answer (and my limited JavaScript knowledge), I'm assuming you're saying the if statement is O(1) because `in`, `push` and `delete` in JS are all O(1). – Bernhard Barker Jun 02 '17 at 08:31
  • @Dukeling That's true, you can't necessarily be sure about built-in operations and functions like that. If `push()` had to reallocate a copy of the array every time, it would be O(n), but AFAIK the JavaScript array implementation (like vector implementations in many languages) allocates enough spare capacity so that most of the time it doesn't, so it's O(1) amortized. – Barmar Jun 02 '17 at 19:55
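
To make that reallocation argument concrete, here is a minimal sketch (my own illustration, not how any real JavaScript engine implements `Array`; the name GrowingArray and the starting capacity of 4 are arbitrary) of a vector-style container that doubles its backing store whenever it fills up:

// Illustrative dynamic array: when the backing store is full, allocate a new
// one of twice the size and copy the elements over. The copy is O(n), but it
// happens so rarely that the cost averages out to O(1) per push (amortized).
function GrowingArray() {
  this.capacity = 4;
  this.length = 0;
  this.store = new Array(this.capacity);
}

GrowingArray.prototype.push = function (value) {
  if (this.length === this.capacity) {
    var bigger = new Array(this.capacity * 2);
    for (var j = 0; j < this.length; j += 1) {
      bigger[j] = this.store[j]; // O(n) copy, but only on each doubling
    }
    this.store = bigger;
    this.capacity *= 2;
  }
  this.store[this.length] = value;
  this.length += 1;
};

// Pushing n items triggers copies of size 4, 8, 16, ..., which sum to less
// than 2n in total, so the average cost per push stays constant.
var a = new GrowingArray();
for (var k = 1; k <= 10000; k += 1) {
  a.push(k);
}
console.log(a.length); // 10000

This is why the cost per push works out to O(1) amortized even though an individual push can be O(n).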