1

If I do a simple loop with a while, my ++ operator seems to be applied only after the first iteration. Does this mean that my while creates a small scope that saves my i, and only applies the ++ to it after it finishes evaluating the condition? Why does this happen with this operator (and also with --), but not with, say, a simple sum i + 1?

var i = 0;
while(i++ < 1) { console.log(i) }

My output is 1

var i = 1;
while(i-- > 1) { console.log(i) }

My output is 0

var i = 0;
while(i + 1 < 1) { console.log(i) }

My output is undefined

Santiago Suárez
  • 586
  • 7
  • 12
  • There’s a difference between `i++` and `++i`. – Sebastian Simon Oct 28 '15 at 20:34
  • i++ is postfix increment operator. The value is assigned and then incremented. https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Arithmetic_Operators – danronmoon Oct 28 '15 at 20:37
  • `while(i + 1 < 1)` try with `while(i + 1 < 2)` – Anonymous0day Oct 28 '15 at 20:38
  • 1
    http://stackoverflow.com/questions/484462/difference-between-i-and-i-in-a-loop – epascarello Oct 28 '15 at 20:40
  • the while statement never gets run because (i + 1 < 1) is a false statement. console.log(i) gets put on the event queue with a reference to an execution context that contains a reference to i, but because the while statement never runs, that execution context is destroyed before the console.log statement gets put back on the stack. So by the time it runs, the i it is referencing refers to an execution context that is no longer there, hence the undefined return. This has a lot to do with how closures work in javascript. – TimCodes Oct 28 '15 at 20:43

3 Answers

2

i++ first evaluates and then increments.

++i first increments and then evaluates whatever expression it is in.
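For illustration (this snippet is mine, not from the question), the two forms yield different values in the same expression:

```javascript
// Postfix vs. prefix increment: what each expression evaluates to
let a = 0;
let b = 0;
const post = a++; // post gets the OLD value (0); a becomes 1 afterwards
const pre = ++b;  // b becomes 1 FIRST; pre gets the new value (1)
console.log(post, a); // 0 1
console.log(pre, b);  // 1 1
```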

var i = 0;
while(i++ < 1) { console.log(i) }

First the condition compares 0 < 1 (true) and i is incremented to 1; the loop is entered and prints 1, so console.log() runs once. The next iteration compares 1 < 1, which is false. Do note that i is 2 in the end, because this second, failing comparison increments it again.
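You can make that trace visible by collecting what the body sees (a small sketch of my own, not from the question):

```javascript
// Trace of while (i++ < 1): what the body observes, and i's final value
let i = 0;
let seen = [];
while (i++ < 1) { seen.push(i); }
console.log(seen); // [1] — the body ran once and saw the incremented value
console.log(i);    // 2 — the failing comparison (1 < 1) still incremented i
```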

var i = 1;
while(i-- > 1) { console.log(i) }

Here the condition evaluates to 1 > 1, which is false, so you will not get a console.log. But i will be 0 in the end, because it gets decremented after being compared.
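Checking i afterwards confirms the decrement happened even though the body never ran (again, my own check, not part of the answer's original code):

```javascript
// The body never runs, yet i-- in the condition still fires once
let i = 1;
while (i-- > 1) { console.log(i); } // 1 > 1 is false; loop body is skipped
console.log(i); // 0 — the decrement happened during the failed check
```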

var i = 0;
while(i + 1 < 1) { console.log(i) }

Here no incrementation happens at all: i + 1 is just an expression that evaluates to 1 without modifying i. So you get 1 < 1, which doesn't evaluate as true, thus you don't enter the loop. i is still 0 in the end because it was never modified.
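That is the key contrast with the ++ and -- cases: plain addition never writes back to the variable. A quick check (mine, not from the answer):

```javascript
// i + 1 produces a value but never assigns to i
let i = 0;
while (i + 1 < 1) { console.log(i); } // 1 < 1 is false; body never runs
console.log(i); // 0 — unlike i++, the expression i + 1 leaves i untouched
```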

George Irimiciuc
  • 4,573
  • 8
  • 44
  • 88
1

For the first statement, the order of execution is:

  1. evaluate i (currently 0), then increment i to 1 — but compare the old value: is 0 < 1? Yes, so enter the loop. (The increment happens every time the condition is evaluated, whether or not the comparison succeeds.)
  2. output i (now 1)
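The steps above can be spelled out by desugaring the post-increment into a temporary variable (a sketch of my own, equivalent to the original loop):

```javascript
// An equivalent of `while (i++ < 1) { console.log(i) }` with i++ expanded
let i = 0;
while (true) {
  const old = i;          // i++ yields the value BEFORE incrementing
  i = i + 1;              // the increment itself, done during the check
  if (!(old < 1)) break;  // the comparison uses the old value
  console.log(i);         // prints 1 on the single iteration
}
console.log(i); // 2 — incremented once per condition check, including the last
```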

The second case is largely the same deal, but your final statement quite clearly never executes: we take zero, add 1 to it, and then compare to 1. Thus your statement is the equivalent of while (1 < 1), which is clearly never true.

Olipro
  • 3,489
  • 19
  • 25
0

I think you're confusing i++ and ++i (this isn't because of the loop):

  • i++ evaluates to i and then increments i
  • ++i increments i and then evaluates to i

Some examples:

var i = 0;
console.log(i++); // ⇒ 0
console.log(i);   // ⇒ 1

i = 0;
console.log(++i); // ⇒ 1
console.log(i);   // ⇒ 1

i = 1;
console.log(i++); // ⇒ 1
console.log(i);   // ⇒ 2

You can find more information here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Arithmetic_Operators#Increment_()

Leo2807
  • 64
  • 3