0

I am trying to figure out the best way to measure the time taken to execute a statement. For example, what is the time taken by a JS statement such as `if (txt == 'abc')`? Whether I use a `console.time('test')`/`console.timeEnd('test')` pair or the simple JS code given below to calculate the time, the console output varies every time the code is executed.

Also, the difference shown on the console is sometimes 1 digit, sometimes 2 or 3 digits in milliseconds, i.e. sometimes 1 ms, 60 ms & sometimes 800 ms, and the order is inconsistent. I even tried to run the test only once, close the browser & run it again, so that GC & other variables do not come into the picture to distort the times, but the result is the same. A few ms of variation is understandable, but such a huge difference makes it impossible to get a clear timing. So what exactly is the best & most consistent way to measure the time taken by the interpreter? How can we find a reliable yardstick to prove which statement performs better when written a certain way?

var start = (new Date).getMilliseconds();
if(txt =='abc'){};
var diff = (new Date).getMilliseconds() - start;
console.log(diff);

*All tests were conducted in FF. Different approaches such as getMilliseconds, getTime & console.timeEnd were tried one after the other.*
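For reference, a single comparison is far too fast to time on its own; one common workaround is to run the statement many times in a loop and divide the total by the iteration count. A minimal sketch (here `txt` stands in for the question's variable, and the iteration count `N` is an arbitrary choice):

```javascript
// Time N iterations of the statement and report the average,
// rather than trying to time a single execution.
var txt = 'abc';     // placeholder for the variable in the question
var N = 1000000;     // arbitrary iteration count

var start = Date.now();   // full millisecond timestamp
for (var i = 0; i < N; i++) {
  if (txt == 'abc') {}
}
var elapsed = Date.now() - start;

console.log('total: ' + elapsed + ' ms');
console.log('per iteration: ' + (elapsed / N) + ' ms');
```

Averaging over many iterations smooths out one-off pauses (GC, other tabs, the scheduler) that dominate any single reading.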

user593029
  • 511
  • 7
  • 18
  • You might find some help from http://stackoverflow.com/questions/2140716/how-to-profile-and-and-get-javascript-performance – NG. Aug 24 '12 at 15:10
  • The single if-statement will need an unmeasurable amount of time. It depends on the scheduler whether the execution is halted and some other computations (UI thread, garbage collection etc) will take place before going on… – Bergi Aug 24 '12 at 15:12
  • Use a tool like [jsperf](http://jsperf.com). – Sirko Aug 24 '12 at 15:13
  • I think the discussion concentrates on memory leaks, but I avoided those during my testing by closing the browser & running it fresh, so that could not be the issue hampering the time calculation – user593029 Aug 24 '12 at 15:13
  • Let me know if you've found a consistent, reliable way to benchmark various JS engines... you'd be the first person to work it out – Elias Van Ootegem Aug 24 '12 at 15:16
  • Right now I am working on only one, i.e. FF, & all the results described occur in FF – user593029 Aug 24 '12 at 15:17

2 Answers

1

Erm.. you're using two very different functions to retrieve current timestamps.

Date.prototype.getMilliseconds()

returns another (shorter) number than

Date.prototype.getTime()

So that is not a good idea in general and is most likely the reason for your problems. If you want to measure in-code, you should always go with .getTime() or, better, Date.now(). Both return a full timestamp number.

jAndy
  • 231,737
  • 57
  • 305
  • 359
  • It does not matter; whichever one you try, every time you run, or even close the browser and rerun, the test result varies drastically without following any pattern – user593029 Aug 24 '12 at 15:15
  • It matters a lot. Mixing those two methods can never work correctly. – jAndy Aug 24 '12 at 15:17
  • They were tried one after the other, so where does the mixing come into the picture? – user593029 Aug 24 '12 at 15:18
  • @user593029: however, I can't re-create your issue when using `Date.now()` or `new Date().getTime()`. It's always 0 or 1 ms. – jAndy Aug 24 '12 at 15:21
  • I think you got it wrong from my code (there was a typo): I have modified it from getTime to getMilliseconds. If you use getTime you will get 0 ms every time, so try getMilliseconds to get values in ms – user593029 Aug 24 '12 at 15:22
  • @user593029: works for me too. Always `0`. Anyway, measuring with `.getMilliseconds()` is still a bad idea, since it only returns numbers 0–999 and is therefore very inaccurate. – jAndy Aug 24 '12 at 15:26
0

The point here is that the time to execute this instruction is in most cases much less than 1 ms, so I don't understand why you are getting those differences. If I run this on my machine:

var txt = 'abc';                  // assuming txt is defined as in the question
for (var x = 0; x < 100; x++) {
    var start = (new Date).getMilliseconds();
    if (txt == 'abc') {}
    var diff = (new Date).getMilliseconds() - start;
    console.log(diff);            // console.log instead of shell-only print()
}

I get:

0
0
0
0
...
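The per-run readings above round down to 0 ms; timing the whole loop instead gives a measurable number. A sketch using the `console.time`/`console.timeEnd` pair mentioned in the question (the label `'loop'` and the iteration count are arbitrary choices):

```javascript
// Time the entire loop with a labelled timer; the same label must be
// passed to both console.time and console.timeEnd.
var txt = 'abc';
var hits = 0;

console.time('loop');
for (var i = 0; i < 1000000; i++) {
  if (txt == 'abc') { hits++; }
}
console.timeEnd('loop'); // prints the elapsed time, e.g. "loop: 3ms"
```

Note the method is `console.timeEnd`, not `console.endTime`; a mismatched or misspelled name silently fails to report anything in some consoles.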