I am trying to figure out the best way to measure the time taken to execute a statement. For example, what is the time taken by a JS statement such as if(txt == 'abc')? Whether I use the console.time('test') / console.timeEnd('test') pair or the simple JS code given below, the value logged to the console varies every time the code is executed.
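For reference, this is roughly how I am using the console.time pair (the variable txt and the label 'test' are just placeholders for illustration):

var txt = 'abc';            // sample value, only for illustration
console.time('test');       // start a named timer
if (txt == 'abc') {}        // statement under test
console.timeEnd('test');    // logs "test: <elapsed> ms" to the console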
The difference shown on the console also ranges from one to three digits in milliseconds, i.e. sometimes 1 ms, sometimes 60 ms and sometimes 800 ms, and the order is inconsistent. I even tried running the test only once, then closing the browser and running it again, so that garbage collection and leftover variables would not distort the timings, but the result is the same. A variation of a few milliseconds is understandable, but such a huge spread makes it impossible to get a clear reading of the time taken. So what exactly is the best and most consistent way to measure the time the interpreter spends executing a statement? What is a reliable yardstick to prove that a statement performs better when written one way rather than another?
var start = new Date().getMilliseconds();   // milliseconds field of the current time (0-999)
if (txt == 'abc') {}                        // statement under test
var diff = new Date().getMilliseconds() - start;
console.log(diff);
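The getTime()-based variant I also tried looks roughly like this (getTime() returns the full timestamp in milliseconds since the epoch, so the subtraction gives elapsed time directly; txt is defined as above):

var startMs = new Date().getTime();             // full timestamp in ms since the epoch
if (txt == 'abc') {}                            // statement under test
var elapsedMs = new Date().getTime() - startMs;
console.log(elapsedMs);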
*All tests were conducted in FF. The different approaches (getMilliseconds, getTime and console.timeEnd) were each tried one after the other.*