
I have a function that needs to run on every keypress and I want to know if it is doing too much work. To find that out I would time it. If a single call takes less than a millisecond, I would have to run it 1,000 to 1,000,000 times in a loop and divide the total elapsed time by the iteration count to get a meaningful number.
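Something like this is what I mean (the doWork function is just a stand-in for my real keypress handler, and the iteration count is arbitrary):

```javascript
// Stand-in for the real keypress handler; the name is just a placeholder.
function doWork() {
  let total = 0;
  for (let i = 0; i < 1000; i++) {
    total += Math.sqrt(i);
  }
  return total;
}

// A single call may be below the timer's resolution, so run it many
// times and divide the total elapsed time by the iteration count.
const iterations = 100000;
const start = performance.now();
for (let i = 0; i < iterations; i++) {
  doWork();
}
const elapsed = performance.now() - start;
console.log(`average: ${(elapsed / iterations).toFixed(4)} ms per call`);
```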

While this may seem broad I'm going to ask anyway, what is the max amount of time a function should take to run in 2015? And is there a standard?

In video games, if you want to hit 30 frames per second then you have about 33 milliseconds to play with, since 1 second equals 1000 milliseconds and 1000 ms / 30 frames ≈ 33 ms per frame. If you aim for 60 frames per second you only have roughly 16 milliseconds to work with (1000 ms / 60 frames ≈ 16.7 ms). So at what point would you say to a fellow software developer that their function takes too long? After 1 millisecond? 100 milliseconds?
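To make that budget concrete, something like this (just an illustration, not any kind of standard) would log every frame that runs over the 60 fps budget:

```javascript
// Log any frame that takes longer than the ~16.7 ms budget for 60 fps.
const FRAME_BUDGET_MS = 1000 / 60;
let lastFrame = performance.now();

function monitorFrames(now) {
  const frameTime = now - lastFrame;
  if (frameTime > FRAME_BUDGET_MS) {
    console.warn(`Long frame: ${frameTime.toFixed(1)} ms (budget ${FRAME_BUDGET_MS.toFixed(1)} ms)`);
  }
  lastFrame = now;
  requestAnimationFrame(monitorFrames);
}

requestAnimationFrame(monitorFrames);
```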

For example, if you were to use jQuery to find an element on a page with 1,000 elements, how long should that call take? Is 100 ms too long? Is 1 ms too long? What standards do people go by? I've heard that database administrators have a standard search time for queries.
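For instance, I could time the lookup directly like this (assuming jQuery is loaded on the page and using a made-up .target-class selector):

```javascript
// Time one jQuery lookup on the current page.
console.time('jquery-find');
const matches = $('.target-class');
console.timeEnd('jquery-find'); // prints something like "jquery-find: 0.3ms"
console.log(`found ${matches.length} element(s)`);

// For sub-millisecond lookups, average over many runs instead.
const runs = 10000;
const t0 = performance.now();
for (let i = 0; i < runs; i++) {
  $('.target-class');
}
console.log(`average: ${((performance.now() - t0) / runs).toFixed(4)} ms per lookup`);
```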

Update:
I really don't know how to word this question the way I meant it. The reason I ask is that I don't know whether I need to optimize a function or just move on, and when I do time a function I don't know whether the result counts as fast or slow. So I'm going to ignore it if it's less than a millisecond and work on it if it's longer than a millisecond.

Update 2:
I found a post that describes a scenario; look at the numbers in that question. It takes 3 ms to query the database. If I were a database administrator I would want to know whether that is too long to scale to a million users, and how long connecting to the database or performing a query should take before I add another server to help balance the load.

1.21 gigawatts
    There is no limit for me. Your function can never return. But for other people, I think faster execution is better... – apocalypse Oct 08 '15 at 07:49
  • I came to the conclusion that if it's less than a millisecond I'm not going to worry about it. If it takes longer than that I'll refactor it. BTW the profiling tools I have at my disposal only measure in millisecond accuracy. So when I call getTimer() before the call and then call getTimer() after the call it may show the difference in time as zero. – 1.21 gigawatts Oct 08 '15 at 21:05

1 Answer


Okay. I think you're asking the wrong question. Or maybe thinking about the problem in a way that's not likely to yield productive results.

There's absolutely no rule that says, "your function should return after no more than X milliseconds". There are plenty of robust web applications that utilize functions that may not return for 250 milliseconds. That can be okay, depending on the context.

And keep in mind that a function that runs for, say, 3 milliseconds on your dev machine may run much faster or much slower on someone else's machine.

But here are some tips to get you thinking a little bit more clearly:

1) It's all about the user. Really (and I kind of hesitate to say this because someone is going to take me too literally and either write bad code, or start a flame war with me), but as long as the performance of your code doesn't affect the user experience, your functions can take as long as you want to do their business. I'm not saying you should be lazy in your code optimization; I'm saying as long as the user doesn't perceive any delay, you don't really need to stress about it.

2) Ask yourself two questions: a) How often is the function going to run? and b) How many times is it going to run in quick succession, with no breaks in between (i.e. is it going to run in a loop)?

In your example of calling a function every time the user types a key, you can base the decision about "how long is too long to finish a function" on how often the user is going to hit a key (and if the handler really is doing too much per keystroke, see the debounce sketch after this point). Again, if it doesn't fuss with the user's ability to use your application effectively, you're wasting time haggling over 3 milliseconds.

On the other hand, if your function is going to get called inside a loop that runs 100,000 times, then you definitely want to spend some time making it as lean as possible.
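On the keypress side, one common trick (just a generic sketch, not a rule) is to debounce the handler so the expensive work only runs once the user pauses typing:

```javascript
// Generic debounce: delays `fn` until `delayMs` have passed without another call.
function debounce(fn, delayMs) {
  let timerId;
  return function (...args) {
    clearTimeout(timerId);
    timerId = setTimeout(() => fn.apply(this, args), delayMs);
  };
}

// Hypothetical expensive handler; only runs once typing pauses for 200 ms.
const onKeyPress = debounce((event) => {
  console.log('Processing input:', event.target.value);
}, 200);

// '#search-box' is a made-up element id, purely for illustration.
document.querySelector('#search-box').addEventListener('input', onKeyPress);
```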

3) Utilize promises where appropriate. Javascript promises are a nice feature (although they're not unique to Javascript). Promises are a way of saying, "listen, I don't know how long this function is going to take. Ima go start working on it, and I'll just let you know whenever I'm finished." Then the rest of your code can keep on executing, and whenever the promise is fulfilled, you are notified and can do something at that point.

The most common example of promises is the AJAX design pattern. You never know how long a call to the back end might take, so you just say "alright, let me know when the backend responds with some useful information".

Read more about promises here
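Here's a minimal illustration of that pattern using fetch and a made-up /api/search endpoint:

```javascript
// fetch() returns a promise; the rest of the code keeps running while we wait.
function searchBackend(term) {
  return fetch(`/api/search?q=${encodeURIComponent(term)}`)
    .then((response) => response.json());
}

searchBackend('widgets')
  .then((results) => {
    console.log('Backend responded:', results);
  })
  .catch((error) => {
    console.error('Request failed:', error);
  });

console.log('This line runs before the backend responds.');
```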

4) Benchmarking for fun and profit. As mentioned above, sometimes you really do need to shave off every last miserable millisecond. In those cases I've found jsperf.com to be very useful. They have made it easy to quickly test two pieces of code to see which one runs faster. There are tons of other benchmarking tools, but I like that one a lot. (I realize this is a bit of a tangent to the original post, but at some point somebody else is going to come along and read this, so I feel it's relevant.)
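If you want a quick jsperf-style comparison right in the console, a crude harness like this works (the two snippets being compared are just placeholders):

```javascript
// Crude A/B timing of two candidate implementations.
function timeIt(label, fn, iterations = 100000) {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) {
    fn();
  }
  console.log(`${label}: ${(performance.now() - start).toFixed(2)} ms total`);
}

const numbers = Array.from({ length: 1000 }, (_, i) => i);

timeIt('for loop sum', () => {
  let sum = 0;
  for (let i = 0; i < numbers.length; i++) sum += numbers[i];
  return sum;
});

timeIt('reduce sum', () => numbers.reduce((sum, n) => sum + n, 0));
```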

After all is said and done, remember to keep relating this question to the user. Optimize for the user, not for your ego, or for some arbitrary number.

cutmancometh