I have a function that needs to run on every keypress, and I want to know if it is doing too much work. To find that out I would time it. If it takes less than a millisecond, I would have to run it in a loop 1,000 to 1,000,000 times and then divide the total by the iteration count.
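For reference, this is the kind of timing loop I have in mind. It's just a rough sketch using performance.now(); handleKeypress and the iteration count are placeholders for my own code:

    // Placeholder standing in for the real keypress handler.
    function handleKeypress() {
      // ... the work done on every keypress ...
    }

    // Rough sketch: average the cost of a sub-millisecond function
    // by running it many times and dividing by the iteration count.
    function averageTimeMs(fn, iterations) {
      const start = performance.now();
      for (let i = 0; i < iterations; i++) {
        fn();
      }
      const elapsed = performance.now() - start;
      return elapsed / iterations; // average ms per call
    }

    const ITERATIONS = 100000; // made-up number, just enough to get above 1 ms total
    console.log(averageTimeMs(handleKeypress, ITERATIONS) + " ms per call");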
While this may seem broad, I'm going to ask anyway: what is the maximum amount of time a function should take to run in 2015? And is there a standard?
In video games, if you want to hit 30 frames per second, you have about 33 milliseconds to play with, since 1 second equals 1000 milliseconds and 1000 ms / 30 frames ≈ 33 ms per frame. If you aim for 60 frames per second, you only have about 16.7 milliseconds to work with. So at what point would you say to a fellow software developer that their function takes too long? After 1 millisecond? 100 milliseconds?
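To make the frame-budget idea concrete, here is a small sketch of how I would check whether per-frame work stays inside that budget. The 16.7 ms figure assumes a 60 fps target (use 33.3 for 30 fps), and doFrameWork is a placeholder:

    // Sketch: warn whenever one frame's work exceeds the budget.
    const FRAME_BUDGET_MS = 1000 / 60; // ~16.7 ms for a 60 fps target

    function doFrameWork() {
      // ... placeholder for the per-frame work ...
    }

    function onFrame() {
      const start = performance.now();
      doFrameWork();
      const elapsed = performance.now() - start;
      if (elapsed > FRAME_BUDGET_MS) {
        console.warn("Frame took " + elapsed.toFixed(2) + " ms, budget is " +
                     FRAME_BUDGET_MS.toFixed(2) + " ms");
      }
      requestAnimationFrame(onFrame);
    }

    requestAnimationFrame(onFrame);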
For example, if you were to use jQuery to find an element on a page with 1,000 elements, how long should that call take? Is 100 ms too long? Is 1 ms too long? What standards do people go by? I've heard that database administrators have a standard target time for queries.
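As an example of what I mean, here is roughly how I would time that jQuery lookup. The selector is made up for illustration, and it assumes jQuery is already loaded on the page:

    // Sketch: time a single jQuery lookup on a page with many elements.
    // ".some-class" is a made-up selector for illustration.
    const t0 = performance.now();
    const matches = $(".some-class");
    const t1 = performance.now();
    console.log("Found " + matches.length + " elements in " + (t1 - t0) + " ms");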
Update:
I don't really know how to word this question the way I meant it. The reason I ask is that I don't know whether I need to optimize a function or just move on. And when I do time a function, I don't know whether the result counts as fast or slow. So for now I'm going to ignore anything that takes less than a millisecond and work on anything that takes longer.
Update 2:
I found a post that describes a scenario. Look at the numbers in this question. It takes 3 ms to query the database. If I were a database administrator, I would want to know whether that is too long to scale to a million users. I would want to know how long a connection or a query should take before I add another server to help with load balancing.