
The question is in the title, but here is a longer explanation.

A long time ago I learned some nice JavaScript functions like `reduce`, `filter`, `map`, and so on. I really liked them and started using them frequently (they look stylish, and I assumed that because they are native functions they would be faster than my old `for` loops).

Recently I needed to perform some heavy JS computations, so I decided to check how much faster they are, and to my surprise they are not faster at all; they are much, much slower (from 3 to 25 times slower).
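A minimal sketch of the kind of timing comparison involved (the names `withLoop`/`withMap` are mine, not from the actual jsperf tests, and the absolute numbers vary a lot by engine and version):

```javascript
// Build a large input array.
var data = [];
for (var i = 0; i < 1000000; i++) data.push(i);

function withLoop(arr) {
  var out = new Array(arr.length);
  for (var j = 0; j < arr.length; j++) {
    out[j] = arr[j] * 2;           // plain loop: no function call per item
  }
  return out;
}

function withMap(arr) {
  return arr.map(function (x) {    // one callback invocation per item
    return x * 2;
  });
}

console.time('for loop');
withLoop(data);
console.timeEnd('for loop');

console.time('map');
withMap(data);
console.timeEnd('map');
```

Both functions produce the same result; only how the per-element work is dispatched differs.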

Also, I have not checked every function, but here are my jsperf tests for:

So why are native functions so much slower than old loops, and what was the point of creating them if they don't do anything better?

I assume that the speed loss is due to the invocation of a function inside each of them, but that still does not justify such a loss. Also, I cannot see why code written with these functions is more readable, not to mention that they are not supported in every browser.
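That per-element function invocation is easy to observe; a small sketch counting callback calls:

```javascript
// map invokes its callback once per element.
var calls = 0;
var doubled = [1, 2, 3].map(function (x) {
  calls += 1;
  return x * 2;
});
// doubled is [2, 4, 6] and calls is 3: the engine pays for a function
// call (arguments, scope) on every element, which a plain for loop
// with inline code avoids.
```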

Salvador Dali
  • It's an interesting question, but I'd suspect it's related to the browser internals (and a possible lack of optimisation due to their recent creation/implementation) rather than a code problem as such; I'm not voting to close (because I'm interested in whatever the answer might be), but this sort of potentially open-ended discussion *might* be better suited to [programmers.se]. – David Thomas Feb 13 '14 at 08:24
  • Interesting! Is it slow in all browsers? – Konza Feb 13 '14 at 08:32
  • I have only checked Chrome and Firefox (and it is slower there), but you can test it yourself in IE or Opera – Salvador Dali Feb 13 '14 at 08:38
  • `.filter` - oops, 10 times slower (missed that it's in FF): http://jsperf.com/filter-and-loop/4 – zerkms Feb 13 '14 at 08:45
  • It was simply not optimized yet at the JIT level; it has to create a closure each time even if it's not used. Eventually, it'll be optimized. – Benjamin Gruenbaum Apr 25 '14 at 08:46
  • @BenjaminGruenbaum can you please explain further? What do you mean, it will be optimized? These functions have been there for a couple of years and nothing has changed. Why do you think it will change in the near future? – Salvador Dali Apr 25 '14 at 08:57
  • @SalvadorDali there is a lot of room to improve those functions' speed, it's just not a big priority yet compared to other work tasks. – Benjamin Gruenbaum Apr 25 '14 at 09:03
  • @BenjaminGruenbaum are you a core dev of some of these browsers? I am just curious how do you know what are the priorities? – Salvador Dali Apr 25 '14 at 09:31
  • @SalvadorDali for one thing, you can go to (for example) the v8 bug tracker and look at the issue priorities; it's all open. The code base, and the discussions around it, are open too. As for being a developer, I'm not a 'core' developer of any of those browsers. While I happen to have friends who are hacking on different browsers, I'd like to emphasize that getting to those people isn't particularly hard; you can easily (and are welcome to!) participate, raise issues that bother you, and help: https://code.google.com/p/v8/wiki/Contributing . – Benjamin Gruenbaum Apr 25 '14 at 09:37

1 Answer


I think at some point it comes down to the fact that these native functions are more syntactic sugar than they are optimizations.

It's not the same as, say, using `Array.prototype.splice` rather than looping over the array and doing it yourself, where the implementation is obviously able to do far more under the hood (in memory) than you yourself could.

At some point, with `filter`, `reduce`, and `map`, the browser has to loop over your array and perform some operation on each contained value (just as you do with a loop). It can't reduce the amount of work needed to achieve the same end (it's still looping and performing an operation), but it can give you a more pleasing API and provide error checking, etc., which adds to the time.
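To illustrate, here is a rough ES5 sketch of what an engine must do for `map` (heavily simplified from the ECMAScript algorithm; `myMap` is a hypothetical name, not a real API):

```javascript
// Simplified stand-in for the work Array.prototype.map performs.
function myMap(arr, callback, thisArg) {
  if (typeof callback !== 'function') {
    throw new TypeError(callback + ' is not a function'); // error checking
  }
  var len = arr.length >>> 0;
  var result = new Array(len);
  for (var i = 0; i < len; i++) {
    if (i in arr) {                       // the spec skips holes in sparse arrays
      // One dynamic function call per element, with thisArg binding
      // and three arguments built each iteration.
      result[i] = callback.call(thisArg, arr[i], i, arr);
    }
  }
  return result;
}
```

Inside, there is still an ordinary loop; the checks, the callback dispatch, and the extra arguments are where the overhead relative to a hand-written loop comes from.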

Dormouse
  • Syntactic sugar in a language is just a wrapper around another construct, and therefore it cannot have such a big computational penalty. – Salvador Dali Feb 13 '14 at 08:49