29

I wonder if there is a better-performing way of generating the partial sums of an array.

Given an array, say x = [ 0, 1, 2, 3, 4, 5 ], I generated the prefix sub-arrays and then computed the sum of each, which gives:

[ 0, 1, 3, 6, 10, 15 ]

So the full code is:

x.map((y,i)=>x.filter((t,j)=>j<=i))
 .map(ii=>ii.reduce((x,y)=>x+y,0))

I wonder whether flatMap or some other array method offers a solution that does not require expanding each sub-array.

qwr
  • By the way, this is called *prefix-sum* or *scan*. Many collections frameworks have it in the standard library, but unfortunately ECMAScript does not. – Jörg W Mittag Jun 10 '19 at 00:37
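
For illustration, a minimal sketch of what such a scan could look like as a plain helper (the names scanArray and f are hypothetical; nothing here is a real ECMAScript API):

const scanArray = (xs, f, init) => {
  // Running fold: keep an accumulator and record every intermediate value.
  let acc = init;
  return xs.map(x => acc = f(acc, x));
};

console.log(scanArray([0, 1, 2, 3, 4, 5], (a, b) => a + b, 0));
// [ 0, 1, 3, 6, 10, 15 ]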

11 Answers

23

Much better, by keeping a running total:

function* partialSums(iterable) {
    let s = 0;

    for (const x of iterable) {
        s += x;
        yield s;
    }
}

const x = [0, 1, 2, 3, 4, 5];
console.log(Array.from(partialSums(x)).join(', '));

Linear time, online. You can also produce an array directly:

const partialSums = arr => {
    let s = 0;
    return arr.map(x => s += x);
};

const x = [0, 1, 2, 3, 4, 5];
console.log(partialSums(x).join(', '));
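
As noted in the comments below, this generalizes from a running sum to a general scan by replacing s += x with acc = fn(acc, x). A hedged sketch (the name scan and its signature are illustrative, not standard):

function* scan(fn, acc, iterable) {
    // Fold each element into the accumulator and yield every
    // intermediate result; still linear time and online.
    for (const x of iterable) {
        acc = fn(acc, x);
        yield acc;
    }
}

console.log(Array.from(scan((a, b) => a + b, 0, [0, 1, 2, 3, 4, 5])).join(', '));
// 0, 1, 3, 6, 10, 15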
Ry-
  • Urgh... `yield` is such a weak word. Add each item to an array and **`return`** it like a man. – Malekai Jun 10 '19 at 15:13
  • @LogicalBranch: That wouldn’t be online, so you may as well use the second code snippet then. – Ry- Jun 10 '19 at 15:14
  • @LogicalBranch Why? Iterators FTW in 2019, no? – spender Jun 10 '19 at 19:20
  • @LogicalBranch Iterators are the way to go when doing partial sums if you have a large array or you only need up to a certain value. Why do you dislike yield? I know it's not as in-your-face as return but it is descriptive of what it's doing. – HSchmale Jun 11 '19 at 02:28
  • This is a nice imperative answer. However, it doesn't fit a question tagged with functional programming. This is merely a 1:1 allocation of a specific helper function to a very specific problem. Zero generalization, nothing learned. – Jun 11 '19 at 11:26
  • @bob: It generalizes well to `scan` and it’s FP because it’s a function. What’s learned: cargo cult FP by shoehorning any task into the magic array functions and avoiding all mutation no matter how local in JavaScript is harmful. – Ry- Jun 11 '19 at 15:16
  • @Ry- I just knew there would be a concise approach that I was missing and you hit it. Thanks aplenty. – Patrick Francis Omogbeme Jun 11 '19 at 17:04
  • @Ry- Umm, so in the end you wind up with `scan`, which is implemented in another answer to this very question and which you happened to downvote, I guess, as you downvoted at least one other answer. – Jun 11 '19 at 19:14
  • @bob: I removed my downvote after the non-quadratic-time version was added (even though it isn’t functional by your definition). But yes, the nice way to end up with `scan` is by replacing `s += x` with `acc = fn(acc, x)` in this generator function. – Ry- Jun 11 '19 at 19:22
  • @Ry- Cargo cult, what a nice academic term. However, no one seriously says local mutations are harmful. No one seriously says there is purity in referential-identity-JavaScript. I would even claim that `Array.prototype.concat` is one of the most harmful methods in JS, since it pretends as if `Array`s are persistent data structures. There is a lot of justified criticism of FP in JS. So please chime in; justified criticism is always appreciated. – Jun 11 '19 at 19:25
  • @bob: So if you agree that repeated `concat` (and equivalently, spread) are harmful ways to pretend one’s being more functional while making complicated solutions to problems that can be solved by creating `flatMap` or `scan`, and that local mutations are good ways to implement necessary functions in practical JavaScript… what is it that you take issue with? – Ry- Jun 11 '19 at 19:30
6

You can simply use a for loop with a variable to keep track of the running sum:

let x = [ 0, 1, 2, 3, 4, 5 ]

let sum = (arr) => {
  let total = 0
  let final = []
  for (let i = 0; i < arr.length; i++) {
    total += arr[i]
    final.push(total)
  }
  return final
}

console.log(sum(x))

You can also use map:

let x = [0, 1, 2, 3, 4, 5]

let sum = (arr) => {
  let total = 0
  return arr.map(current => total += current)
}

console.log(sum(x))
Peter Mortensen
Code Maniac
  • That second code snippet looks strikingly familiar, down to the fact that it’s hidden. – Ry- Jun 10 '19 at 15:15
  • @Ry- and the line above states `you can also use map`; all I wanted was to show both the conventional and the functional approach. If people feel that's a valid reason to downvote, I am happy with it :) – Code Maniac Jun 10 '19 at 16:13
6

flatMap won't be useful in your case, because you're not trying to flatten partial results coming as lists, but we can solve your problem with a single reduce:

[0, 1, 2, 3, 4, 5]
.reduce(
   ([arr, sum], el) => { // Pass along the array and the running sum
       const next = sum + el
       return [[...arr, next], next]
   },
   [[], 0] // Seed the reduce with an empty array and a 0 accumulator for the running sum
)[0] // The reduce returns [array, lastSum], so take only the first element

It also iterates the array only once, so it might be a little more performant than a solution creating slices and then summing them. Note, though, that spreading [...arr, next] copies the accumulated array on every step, so this version is still quadratic overall; the push version below avoids that.

Or a version with Array.prototype.push, which reuses the same array:

[0, 1, 2, 3, 4, 5]
.reduce(
   ([arr, sum], el) => { // Pass along the array and the running sum
       const next = sum + el
       arr.push(next)
       return [arr, next]
   },
   [[], 0] // Seed the reduce with an empty array and a 0 accumulator for the running sum
)[0]
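
For convenience, the push-based version can be wrapped as a reusable helper; a sketch (the name partialSums is purely illustrative):

const partialSums = xs =>
    xs.reduce(([arr, sum], el) => {
        const next = sum + el
        arr.push(next)
        return [arr, next]
    }, [[], 0])[0]

console.log(partialSums([0, 1, 2, 3, 4, 5])) // [ 0, 1, 3, 6, 10, 15 ]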
Krzysztof Atłasik
  • This will be equally slow (O(n²)) since it copies the array each time it wants to add something to it. – Ry- Jun 10 '19 at 15:15
  • @Ry- you can easily replace it with `push`, but it will be mutating the array, which would make it not purely functional. Have you seen the tag `functional-programming` on the question? – Krzysztof Atłasik Jun 10 '19 at 15:48
  • Nothing is purely functional. You own the array. It’s completely fine. – Ry- Jun 10 '19 at 15:51
  • @Ry- Not really, you can have purely functional methods. But as you mentioned, a version using `push` is referentially transparent, so it's also fine. Edited my answer to include the second version. – Krzysztof Atłasik Jun 10 '19 at 15:54
5

Below, scan takes a combining function f and an initial accumulator r -

const scan = (f, r, [ x, ...xs ]) =>
  x === undefined
    ? [ r ]
    : [ r, ...scan (f, f (r, x), xs) ]
  
const add = (x, y) =>
  x + y

const print = (...vs) =>
  vs .forEach (v => console .log (v))

const data =
  [ 0, 1, 2, 3, 4, 5 ]
  
print
  ( scan (add, 0, data)
  , scan (Math.max, 3, data)
  , scan (add, 0, [])
  )

// [ 0, 0, 1, 3, 6, 10, 15 ]
// [ 3, 3, 3, 3, 3, 4, 5 ]
// [ 0 ]

If you need a program that does not take an initial accumulator, the first element of the input array can be used instead. This variation is called scan1 -

const scan = (f, r, [ x, ...xs ]) =>
  x === undefined
    ? [ r ]
    : [ r, ...scan (f, f (r, x), xs) ]
    
const scan1 = (f, [ x, ...xs ]) =>
  x === undefined
    ? []
    : scan (f, x, xs)

const add = (x, y) =>
  x + y
  
const print = (...vs) =>
  vs .forEach (v => console .log (v))

const data =
  [ 0, 1, 2, 3, 4, 5 ]

print
  ( scan1 (add, data)
  , scan1 (Math.max, data)
  , scan1 (Math.min, data)
  , scan1 (add, [])
  )
  
// [ 0, 1, 3, 6, 10, 15 ]
// [ 0, 1, 2, 3, 4, 5 ]
// [ 0, 0, 0, 0, 0, 0 ]
// []

Performance optimisations can be made and stack-overflow woes can be remedied, if necessary, all without sacrificing functional style -

const scan = (f, init, xs) =>
  loop
    ( ( r = []
      , a = init
      , i = 0
      ) =>
        i >= xs.length
          ? push (a, r)
          : recur
              ( push (a, r)
              , f (a, xs[i])
              , i + 1
              )
    )

Now let's run it with a big input -

// BIG data!
const data =
  Array .from (Array (10000), (_, x) => x)

// fast and stack-safe
console .time ("scan")
const result = scan (add, 0, data)
console .timeEnd ("scan")
// scan: 8.07 ms

console .log (result)
// [ 0, 0, 1, 3, 6, 10, 15, ..., 49995000 ]

This depends on the following generic functional procedures -

const recur = (...values) =>
  ({ recur, values })

const loop = f =>
{ let r = f ()
  while (r && r.recur === recur)
    r = f (...r.values)
  return r
}

const push = (x, xs) =>
  ( xs .push (x)
  , xs
  )

Mulan
  • `scan` is good. Implementing `scan` in JavaScript as if it were Haskell is not good, because it gives you quadratic time complexity and stack overflows on arrays that aren’t even particularly big. – Ry- Jun 10 '19 at 15:48
  • @Ry-, Thanks but I don't see the quadratic time complexity. The intermediate arrays are definitely wasteful, but the simple programs show precisely what needs to be done. Optimisation with this kind of program is easy so I usually leave them for the end-user who knows their requirements best. An update to the answer shows a stack-safe `scan`. Your comment is appreciated. – Mulan Jun 10 '19 at 16:34
  • `[ r, ...scan (f, f (r, x), xs) ]` spread – copy of 0 + 1 + … + n−1 items, quadratic. `[ x, ...xs ]` destructuring – copy of n−1 + n−2 + … + 0 items, quadratic. Thanks for adding the other version. – Ry- Jun 10 '19 at 16:41
  • That's more like `O(n*(n+1)/2)`, which is almost half the complexity of `O(n^2)`, yeah? The optimised revision runs in `O(n)` though. – Mulan Jun 10 '19 at 16:48
  • Yes, but O(n*(n+1)/2) = O(n²). The “half” is a constant factor. – Ry- Jun 10 '19 at 16:50
  • Of course the universality of `reduce` means that we can also choose to implement it on `reduce`. One way: `const scan2 = (f, init, xs) => xs .reduce ( (as, x) => [ ...as, f (as.slice(-1)[0], x) ], [init] ) .slice (1) ` – Scott Sauyet Jun 10 '19 at 16:52
  • Another way to remove that additional time complexity is to replace `[ r, ...scan (...) ]` with `[r].concat(...)`, assuming that `concat` is an atomic operation – Scott Sauyet Jun 10 '19 at 17:14
  • @ScottSauyet you can avoid those intermediate arrays and right-side lookups by using a compound accumulator in `reduce`; ie, using `push` from the update, `scan2 = (f, init, xs) => push (...xs.reduce (([ r, a ], x) => [ f (r, x), push (r, a) ], [ init, [] ]))`. I think JavaScript compilers could easily optimize `[a, ...b]` to `[a].concat(b)` but the underlying creation of a _new_ array still happens. For significantly large inputs, `concat` will always perform slower than `push` in this program. Thanks for the comment :D – Mulan Jun 10 '19 at 17:20
  • @ScottSauyet I'm not an expert on all of the recursion schemes. I first saw this as an anamorphism but someone recently introduced me to *apomorphism* and it might be more applicable here. See comments on [this Q&A](https://stackoverflow.com/a/51530644/633183). More research requires more time! -_- – Mulan Jun 10 '19 at 17:23
  • @ScottSauyet: `[r, ...s]` and `[r].concat(s)` are the same in that respect. – Ry- Jun 10 '19 at 19:07
  • @Ian, I appreciate your efforts but the added syntax is seen as noisy and distracting. I understand many JavaScript programmers are numb to this and may find the style in my answers unfamiliar. This style has been adapted to improve semantic consistency and increase comprehension but it takes time to undo the old ways of thinking. I invite you to read some of my [other answers](https://stackoverflow.com/users/633183/user633183?tab=answers&sort=newest) to see the style applied in various ways. If you'd like to post an ES5 version of my answer using your syntax styles, you are more than welcome. – Mulan Jun 10 '19 at 19:30
  • @Ry-: You're right. In fact, in my small test (https://jsperf.com/another-spread-vs-concat), the spread syntax was 50% faster than `concat`. OTOH, with the speed of either of these, it seems likely that the `O(n^2)` is likely to be an actual impediment in many reasonable cases... in which case, I vote for clean code. – Scott Sauyet Jun 10 '19 at 20:13
  • @ScottSauyet: Is one of those supposed to be “unlikely”…? – Ry- Jun 10 '19 at 20:57
  • @ScottSauyet, I admit that's surprising, especially because I over-simplified in my original comment. `[a, ...b]` is more like `[a].concat(Array.from(b))` because spread argument accepts any iterable. We'd want to look at this with larger array inputs because big inputs are the reason we would optimise in the first place. Thanks everyone for discussion :) – Mulan Jun 10 '19 at 21:08
  • @user633183 ES6 features are a great improvement to the language and should definitely be used, but this style is effectively implementing those features to such a manner that it obfuscates your code. Arrow functions, for example, have a specific intended purpose as callbacks, and they improve readability and syntax when used in that manner. Explicit function declarations still *also* have a purpose. Just because something *can* be shorter doesn't mean it *should*. Regardless it's your answer/your choice so I'll just respectfully disagree and sit aside. – Ian Jun 11 '19 at 14:15
  • @bob: Discussing about performance is meaningless in a question that specifically asks for a better-performing solution? Interesting take. (I’m also sorry that my outright and specific disagreement is being interpreted as *passive*.) – Ry- Jun 11 '19 at 17:18
  • @bob: I don't see any particularly aggressive comments on any of the answers here. I found the discussions here quite interesting, and I like the variety of answers. – Scott Sauyet Jun 12 '19 at 16:54
  • @ScottSauyet I don't want to open this again, but when you look at his answer it is the less general version of this one. So a justified criticism would've been "please don't use `concat`/`spread` for arrays containing lots of data, because it is incredibly inefficient." But downvoting this answer and waltzing in with the big-O hammer without prior attempt to settle this isn't the right way. Please note that another user felt encouraged to change this answer extensively, because the discussion provided a breeding ground for it. Anyway, thank you for getting me to reconsider my actions, Scott. – Jun 12 '19 at 18:39
  • @bob: wow, I hadn't seen Ian's drastic and destructive edit to this answer. I imagine there might have been (now-deleted) comments regarding that. If there were, I didn't see them either, although I did wonder about the comment directed at Ian by user633183. – Scott Sauyet Jun 12 '19 at 18:46
  • Ian never commented, I just @'d Ian to remark on the edit that was made. I am still learning wrt computational complexity and the note from Ry was helpful, to me at least. SO is an open forum and I welcome everyone's comments *and* criticisms. – Mulan Jun 12 '19 at 19:57
5

You just need to add, at every step, the current value to the previous result, so you can use a simple reduce:

const array = [0, 1, 2, 3, 4, 5, 6];

const sums = array.reduce((acc,current,index) => {
  const prev = acc.length ? acc[index-1] : 0;
  acc.push(prev + current);
  return acc;
},[]);

console.log(sums.toString());
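
The same reduce shape generalizes over any combining function, in the spirit of the scan discussed in other answers; a sketch (the name scan is illustrative):

const scan = (f, xs) => xs.reduce((acc, current, index) => {
  // The first element passes through unchanged; every later element is
  // combined with the previous running result.
  acc.push(index === 0 ? current : f(acc[index - 1], current));
  return acc;
}, []);

console.log(scan((a, b) => a + b, [0, 1, 2, 3, 4, 5]).toString());
// 0,1,3,6,10,15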
Pablo Lozano
4

If you're asking whether there is a faster or more efficient way, then the other answers are sufficient.

However, I'd argue that something similar to your current solution is easier to read and more declarative if we phrase it as a mapping function.

Specifically something like "Map each value to itself plus all the previous values in the array".

You could use a filter, as you have done in your question, but I think a slice is clearer.

const x = [ 0, 1, 2, 3, 4, 5 ];

// A common generic helper function
const sum = (acc, val) => acc + val

const sums = x.map((val, i, self) => val + self.slice(0, i).reduce(sum, 0))
Calin Leafshade
  • The code uses the same approach but I don't think that answer explains the value of the approach. Answers should be more than just the code. – Calin Leafshade Jun 10 '19 at 16:20
2

It's possible to use map directly, if you keep an external accumulator variable:

const x = [ 0, 1, 2, 3, 4, 5 ];

let acc = 0;
const prefixSum = x.map(x => acc += x);

console.log(prefixSum);
Joe23
  • I upvoted. I guess those who downvoted are not familiar with assignment expressions, or not comfortable with `x => acc += x`. – LiuXiMin Jul 11 '19 at 00:21
1

One option is to use a single .map that uses .reduce inside to sum up each sliced partial array:

const x = [0, 1, 2, 3, 4, 5];

const sum = (x, y) => x + y;
const partialSums = x.map((_, i, arr) => arr.slice(0, i + 1).reduce(sum));
console.log(partialSums);
CertainPerformance
0

One way is to use forEach, slice the array to get the partial arrays one by one, and then sum each of them with array.reduce. You can do it like this:

let x = [0, 1, 2, 3, 4, 5]
let sum = []
x.forEach((_, index) => {
  index++;
  sum.push(x.slice(0, index).reduce((a, b) => a + b))
})
console.log(sum)

We get [0], then [0, 1], then [0, 1, 2], then [0, 1, 2, 3], and via [0, 1, 2].reduce((a, b) => a + b) we get 3. Just push that to a new array, which gives your answer.

We can go even shorter by doing this. To me, this seems a very optimized solution.

let ar = [0, 1, 2, 3, 4, 5]
let s = 0
let arr = []
ar.forEach((n, i) => arr.push(s += n))
console.log(arr)

Or, with .map:

let ar = [0, 1, 2, 3, 4, 5], s = 0
console.log(ar.map((n, i) => s += n))
weegee
  • *“To me, this seems a very optimized solution.”* It isn’t. *“You can also use `.map` instead of `forEach` like”* But why? – Ry- Jun 10 '19 at 18:27
  • Also, my last solution is better than yours http://jsben.ch/BAg36 You can even check it there @Ry- – weegee Jun 10 '19 at 18:47
  • That's total cheating. You are taking long inputs for my test and shorter ones for yours. Look at this now, the same inputs: http://jsben.ch/PzJ8s @Ry- – weegee Jun 10 '19 at 18:52
  • Sorry, leftover. You’ll notice it doesn’t change the outcome. – Ry- Jun 10 '19 at 18:56
  • Well, it does. Run the test multiple times @Ry- It turns out that the time is dynamic and varies for arbitrary attempts. (although, most of the times, my code block comes first) – weegee Jun 10 '19 at 18:58
  • No matter how many times I run it, yours never comes first. This is only tangentially relevant to my original comment, though (in that discarding `.map`’s return value wastes time as well). – Ry- Jun 10 '19 at 19:02
  • What I said is that the time is dynamic and varies for arbitrary attempts. Here, my solution always comes first. Your solution comes first negligible times only. @Ry- – weegee Jun 10 '19 at 19:05
  • Also, there's no difference between my last approach and yours. Comparing the time is useless there. @Ry- – weegee Jun 10 '19 at 19:11
  • There’s a difference in that it pointlessly uses `map` for a push side effect and discards the result. (Well, I guess it’s not pointless, because it means you’re not technically duplicating my answer.) – Ry- Jun 10 '19 at 19:14
  • Also, .map is faster than .forEach. That's why I used .map most of the time. @Ry- – weegee Jun 10 '19 at 19:23
  • You mean faster to type? ’cause it’s not faster to run. It makes a new array. That’s what it’s for. – Ry- Jun 10 '19 at 19:26
  • @Ry- yes it is for that. Read [this amazing article](https://medium.com/@JeffLombardJr/understanding-foreach-map-filter-and-find-in-javascript-f91da93b9f2c) and I quote from it "_In reality, you shouldn’t be using .forEach() unless other methods such as .map() can’t do the trick. .map() is actually slightly faster than .forEach()_" – weegee Jun 10 '19 at 19:29
  • You misinterpreted it. The article is talking about `.map` used correctly, i.e. `arr.map(x => x * 2)` vs. `let result = []; arr.forEach(x => { result.push(x * 2) })`. `let result = []; arr.map(x => { result.push(x * 2) })` is just the worst of both worlds. – Ry- Jun 10 '19 at 19:30
  • I agree, but `.map` is faster and that's the only case I'm using it for. I will use `.map` where I can, to reduce run time as much as possible. Also, anyone can use it however they want, right? @Ry- – weegee Jun 10 '19 at 19:37
  • 1
    No, I just explained why it’s not faster in the way you seem to think that it is. Try a benchmark. Sure, you can use it anyway, and it’ll be bad practice anyway. – Ry- Jun 10 '19 at 19:50
  • @Ry- okay, I'll accept your approach as I'm a junior and .map returns an array which can also lead to some array waste. Edited my answer – weegee Jun 10 '19 at 19:57
0

The simplest answer is a one-liner. If x is [0, 1, 2, 3, 4, 5]:

x.map(i=>s+=i, s=0)
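
Note that the second argument slot (normally map's thisArg) is used here only for the side effect of the assignment s = 0, which runs once before the mapping starts; in sloppy mode it creates a global s, and in strict mode it throws a ReferenceError. A sketch of the same idea with an explicit accumulator, safe in strict mode:

// Same one-liner with the accumulator declared explicitly.
let s = 0;
const x = [0, 1, 2, 3, 4, 5];
console.log(x.map(i => s += i)); // [ 0, 1, 3, 6, 10, 15 ]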
-1

Here is a simple answer using a recursive function.

var array = [ 0, 1, 2, 3, 4, 5 ];

function sumArray(arrayToSum, index) {
    // Note: this modifies arrayToSum in place.
    if (index < arrayToSum.length - 1) {
        arrayToSum[index + 1] = arrayToSum[index] + arrayToSum[index + 1];
        return sumArray(arrayToSum, index + 1);
    } else {
        return arrayToSum;
    }
}
sumArray(array, 0);

console.log(array);
Orionis
  • This has zero advantages and requires the JIT-compiler to optimize away the recursion, otherwise it could easily overflow the call stack for large arrays. Also it stores the running total to the array and reads it back, instead of keeping it in a variable. – Peter Cordes Jun 10 '19 at 18:19