
Thanks for checking my question. I have an array of arrays, as you can see below. What I want to do is use some kind of loop, map, or forEach, and end up with one row per unique category, with all the values belonging to that category added up.

This is what I have:

0: Array [ "Repeated Value", 100 ]
​
1: Array [ "Repeated Value", 200 ]
​
2: Array [ "Repeated Value", 150 ]
​
3: Array [ "Unique Value 1 ", 90 ]
​
4: Array [ "Unique Value 2", 32.5 ]

This is what I want as the outcome:

0: Array [ "Repeated Value", 450 ]
1: Array [ "Unique Value 1 ", 90 ]
2: Array [ "Unique Value 2", 32.5 ]

3 Answers


What you're probably looking for is reduce.

// your initial array
const array1 = [
  ['one', 10],
  ['one', 10],
  ['one', 10],
  ['two', 10],
  ['two', 10],
  ['three', 10],
];

// an intermediate object keyed by your unique column
// e.g. { 'one': ['one', 30], 'two': ['two', 20], 'three': ['three', 10] }
const uniqueObject = array1.reduce((a, c) => {
    if (a[c[0]] === undefined) {
        // first time this key is seen: copy the row so the source array isn't mutated
        a[c[0]] = [...c];
    } else {
        // key already seen: add this row's value to the running total
        a[c[0]][1] += c[1];
    }
    return a; // the accumulator has to be returned for the next iteration
}, {});

// map object keys back to their array values to get your unique array
const array2 = Object.keys(uniqueObject).map(k => uniqueObject[k]);

This approach may seem like overkill, but it will handle data rows of any size and reduce them into single rows based on a unique column. The syntax is ES6, so let me know if you need an older-style example.
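
For reference, a rough ES5 equivalent of the same grouping could look something like this (just a sketch, standalone, reusing the array1 / uniqueObject / array2 names from above):

// build the intermediate object with a plain for loop instead of reduce
var uniqueObject = {};
for (var i = 0; i < array1.length; i++) {
    var row = array1[i];
    if (uniqueObject[row[0]] === undefined) {
        // first time this key is seen: start a fresh row
        uniqueObject[row[0]] = [row[0], row[1]];
    } else {
        // key already seen: add to the running total
        uniqueObject[row[0]][1] += row[1];
    }
}

// map object keys back to their array values
var array2 = Object.keys(uniqueObject).map(function (k) {
    return uniqueObject[k];
});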

derelict
  • Thanks, reduce was what I was looking for; the answer below was the one that worked perfectly in my case, but thanks anyway for the great explanation. – Sven Brodersen Feb 22 '19 at 21:54
  • be aware of the performance hit on using 'some' on every reduce iteration -- probably fine if your list is less than a couple thousand items, but really starts stacking up if working with a larger dataset – derelict Feb 22 '19 at 21:56

// group rows by their first column, summing the second column
const summed = [
    ['Repeated Value', 100],
    ['Repeated Value', 200],
    ['Repeated Value', 150],
    ['Unique Value 1 ', 90],
    ['Unique Value 2', 32.5]
].reduce((acc, curr) => {
    // look for an existing row with the same key; if found, add to its total
    const found = acc.some(val => {
        if (val[0] === curr[0]) {
            val[1] = val[1] + curr[1];
            return true;
        }
        return false;
    });

    // no row for this key yet, so start one
    if (!found) {
        acc.push(curr);
    }
    return acc;
}, []);
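
Assigning the result to a variable (I've called it summed above, a name added just for illustration) lets you check it against the question's sample data:

console.log(summed);
// → [ [ "Repeated Value", 450 ], [ "Unique Value 1 ", 90 ], [ "Unique Value 2", 32.5 ] ]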
abrahims
  • re-scanning the entire list on each iteration can become quite slow if you're de-duplicating a very large list; but a nice concise solution nonetheless – derelict Feb 22 '19 at 21:54

First create a list of the unique keys, then map that list to the array you want, using reduce on each iteration to sum the matching values:

let arr = [
  ["a", 10],
  ["a", 10],
  ["a", 5],
  ["b", 11],
  ["c", 12]
]

let result = [...new Set(arr.map(x => x[0]))].map(y => [y, arr.reduce((acc, val) => acc += (val[0] === y) ? val[1] : 0, 0)]);

console.log(result);
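// → [ [ "a", 25 ], [ "b", 11 ], [ "c", 12 ] ]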
connexo
  • scales very poorly for a large number of unique keys and/or large dataset; running reduce on a list of tens of thousands of items hundreds of times would crawl. points for a one-liner though ;) – derelict Feb 22 '19 at 22:11