
I'm trying to figure out a robust way to calculate the average of object properties when there are too many to specify explicitly by name.

I found this nice gist that helps when there are just a few properties:

const someData = [
    { height: 176, weight: 87 },
    { height: 190, weight: 103 },
    { height: 180, weight: 98 }
];

// for height
const sumHeight = (prev, cur) => ({ height: prev.height + cur.height });
const avgHeight = someData.reduce(sumHeight).height / someData.length;
console.log(avgHeight); // => gives 182

// for weight
const sumWeight = (prev, cur) => ({ weight: prev.weight + cur.weight });
const avgWeight = someData.reduce(sumWeight).weight / someData.length;
console.log(avgWeight); // => gives 96

But this approach doesn't scale well when there are lots of properties, for example:

const someDataExtended = [
    { height: 176, weight: 87, salary: 100000, age: 20, numOfCats: 2 },
    { height: 190, weight: 103, salary: 100050, age: 40, numOfCats: 0 },
    { height: 180, weight: 98, salary: 20345, age: 50, numOfCats: 1 }
];
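
With this approach, every extra property means another hand-written reducer and another pass over the data, e.g. (continuing the same pattern for someDataExtended):

// one reducer and one reduce call per property...
const sumSalary = (prev, cur) => ({ salary: prev.salary + cur.salary });
const avgSalary = someDataExtended.reduce(sumSalary).salary / someDataExtended.length;

const sumAge = (prev, cur) => ({ age: prev.age + cur.age });
const avgAge = someDataExtended.reduce(sumAge).age / someDataExtended.length;

// ...and so on for numOfCats and any future property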

How can I average across all properties without specifying them by name? Ideally, I'd like to iterate over someDataExtended without mutating the original data and instead generate a summary such as:

const finalSummary = {
    height: 182,
    weight: 96,
    salary: 73465,
    age: 36.67,
    numOfCats: 1
};
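
To show the direction I'm thinking in, here's a rough sketch that iterates over the keys of the first element (assuming every object has the same, purely numeric properties), though I suspect there's a cleaner or more robust way:

// generic averaging: one reducer over the keys, one inner reduce per key
const averageAll = (data) =>
    Object.keys(data[0]).reduce((summary, key) => {
        const total = data.reduce((sum, obj) => sum + obj[key], 0);
        return { ...summary, [key]: total / data.length };
    }, {});

console.log(averageAll(someDataExtended));
// => { height: 182, weight: 96, salary: 73465, age: 36.666…, numOfCats: 1 }
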
  • This sounds like premature optimisation. How many objects are you talking about? – Andy Feb 13 '22 at 13:57
  • FWIW, I didn't think much of the existing answer to the previous question, so [I've added one](https://stackoverflow.com/a/71101385/157247). – T.J. Crowder Feb 13 '22 at 14:04
  • @Andy, why would you say this seems like a premature optimization? In my real data I have an array with 1000 objects, each of which contains 20 properties. Here I just included a minimal example to communicate the gist of the problem. – Emman Feb 13 '22 at 14:41

0 Answers