I cannot quite make sense of what you mean by duplicates and removing them. I think what you are trying to achieve is to aggregate SKUBARCODE and SKUQUANTITY into an array and attach all related metadata to that object, assuming that each SKUBARCODE exists only once per ORDERCLIENTREF and that all other metadata is invariant.
const desired_output = json_array
  .filter((group) => group != null && group.length > 0)
  .map((group) => {
    // build a hash map of SKUBARCODE -> SKUQUANTITY for this group
    const skuMap = group.reduce((acc, curr) => acc.set(curr.SKUBARCODE, curr.SKUQUANTITY), new Map());
    // copy the shared metadata from the first entry and drop the per-SKU fields
    const new_obj = { ...group[0] };
    delete new_obj.SKUBARCODE;
    delete new_obj.SKUQUANTITY;
    // turn the hash map back into an array of order lines
    new_obj.ORDERLINES = [...skuMap].map(([code, quant]) => {
      return {
        SKUBARCODE: code,
        SKUQUANTITY: quant
      };
    });
    return new_obj;
  });
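For illustration only, here is a minimal sketch of the input and output shape this assumes: json_array is an array of groups, one array of rows per ORDERCLIENTREF. The CUSTOMER field and the concrete values are hypothetical stand-ins for your invariant metadata.

// hypothetical input: one group of rows per ORDERCLIENTREF
const json_array = [
  [
    { ORDERCLIENTREF: "A-1", CUSTOMER: "ACME", SKUBARCODE: "111", SKUQUANTITY: 2 },
    { ORDERCLIENTREF: "A-1", CUSTOMER: "ACME", SKUBARCODE: "222", SKUQUANTITY: 5 }
  ]
];
// desired_output would then be:
// [{
//   ORDERCLIENTREF: "A-1",
//   CUSTOMER: "ACME",
//   ORDERLINES: [
//     { SKUBARCODE: "111", SKUQUANTITY: 2 },
//     { SKUBARCODE: "222", SKUQUANTITY: 5 }
//   ]
// }]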