
I have a table in PostgreSQL. I want to perform element-wise addition on the arrays (i.e. `col`) after grouping the rows by time. The arrays have varying lengths. Example as follows:

| time  | col               |
|------ |------------------ |
| 1     | {1,2}             |
| 1     | {3,4,5,6}         |
| 2     | {}                |
| 2     | {7}               |
| 2     | {8,9,10}          |
| 3     | {11,12,13,14,15}  |

The result should be as follows:

| time  | col               |
|------ |------------------ |
| 1     | {4,6,5,6}         |
| 2     | {15,9,10}         |
| 3     | {11,12,13,14,15}  |

I found this answer, which does something similar for a fixed dimension of 3 elements, but it does not work for varying dimensions.

DBFiddle for example.
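For anyone who wants to reproduce this locally, a minimal setup could look like the following. The table name `tab` and the column names are assumptions taken from the query in the answer below, not from the original schema:

```sql
-- Hypothetical setup matching the sample data above
CREATE TABLE tab (
    time integer,
    col  integer[]
);

INSERT INTO tab (time, col) VALUES
    (1, '{1,2}'),
    (1, '{3,4,5,6}'),
    (2, '{}'),
    (2, '{7}'),
    (2, '{8,9,10}'),
    (3, '{11,12,13,14,15}');
```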

mrtyormaa
    This would be so much simpler with a properly normalized model. This is not the way arrays are meant to be used. –  Jul 02 '18 at 20:30
  • Normalize your data! Use a detail table to keep your variable sized data like arrays or look into mongodb – Jose Munoz Jul 02 '18 at 20:33
  • @a_horse_with_no_name What is the disadvantage of using arrays like this? I will read up more on normalizing the database. It would be helpful if you could provide a good link on the topic. – mrtyormaa Jul 02 '18 at 20:38
  • @DavidMunoz mongodb is not an option. I have to stick to PostgreSQL. I was under the impression that array operations are fast. If this is not true, I can use json objects or strings too. My final goal is to get a pair(s). Example, "0:300, 1:32, 3:0, 4:0, ..., 15:76 .." – mrtyormaa Jul 02 '18 at 20:41
  • [To quote the manual](https://www.postgresql.org/docs/current/static/arrays.html): "*Arrays are not sets*" - every time you find yourself manipulating individual elements of an array, or searching for specific elements in an array, you are doing something wrong. You should treat an array as an "atomic value", the same way you treat a column `firstname` as a single value. And before you de-normalize even more and misuse JSON, you might want to read this: http://blog.2ndquadrant.com/postgresql-anti-patterns-unnecessary-jsonhstore-dynamic-columns/ –  Jul 02 '18 at 20:42

1 Answer


You could use `unnest` with `WITH ORDINALITY`:

SELECT time, array_agg(elem ORDER BY nr)
FROM (SELECT time, nr, SUM(elem) AS elem
      FROM tab, unnest(col) WITH ORDINALITY a(elem, nr)
      GROUP BY time, nr) s
GROUP BY time;

DBFiddle Demo

Output:

┌──────┬──────────────────┐
│ time │    array_agg     │
├──────┼──────────────────┤
│    1 │ {4,6,5,6}        │
│    2 │ {15,9,10}        │
│    3 │ {11,12,13,14,15} │
└──────┴──────────────────┘
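To see what `WITH ORDINALITY` contributes, it helps to run `unnest` on its own: the ordinality column numbers the elements of each array starting at 1, and that position number is what makes the position-wise `GROUP BY` possible. A minimal sketch:

```sql
SELECT elem, nr
FROM unnest(ARRAY[10, 20, 30]) WITH ORDINALITY AS a(elem, nr);

-- elem | nr
-- -----+----
--   10 |  1
--   20 |  2
--   30 |  3
```

Note also that unnesting an empty array produces no rows at all, which is why the `{}` entry for time 2 simply contributes nothing to any position in the sum.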
Lukasz Szozda