
Apologies if my title isn't clear; I'll explain the question further here.

What I would like to do is perform multiple inserts based on a JSON array that I (the backend) will be receiving from the frontend. The JSON object has the following data:

//Sample JSON
{
 // Some other data here to insert
 ...
 "quests": {
     [
        {
        "player_id": [1, 2, 3],
        "task_id": [11, 12],
        },
        {
        "player_id": [4, 5, 6],
        "task_id": [13, 14, 15],
        }
    ]
}

Based on this JSON, this is my expected output once it's processed by the backend and inserted into the quests table:

//quests table (Output)
---------------------------
 id | player_id | task_id |
---------------------------
  1 |         1 |      11 |
  2 |         1 |      12 |
  3 |         2 |      11 |
  4 |         2 |      12 |
  5 |         3 |      11 |
  6 |         3 |      12 |
  7 |         4 |      13 |
  8 |         4 |      14 |
  9 |         4 |      15 |
 10 |         5 |      13 |
 11 |         5 |      14 |
 12 |         5 |      15 |
 13 |         6 |      13 |
 14 |         6 |      14 |
 15 |         6 |      15 |
// Not sure if this is useful info, but I will be using player_id in a join later on.
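To make the mapping concrete: the expansion I'm after is just the cross product of the two arrays in each group. In plain JavaScript (the helper name is mine, for illustration only), it would look like this:

```javascript
// Expand each quest group into one row per (player_id, task_id)
// pair, matching the expected table output above.
function flattenQuests(quests) {
  const rows = [];
  for (const group of quests) {
    for (const player_id of group.player_id) {
      for (const task_id of group.task_id) {
        rows.push({ player_id, task_id });
      }
    }
  }
  return rows;
}
```

For the sample JSON above this produces 15 rows, starting with `{player_id: 1, task_id: 11}` and ending with `{player_id: 6, task_id: 15}`.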

-- My current progress --

What I currently have (and have tried) is doing multiple inserts by iterating over each JSON object.

//The previous JSON response I accept:

{
  "quests": [
   {
    "player_id": 1,
    "task_id": 11
   },
   {
    "player_id": 1,
    "task_id": 12
   },
   {
    "player_id": 6,
    "task_id": 15
   }
  ]
}
// My current backend code
db.tx(async t => {
        const q1 = ... // some other queries
        ...
        const q3 = await t.none(
            `INSERT INTO quests (player_id, task_id)
             SELECT player_id, task_id
             FROM json_to_recordset($1::json)
             AS x(player_id int, task_id int)`,
            [JSON.stringify(quests)]);

        return t.batch([q1, q2, q3]);

    }).then(data => {
        // Success
    }).catch(error => {
        // Fail
    });

It works, but I don't think it's good to send such a long request body, which is why I'm wondering whether it's possible to iterate over the arrays inside each object instead.
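One direction I'm considering (an untested sketch, using the column names from my table above) is to expand the arrays inside PostgreSQL itself, so the frontend can keep sending the compact array-based JSON:

```sql
-- Untested sketch: each element of "quests" becomes one record via
-- json_to_recordset, then the two JSON arrays are expanded with
-- json_array_elements_text into a cross product of rows.
INSERT INTO quests (player_id, task_id)
SELECT p.value::int, t.value::int
FROM json_to_recordset($1::json) AS q(player_id json, task_id json)
CROSS JOIN LATERAL json_array_elements_text(q.player_id) AS p(value)
CROSS JOIN LATERAL json_array_elements_text(q.task_id) AS t(value);
```

Here `$1` would be `JSON.stringify(quests)`, with `quests` being the array from the new JSON shape. I haven't verified this against a live database, so I'd appreciate corrections.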

If more information is needed, I'll edit this post again.

Thank you in advance!

  • The table structure you show represents a denormalized database, which by itself is already a problem, due to lots of data duplication. Such an example would offer little to no value to other developers. Also, you do not even show what you have tried and what exact problems you faced. You are just asking for someone to give you a solution, which isn't a good way to ask a question on StackOverflow. – vitaly-t Feb 28 '21 at 18:19
  • @vitaly-t hello, sorry for my former question and negligence on my part. I have updated my question and provided additional information, hopefully I've got my idea across. I've also revised my tables (not shown here), thank you for the heads up! – TechArtificer Mar 01 '21 at 15:29
  • The answer would depend on how many records your inner `SELECT` produces. If it produces only one record, then [standard multi-row insert is preferable](https://stackoverflow.com/questions/37300997/multi-row-insert-with-pg-promise). But if it produces multiple records, then you will have to use explicit query constructs indeed, and whether it can be optimized into a single query - that's more of a pure-SQL question then, and not about `pg-promise` anymore ;) Modern PostgreSQL does have a powerful set of JSON-processing functions. – vitaly-t Mar 01 '21 at 21:44
  • I see, I'll look into explicit query constructs and read more about them. I've read about the standard multi-row inserts, although I haven't tried it yet (will do when I get home later!), erm, do pgp.helpers work inside transactions (tx)? – TechArtificer Mar 02 '21 at 02:34

0 Answers