
I've just started using Dapper and I've run into the following problem.

I want to insert a bunch of records, and return the inserted records alongside the auto-incremented id.

Using Postgres, I want to run the equivalent of this query:

INSERT INTO players (name) 
VALUES ('Player1'), ('Player2'), ('Player3'), ('Player4'), ('Player5')
RETURNING id, name;

Using Dapper to run this query on a list of players and map the results back into a list of players (with the ids), I thought I could do this:

public class Player
{
    public int Id { get; set; }
    public string Name { get; set; }
}

var players = new List<Player>
{
    new Player { Name = "Player1" },
    new Player { Name = "Player2" },
    new Player { Name = "Player3" },
    new Player { Name = "Player4" },
    new Player { Name = "Player5" }
};

connection.Query<Player>("INSERT INTO players (name) VALUES (@Name) \r\n" +
    "RETURNING id, name;",
    players);

This throws the following error (players is a list of Player objects, each with a Name set):

Parameter '@Name' referenced in SQL but not found in parameter list

I believe that Query() may not support lists of parameters, so I tried connection.Execute() instead. Execute works, but obviously it doesn't return the inserted players with their ids.

It is worth noting that I can do an INSERT and RETURNING like this when I insert only one value.
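For reference, the single-row version that works looks something like this (a sketch; connection setup is omitted and assumed to be an open Npgsql connection):

```csharp
using System.Linq;
using Dapper;

// Insert one row and map the RETURNING row straight back onto Player:
var inserted = connection.Query<Player>(
    "INSERT INTO players (name) VALUES (@Name) RETURNING id, name;",
    new { Name = "Player1" }).Single();

// inserted.Id now holds the auto-incremented id.
```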

Does anyone know how I can do INSERT and RETURNING for multiple values like this with Dapper?

Update

I have this (somewhat dirty) solution:

    var sb = new StringBuilder();
    sb.Append("INSERT INTO players (name) VALUES \r\n");
    var parameters = new ExpandoObject() as IDictionary<string, object>;

    var values = new List<string>();
    for (int i = 0; i < players.Count; i++)
    {
        var p = players[i];

        values.Add($"(@Name{i})");
        parameters[$"Name{i}"] = p.Name;
    }

    sb.Append(string.Join(", \r\n", values));

    sb.Append(" \r\nRETURNING id, name;");

    // parameters = { Name0 = "Player1", Name1 = "Player2", ... etc. }

    var ret = connection.Query<Player>(sb.ToString(), parameters);

So I build an ExpandoObject with properties from my Players and then pass that into Dapper's Query(). It works, but it seems pretty dirty. Any suggestions on how to improve this?
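One small cleanup of the same approach (a sketch, not a different technique): Dapper ships a DynamicParameters bag for exactly this kind of dynamically named parameter set, which avoids the ExpandoObject-to-dictionary cast:

```csharp
using System.Collections.Generic;
using System.Text;
using Dapper;

var sb = new StringBuilder("INSERT INTO players (name) VALUES \r\n");
var parameters = new DynamicParameters();

var values = new List<string>();
for (int i = 0; i < players.Count; i++)
{
    // One parameterised value tuple per player: (@Name0), (@Name1), ...
    values.Add($"(@Name{i})");
    parameters.Add($"Name{i}", players[i].Name);
}

sb.Append(string.Join(", \r\n", values));
sb.Append(" \r\nRETURNING id, name;");

var ret = connection.Query<Player>(sb.ToString(), parameters);
```

The generated SQL is identical; only the parameter container changes.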

janderson
  • I'm not familiar with postgresql so can't comment on that actual query, but to return multiple result sets you need to use `connection.QueryMultiple()`. Here's a [similar question](http://stackoverflow.com/questions/19337468/multiple-sql-statements-in-one-roundtrip-using-dapper-net) – markpsmith Nov 11 '15 at 10:43
  • @markpsmith Thanks for your suggestion. QueryMultiple is for multiple SELECTs as it is in the linked question. I don't think it applies here. – janderson Nov 11 '15 at 10:45

1 Answer

Firstly, it should be noted that passing a List<Player> to the Execute method as the outermost parameter is essentially the same as:

foreach(var player in players)
    connection.Execute(
         "INSERT INTO players (name) VALUES (@Name) \r\n" + 
         "RETURNING id, name, tag;", player);

Dapper just unrolls it for you (unless it is a very specific async scenario where it can pipeline the commands). Dapper does support list-parameter expansion, but this is for leaf-level values, and was constructed for in (...) usage, so the syntax would not come out quite as you want; as an example:

DateTime dateStart = ...
int[] custIds = ...
var orders = conn.Query<Order>(@"
    select * from Order
    where OrderDate >= @dateStart and CustomerId in @custIds",
    new { dateStart, custIds }).AsList();

which becomes the SQL:

select * from Order
where OrderDate >= @dateStart and CustomerId in (@custIds0, @custIds1, ...)

(depending on the number of items in the array)
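To see why that expansion doesn't help here, consider applying it to the insert (a hypothetical sketch; the exact generated text may vary by Dapper version):

```csharp
using System.Linq;
using Dapper;

// Hypothetical: passing the names as a list-parameter to the INSERT
var names = players.Select(p => p.Name).ToArray();
connection.Query<Player>(
    "INSERT INTO players (name) VALUES (@names) RETURNING id, name;",
    new { names });

// Dapper expands @names in place, producing something like:
//   INSERT INTO players (name) VALUES ((@names0, @names1, ...)) RETURNING id, name;
// i.e. one row containing a parenthesised list of values - not the
//   VALUES (@names0), (@names1), ...
// shape that a multi-row insert needs.
```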

Your expected usage is one that has been suggested and discussed quite a bit recently; at the current time it isn't supported - the loop unrolling only works for Execute. However, it is looking increasingly likely that we will add something here. The tricky bit is deciding what the correct behavior is, and whether it is expected that this would essentially concatenate the results of multiple separate operations.

However; to do what you want via LINQ:

var results = players.SelectMany(
    player => connection.Query<Player>("...", player)).AsList();

This is the same "unroll the loop and concatenate the results" behavior, except it should work.

Marc Gravell
  • Thanks for your answer. Isn't there extra overhead in calling multiple queries like this? Can this be fixed by putting it in a transaction? – janderson Nov 11 '15 at 11:50
  • Or is there a way you would suggest that I parametrise the insert query myself and pass it into a dapper Query()? – janderson Nov 11 '15 at 11:53
  • @janderson on overhead: sure, there's a round-trip (latency) cost; the internals do a lot to reuse things like the command instance, etc. Putting it in a transaction does nothing except add a few extra levels of overhead. The fundamental issue here is that you can't blindly concatenate SQL - if it declares variables or creates temporary tables / table variables: it will break. – Marc Gravell Nov 11 '15 at 12:11
  • @janderson the pipeline option (when available) removes the latency cost, obviously. An interesting idea is whether we could provide some expansion syntax / pattern that *would work correctly* for the multi-row case. However, I wonder whether this scenario is better served by "table valued parameters". This then immediately becomes provider / RDBMS specific, however. Does Postgres support table valued parameters? If so: we can fix it fairly trivially. – Marc Gravell Nov 11 '15 at 12:12
  • @janderson oddly enough, the problematic case for expansion syntax is usually the zero-rows case – Marc Gravell Nov 11 '15 at 12:13
  • Check the Update. I got it working that way but it seems quite hacky. – janderson Nov 11 '15 at 12:17