Multiple SQL Statements in One Roundtrip Using Dapper.Net

Dapper and execution of multiple same commands?

When you pass an IEnumerable to Execute for inserting multiple items, each item is inserted independently. That said, there is not much difference between the two scenarios you presented.

Dapper just facilitates the insertion of multiple items by accepting an IEnumerable. It does not internally implement anything like a bulk insert.

Have a look at this article:

But this approach sends every command as a single, stand-alone transaction, which may cause inconsistencies if an error occurs while executing one or more statements. The workaround here is to use an IDbTransaction object to create an explicit transaction that covers all the executions. Performance and scalability will be worse than executing just one command with an array of objects (due to network latency and thus longer transactions), but at least consistency will be guaranteed.
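As a minimal sketch of that workaround: the connection string, the Orders table, and the orders collection below are hypothetical, but the pattern of passing the same IDbTransaction to each Execute call is standard Dapper usage.

```csharp
using System.Data;
using Microsoft.Data.SqlClient;
using Dapper;

using var conn = new SqlConnection(connectionString); // assumed connection string
conn.Open();
using var tx = conn.BeginTransaction();
try
{
    // Each item still becomes its own INSERT statement, but all of them
    // now commit or roll back together.
    conn.Execute(
        "INSERT INTO Orders (CustomerId, Total) VALUES (@CustomerId, @Total)",
        orders,            // an IEnumerable of order objects (assumed)
        transaction: tx);
    tx.Commit();
}
catch
{
    tx.Rollback();
    throw;
}
```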

But since Dapper supports SQL Server's Table-Valued Parameters and also JSON, my recommendation is to use one of those if you need to pass an array of values to a parameter. I'll discuss them in future articles, so stay tuned.
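For a taste of the TVP route, here is a sketch using Dapper's AsTableValuedParameter helper; the dbo.IntList type and the Customers table are made up for illustration, and the table type must already exist in the database.

```csharp
using System.Data;
using Dapper;

// Assumes a user-defined table type created beforehand, e.g.:
//   CREATE TYPE dbo.IntList AS TABLE (Id INT);
var ids = new DataTable();
ids.Columns.Add("Id", typeof(int));
foreach (var id in new[] { 1, 2, 3 })
    ids.Rows.Add(id);

// The whole array travels to the server as one parameter, one round trip.
var customers = conn.Query<Customer>(
    "SELECT c.* FROM Customers c JOIN @ids i ON i.Id = c.CustomerId",
    new { ids = ids.AsTableValuedParameter("dbo.IntList") });
```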

Now, what if you have to pass an array of, say, 10,000 values or more? The right choice here is a bulk load, and more specifically with SQL Server the BULK INSERT command, which unfortunately is not supported by Dapper natively. The workaround is simply to use the regular SqlBulkCopy class and you're done.
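A minimal SqlBulkCopy sketch, assuming a DataTable whose columns already match a hypothetical dbo.Orders destination table:

```csharp
using Microsoft.Data.SqlClient;

// 'table' is a DataTable populated with the rows to load (assumed).
using var conn = new SqlConnection(connectionString); // assumed connection string
conn.Open();
using var bulk = new SqlBulkCopy(conn)
{
    DestinationTableName = "dbo.Orders", // hypothetical table
    BatchSize = 5000                     // rows per internal batch
};
bulk.WriteToServer(table);
```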

Alternatively, you may consider using a tool like Dapper Plus - Bulk Insert.

You may read more about bulk insert and a hack to achieve it here. Also, Dapper supports TVPs; if your RDBMS supports them too, use them.

To answer your comment:

If I create a transaction and invoke multiple inserts within it, does the transaction make one roundtrip to the database to invoke all of them? Or will the inserts still be executed one by one?

Transactions have nothing to do with bulk insert. If you wrap your code block (any version you mention in the question) in a transaction, the result will not change... well, except in one respect: the entire transaction will either commit or roll back. Without a transaction, you may encounter consistency issues if the operation fails somewhere in between. By wrapping your code in a transaction, you simply avoid this issue. Your core issue - bulk insert - stays as is.

Execute multiple SQL commands in one round trip

The single multi-part command and the stored procedure options that you mention are the two options. You can't do them in such a way that they are "parallelized" on the db. However, both of those options do result in a single round trip, so you're good there. There's no way to send them more efficiently. In SQL Server 2005 onwards, a multi-part command that is fully parameterized is very efficient.
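A multi-part parameterized command with Dapper can look like the following sketch; the Accounts table and parameter names are hypothetical, but the point is that both statements travel in one batch and hence one round trip.

```csharp
using Dapper;

// Two statements, one parameterized batch, one round trip to the server.
conn.Execute(@"
    UPDATE Accounts SET Balance = Balance - @amount WHERE Id = @fromId;
    UPDATE Accounts SET Balance = Balance + @amount WHERE Id = @toId;",
    new { amount = 100m, fromId = 1, toId = 2 });
```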

Edit: adding information on why to cram things into a single call.

Although you don't want to care too much about reducing calls, there can be legitimate reasons for this.

  • I once was limited to a crummy ODBC driver against a mainframe, and there was a 1.2 second overhead on each call! I'm serious. There were times when I crammed a little extra into my db calls. Not pretty.
  • You also might find yourself in a situation where you have to configure your sql queries somewhere, and you can't just make 3 calls: it has to be one. It shouldn't be that way, bad design, but it is. You do what you gotta do!
  • Sometimes of course it can be very good to encapsulate multiple steps in a stored procedure. Usually not for saving round trips though, but for tighter transactions, getting ID for new records, constraining for permissions, providing encapsulation, blah blah blah.

Dapper.NET and stored proc with multiple result sets

Have you tried the QueryMultiple method? It says it should:

Execute a command that returns multiple result sets, and access each in turn

You'll need to add this using statement to enable QueryMultiple.

using Dapper;
/* adds the extension method:
   public static GridReader QueryMultiple(this IDbConnection cnn, string sql, object param = null, IDbTransaction transaction = null, int? commandTimeout = null, CommandType? commandType = null); */

Update and query in single batch (one roundtrip) using Dapper

As @CharlieFace said in the comments, the following will work just fine:

conn.Query<int>(
"UPDATE Test SET N1 = 0; SELECT 1 AS n UNION ALL SELECT 2;"
).ToList().Dump();

Dapper multi insert returning inserted objects

To insert or update a List of objects with Dapper.Net you can't use Query:

 connection.Query<Object>("your_query", your_list)
// connection.Query<Object>: used to select an IEnumerable<object> from the db
// connection.QueryMultiple: used to execute multiple queries at once, then read the results one by one

var sql = @"
    select * from Customers where CustomerId = @id
    select * from Orders where CustomerId = @id
    select * from Returns where CustomerId = @id";

using (var multi = connection.QueryMultiple(sql, new { id = selectedId }))
{
    var customer = multi.Read<Customer>().Single();
    var orders = multi.Read<Order>().ToList();
    var returns = multi.Read<Return>().ToList();
    ...
}

You should use only Execute for a multi insert or update:

Execute("your_query", your_list, your_transaction);

So if you need to do a multi insert and return the IDs of the inserted records:

// **using transaction depend on your needs**

//Example to multi insert and return full record

string query = @"Insert Into _TableName ( _columns )
                 OUTPUT INSERTED.*
                 values ( _parameters )";
// parameter names should match the object property names so Dapper does the correct mapping

[OUTPUT INSERTED.*] will return the full inserted row including the id, and you are free to return any single property by replacing the asterisk with a property name: [OUTPUT INSERTED.Id] will return only the id.

// good for a small list

for (int i = 0; i < youList.Count; i++)
{
    youList[i] = DbConnection.Query<object>(query, youList[i]).FirstOrDefault();
} // a for loop is better for performance

// for a big list you can use SqlBulkCopy; see the link here

Does the feature execute command multiple times result in multiple round-trips to database?

I finally got around to looking at this again. Looking at the source code (in \Dapper\SqlMapper.cs), I found the following snippet in method ExecuteImpl:

// ...
foreach (var obj in multiExec)
{
    if (isFirst)
    {
        masterSql = cmd.CommandText;
        isFirst = false;
        identity = new Identity(command.CommandText, cmd.CommandType, cnn, null, obj.GetType(), null);
        info = GetCacheInfo(identity, obj, command.AddToCache);
    }
    else
    {
        cmd.CommandText = masterSql; // because we do magic replaces on "in" etc
        cmd.Parameters.Clear(); // current code is Add-tastic
    }
    info.ParamReader(cmd, obj);
    total += cmd.ExecuteNonQuery();
}
// ...

The interesting part is on the second-last line, where ExecuteNonQuery is called. That method is called on each iteration of the foreach loop, so I guess it is not being batched in the sense of a set-based operation. Therefore, multiple round-trips are required. However, it is batched in the sense that all operations are performed on the same connection, and within the same transaction if so specified.

The only way I can think of to do a set-based operation is to create a custom table-valued type (in the database) for the object of interest. Then, in the .NET code pass a DataTable object containing matching names and types as a command parameter. If there were a way to do this without having to create a table-valued type for every object, I'd love to hear about it.
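A sketch of that set-based approach follows; the dbo.OrderType table type, the dbo.InsertOrders procedure, and the orders collection are all hypothetical names, and both database objects must be created up front.

```csharp
using System.Data;
using Dapper;

// Hypothetical objects, created once in the database:
//   CREATE TYPE dbo.OrderType AS TABLE (CustomerId INT, Total MONEY);
//   CREATE PROC dbo.InsertOrders @rows dbo.OrderType READONLY AS
//       INSERT INTO Orders (CustomerId, Total)
//       SELECT CustomerId, Total FROM @rows;

var rows = new DataTable();
rows.Columns.Add("CustomerId", typeof(int));
rows.Columns.Add("Total", typeof(decimal));
foreach (var o in orders)               // 'orders' is the collection to insert (assumed)
    rows.Rows.Add(o.CustomerId, o.Total);

// All rows go over in one parameter; the insert is a single set-based statement.
conn.Execute("dbo.InsertOrders",
    new { rows = rows.AsTableValuedParameter("dbo.OrderType") },
    commandType: CommandType.StoredProcedure);
```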


