SQL Server Maximum Rows That Can Be Inserted in a Single INSERT Statement

How many values are allowed in a single INSERT INTO statement?

Based on my testing, the limit is 1000 rows. I just tried inserting more rows than that and got this error:

The number of row value expressions in the INSERT statement exceeds
the maximum allowed number of 1000 row values.

It is actually documented here:

The maximum number of rows that can be inserted in a single INSERT
statement is 1000.

And here:

The maximum number of rows that can be constructed by inserting rows
directly in the VALUES list is 1000. Error 10738 is returned if the
number of rows exceeds 1000 in that case.

Note that the 1000-row limit applies only to a single VALUES clause. As Lasse V. Karlsen commented:

It is not the INSERT statement that has a limit, it is the VALUES
clause. This is important if you do an insert that pulls data from
somewhere. That insert is only limited by memory/transaction
space/disk space.

SQL Server maximum rows that can be inserted in a single INSERT statement

The maximum number of rows you can insert in one statement is 1000 when using INSERT INTO ... VALUES, i.e.

INSERT INTO TableName (Column1)
VALUES (1),
       (2),
       (3)
       -- ... up to 1000 rows

But if you are using a SELECT statement to insert rows into a table, there is no such limit, something like...

INSERT INTO TableName (ColName)
SELECT Col FROM AnotherTable
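
The documentation also lists a derived table as one way to get past the 1000-row cap for literal values: wrap the VALUES list in the FROM clause so the statement becomes an INSERT ... SELECT. A minimal sketch, with illustrative table and column names:

INSERT INTO TableName (ColName)
SELECT v
FROM (VALUES (1),
             (2),
             (3)
             -- ... more rows; the 1000-row cap applies only to a VALUES list
             -- used directly in the INSERT, not to a derived table like this
     ) AS src(v);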

Now, on to your second question: what happens when an error occurs during an insert?

Well, if you are inserting rows using the multi-row VALUES construct:

INSERT INTO TableName (Column1)
VALUES (1),
       (2),
       (3)

In the above scenario, if any row causes an error, the whole statement is rolled back and none of the rows are inserted.

But if you insert the rows with a separate statement for each row, i.e. ...

INSERT INTO TableName (Column1) VALUES (1)
INSERT INTO TableName (Column1) VALUES (2)
INSERT INTO TableName (Column1) VALUES (3)

In the above case, each row is inserted by a separate statement, and if any row causes an error, only that specific INSERT statement is rolled back; the rest are inserted successfully.
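
A minimal sketch that demonstrates the difference, using a hypothetical table with a CHECK constraint to force an error on one of the rows:

-- hypothetical table; the CHECK constraint rejects values of 3 or more
CREATE TABLE dbo.DemoInsert (Column1 INT CHECK (Column1 < 3));

-- multi-row VALUES: the failing row (3) rolls back the whole statement,
-- so nothing from this statement is inserted
INSERT INTO dbo.DemoInsert (Column1) VALUES (1), (2), (3);

-- separate statements: only the failing statement is rolled back,
-- so rows 1 and 2 remain
INSERT INTO dbo.DemoInsert (Column1) VALUES (1);
INSERT INTO dbo.DemoInsert (Column1) VALUES (2);
INSERT INTO dbo.DemoInsert (Column1) VALUES (3);

SELECT Column1 FROM dbo.DemoInsert;  -- returns 1 and 2

Note that this reflects the default error behavior; with SET XACT_ABORT ON, or inside an explicit transaction that you roll back, the outcome changes.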

Maximum number of rows in one INSERT ALL

Although INSERT ALL does not have a theoretical maximum number of rows, in practice you will want to keep the number of rows to the low hundreds.
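
For reference, the multi-row INSERT ALL form under discussion looks like this (table and column names are illustrative):

INSERT ALL
    INTO myTable (cola, colb, colc) VALUES ('a', 'b', 'c')
    INTO myTable (cola, colb, colc) VALUES ('d', 'e', 'f')
    -- ... one INTO clause per row
SELECT * FROM dual;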

As I demonstrated in this answer, Oracle can easily handle hundreds of rows, but there's some magic number where the parse times start to grow exponentially. In older versions, things got really bad at 500 rows. With 19c, performance becomes an issue in the thousands of rows. Below are the results of a quick test:

# of Rows    Time in Seconds
---------    ---------------
     1000                0.4
     2000                1.7
     3000                  4
     4000                 12
     5000                 24

And for reasons I don't understand, the UNION ALL approach tends to work faster. So you might want to limit your number of rows to several hundred and use a statement like this:

INSERT INTO myTable (cola, colb, colc)
SELECT 'a', 'b', 'c' FROM dual UNION ALL
SELECT 'a', 'b', 'c' FROM dual UNION ALL
...
SELECT 'a', 'b', 'c' FROM dual;

SQL Server - Insert 2M+ records in SQL script with 7000 rows per insert

To properly answer your question, exactly as it is asked: no, there is not a way natively within the SQL interface to overcome the INSERT limitation. You will need to create a programmatic solution. I have listed some technologies in my comment, such as Python, PowerShell, and .NET. You could paste together a solution with BCP, BULK INSERT, SSIS, or some other BI tool. Here are some links for C# that talk about bulk inserting a large dataset:

Insert 2 million rows into SQL Server quickly

Bulk inserting with SQL Server (C#)

Also there was a similar question asked and the accepted answer here suggests using SQL Server Import Wizard:

Import Wizard - Bulk Insert

How can I insert 100000 rows in SQL Server?

Create a CSV file (or some file with a defined field delimiter and row delimiter) and use the "BULK INSERT" option to load the file into the database. The file can have 100000 rows; there won't be any problem loading a huge file using bulk upload.

http://msdn.microsoft.com/en-us/library/ms188365.aspx
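
A minimal BULK INSERT sketch along those lines; the file path, terminators, and table name are placeholders for your own:

BULK INSERT dbo.TargetTable
FROM 'C:\data\rows.csv'
WITH (
    FIELDTERMINATOR = ',',   -- field delimiter used in the file
    ROWTERMINATOR   = '\n',  -- row delimiter used in the file
    FIRSTROW        = 2      -- skip a header row, if present
);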

Not able to INSERT more than 1K records to temp table in SQL

Where the actual values for your insert are coming from could change what to do and how to do the INSERT part. Are you doing the insert from a single connection, or opening and closing a connection to use the data? This would change how you could do it as well.

You are using a table variable (the @ in front of the name shows that). See my notes and code below for what is wrong and how to fix it.

use DatabaseName 

SET NOCOUNT ON

-- this is creating a permanent table, not a temp table or table variable
CREATE Table TempRefundDetails (PolicyNumber NVARCHAR(10))

-- this inserts into a table variable, not a temp table
INSERT INTO @TempRefundDetails (PolicyNumber)



-- this will create a temp table
IF OBJECT_ID('tempdb..#TempRefundDetails') IS NOT NULL
DROP TABLE #TempRefundDetails

CREATE TABLE #TempRefundDetails (
PolicyNumber NVARCHAR(10)
)

INSERT INTO #TempRefundDetails (
PolicyNumber
)
VALUES (N'policy1'), (N'policy2') -- and so on (illustrative values), up to 1000 rows per VALUES list

Or, select from another table source directly into the table:

INSERT INTO #TempRefundDetails (
PolicyNumber
)
SELECT PolicyNumber
FROM SomeTableNameHere
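
If you actually wanted a table variable rather than a temp table, the declaration looks like this instead (a sketch; the same 1000-row limit per VALUES list still applies to the insert):

DECLARE @TempRefundDetails TABLE (PolicyNumber NVARCHAR(10));

INSERT INTO @TempRefundDetails (PolicyNumber)
VALUES (N'policy1'), (N'policy2');  -- illustrative values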

Inserting more than 1000 rows from Excel into SQLServer

Microsoft provides an import wizard with SQL Server. I've used it to migrate data from other databases and from spreadsheets. It is pretty robust and easy to use.


