Bulk Insert Using Stored Procedure

Bulk insert using stored procedure

There's nothing wrong with your stored procedure code; the point is that the BULK INSERT command cannot accept a file name as a variable.

This does work:

BULK INSERT ZIPCodes
FROM 'e:\5-digit Commercial.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

but this never works - within a stored proc or not:

DECLARE @filename VARCHAR(255)
SET @filename = 'e:\5-digit Commercial.csv'

BULK INSERT ZIPCodes
FROM @filename   -- fails: BULK INSERT requires a string literal here, not a variable
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

So you just cannot do it this way, unfortunately. You could consider building up your BULK INSERT statement as a string (with a fixed file name) and then executing it as dynamic SQL - but I don't really see any other solution.

DECLARE @filepath nvarchar(500)
SET @filepath = N'e:\5-digit Commercial.csv'

DECLARE @bulkinsert NVARCHAR(2000)

-- build the BULK INSERT statement as a string, with the file name embedded
SET @bulkinsert =
    N'BULK INSERT ZIPCodes FROM ''' +
    @filepath +
    N''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'

EXEC sp_executesql @bulkinsert
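
Be aware that this concatenates @filepath into the statement unescaped; if the path could ever contain a single quote or come from user input, escape it first, as the next answer shows.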

Using Dynamic SQL for a Bulk Insert in a Stored Procedure

You need to inject the values safely; you can't simply replace literals with variables. Because the file path can be longer than 128 characters, I use REPLACE rather than QUOTENAME to escape it. I also assume that the delimiter is only one character long:

CREATE PROCEDURE dbo.transfer_data
    @file_path nvarchar(255),
    @s_name    sysname = N'dbo',
    @t_name    sysname,
    @delimiter nchar(1)
AS
BEGIN

    DECLARE @SQL  nvarchar(MAX),
            @CRLF nchar(2) = NCHAR(13) + NCHAR(10);

    -- QUOTENAME protects the schema and table names; REPLACE doubles any
    -- single quotes in the file path, which could exceed QUOTENAME's
    -- 128-character input limit
    SET @SQL = N'BULK INSERT ' + QUOTENAME(@s_name) + N'.' + QUOTENAME(@t_name) + @CRLF +
               N'FROM N''' + REPLACE(@file_path, '''', '''''') + N'''' + @CRLF +
               N'WITH (FIELDTERMINATOR = N' + QUOTENAME(@delimiter, '''') + N',' + @CRLF +
               N'      ROWTERMINATOR = ''\n'');';

    --PRINT @SQL;
    EXEC sys.sp_executesql @SQL;
END;
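
For example, a hypothetical call (the path, table name, and delimiter here are placeholders):

EXEC dbo.transfer_data
    @file_path = N'e:\5-digit Commercial.csv',
    @t_name    = N'ZIPCodes',
    @delimiter = N',';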

Fastest way to bulk insert in SQL Table from C# SQLCLR Stored Procedure

You have a few options:

  1. Create a User-Defined Table Type (UDTT) and a T-SQL stored procedure that accepts that UDTT as a parameter (i.e. a Table-Valued Parameter (TVP)); a minimal T-SQL sketch of this follows the list. Since you already have the data in a collection, absolutely do NOT create a separate DataTable to transfer that data into, as that is just a waste of time, memory, and CPU. You can stream the data in its native collection directly into the TVP by creating a method that returns IEnumerable<SqlDataRecord>, uses yield return;, and is used as the value for the SqlParameter that represents the UDTT. I have two examples of this here on SO:

    • How can I insert 10 million records in the shortest time possible?
    • Pass Dictionary<string,int> to Stored Procedure T-SQL
  2. If there is only 1 table to populate (hence 1 result set), then you can output the collection as a result set from the stored procedure and use it as follows:

    INSERT INTO schema.table (column_list)
    EXEC schema.SQLCLR_stored_procedure @param1, ....;
  3. If there is only 1 table to populate (hence 1 result set), and if you aren't doing anything over the Context Connection that violates the rules of scalar functions, then you can try changing the SQLCLR stored procedure into a TVF that returns IEnumerable and streams rows out via yield return;. I wrote an article years ago with an example of this on SQL Server Central: CLR Table-Valued Function Example with Full Streaming (STVF / TVF)
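
Here is a minimal sketch of option 1's T-SQL side; the type, procedure, and table names (dbo.IdList, dbo.ImportIds, dbo.Product) are assumptions for illustration, and the TVP would be filled by the SqlDataRecords streamed from the SQLCLR code:

CREATE TYPE dbo.IdList AS TABLE (ID int NOT NULL);
GO

CREATE PROCEDURE dbo.ImportIds
    @Rows dbo.IdList READONLY   -- the TVP; READONLY is required for table-valued parameters
AS
BEGIN
    INSERT INTO dbo.Product (ID)
    SELECT ID
    FROM @Rows;
END;
GO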

Bulk insert file path as stored procedure parameter

Use dynamic SQL to inject the file name variable into a string containing the bulk insert statement, and then use sp_executesql to execute it. You might want to add some error checking to verify that the path is valid, and so on.

CREATE PROCEDURE [importFile] (@filePath VARCHAR(MAX))
AS
BEGIN
    CREATE TABLE #Temp
    (
        row1 int,
        row2 varchar(5),
        row3 bit
    )

    DECLARE @SQL NVARCHAR(MAX)
    SET @SQL = N'
    BULK INSERT #Temp
    FROM ''' + @filePath + '''
    WITH (
        FIELDTERMINATOR = '','',
        ROWTERMINATOR = ''\n''
    )'

    -- ...

    EXEC sp_executesql @SQL
END

-- to run it:
EXEC importFile 'd:\test.csv'

Stored procedure / bulk insert difficulties

Your double quotes around the file name, FIELDTERMINATOR, and ROWTERMINATOR values should be single quotes (doubled up, since you're inside dynamic SQL):

FROM ''';
select @sql = @sql + @filePath;
select @sql = @sql + ''' WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = '','',
    ROWTERMINATOR = ''\n''
)';

Batch insert/update using stored procedure

I've done something like this in the past:

CREATE PROCEDURE InsertProductIds(@ProductIds xml) AS
BEGIN

    INSERT INTO Product (ID)
    SELECT ParamValues.ID.value('.', 'VARCHAR(20)')
    FROM @ProductIds.nodes('/Products/id') as ParamValues(ID)

END

Obviously this is just a single-column table, but the XML approach applies to multi-column tables as well (see the sketch after the example call). You then do this:

EXEC InsertProductIds @ProductIds='<Products><id>3</id><id>6</id></Products>'
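
For a multi-column table, each child element carries its own fields. A minimal sketch, assuming a hypothetical <product> element with id and name children and a Name column on the Product table:

DECLARE @Products xml = N'<Products>
    <product><id>3</id><name>Widget</name></product>
    <product><id>6</id><name>Gadget</name></product>
</Products>';

INSERT INTO Product (ID, Name)
SELECT p.x.value('(id)[1]', 'VARCHAR(20)'),
       p.x.value('(name)[1]', 'VARCHAR(50)')
FROM @Products.nodes('/Products/product') AS p(x);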

Import text file into SQL server table using query or stored procedure

This type of task is better done with SSIS, but SSIS brings extra dependencies for developers, so it is sometimes done directly in SQL Server instead.

You can do it with the following steps: dynamically read the text file from a specific location, then move it to another location after reading.

Suppose you have the following DB and table.

CREATE DATABASE [Codefirst]
GO

USE [Codefirst]
GO

CREATE TABLE [dbo].[tbl_NewVendorData](
    [AccountNumber]    [nvarchar](15) NOT NULL,
    [Name]             [nvarchar](50) NOT NULL,
    [ModificationDate] [datetime]     NOT NULL,
    PRIMARY KEY CLUSTERED
    (
        [AccountNumber] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

And suppose you have delimited text data in a file whose rows need to be inserted into this table; after it has been read, the file also needs to be moved to another location.
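
A minimal sketch of that load-then-move step, assuming a comma-delimited file at C:\Import\NewVendorData.txt and that xp_cmdshell is enabled (the paths and the use of xp_cmdshell are illustrative assumptions):

DECLARE @file    nvarchar(260) = N'C:\Import\NewVendorData.txt',
        @archive nvarchar(260) = N'C:\Archive\NewVendorData.txt',
        @sql     nvarchar(max),
        @cmd     nvarchar(4000);

-- Build and run the BULK INSERT dynamically, as in the earlier answers
SET @sql = N'BULK INSERT [dbo].[tbl_NewVendorData] FROM ''' +
           REPLACE(@file, '''', '''''') +
           N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC sys.sp_executesql @sql;

-- Move the file to the archive location after a successful load
SET @cmd = N'MOVE /Y "' + @file + N'" "' + @archive + N'"';
EXEC master..xp_cmdshell @cmd;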