Import CSV File into SQL Server

SQL Server CSV Import

1) The CSV file data may contain a , (comma) inside a field (for example, a description column), so how can the import handle that data?

Solution

If you're using , (comma) as the delimiter, there is no way to differentiate between a comma acting as a field terminator and a comma inside your data. I would use a different FIELDTERMINATOR such as ||, which handles commas and single slashes in the data without ambiguity. The code would look like the sketch below.
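A minimal sketch of that, assuming the same SchoolsTemp table and C:\CSVData\Schools.csv file used in the error-file example further down, and a file that was actually saved with || as the field delimiter:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '||',  --pipe pair used as the field delimiter instead of a comma
    ROWTERMINATOR = '\n',
    TABLOCK
)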

2) If the client creates the CSV from Excel, then fields that contain a comma are enclosed in " ... " (double quotes), so how can the import handle this?

Solution

If you're using BULK INSERT, there is no way to strip the double quotes during the import; the data will be inserted into the rows with the double quotes included. After inserting the data into the table, you can replace those double quotes with '':

UPDATE table
SET columnhavingdoublequotes = REPLACE(columnhavingdoublequotes, '"', '')

3) How do we track rows with bad data that the import skips? (Does the import skip rows that are not importable?)

Solution

Rows that aren't loaded into the table because of invalid data or format can be handled with the ERRORFILE option: specify an error file name and any rows that fail will be written to that file. The code should look like this:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',   --CSV field delimiter
    ROWTERMINATOR = '\n',    --moves control to the next row
    ERRORFILE = 'C:\CSVData\SchoolsErrorRows.csv',
    TABLOCK
)

How to import data from a CSV file into a SQL Server database using C#

I maintain a package Sylvan.Data.Csv that makes it very easy to bulk import CSV data into SQL Server, assuming the shape of your CSV file matches the target table.

Here is some code that demonstrates how to do it:

SqlConnection conn = ...;

// Get the schema for the target table
var cmd = conn.CreateCommand();
cmd.CommandText = "select top 0 * from User_Shop";
var reader = cmd.ExecuteReader();
var tableSchema = reader.GetColumnSchema();
// close the schema-only reader so the connection is free for the bulk copy below
reader.Close();

// apply the schema of the target SQL table to the CSV data.
var options =
    new CsvDataReaderOptions {
        Schema = new CsvSchema(tableSchema)
    };

using var csv = CsvDataReader.Create("dataImport.csv", options);

// use sql bulk copy to bulk insert the data
var bcp = new SqlBulkCopy(conn);
bcp.BulkCopyTimeout = 0;
bcp.DestinationTableName = "User_Shop";
bcp.WriteToServer(csv);

On certain .NET Framework versions GetColumnSchema might not exist, or might throw NotSupportedException. The Sylvan.Data v0.2.0 library can be used to work around this: call the older GetSchemaTable API, then use the Sylvan.Data.Schema type to convert it to the new-style IReadOnlyCollection<DbColumn> schema:

DataTable schemaDT = reader.GetSchemaTable();
var tableSchema = Schema.FromSchemaTable(schemaDT);

