Schedule Import CSV to SQL Server 2014 Express Edition

You can use Windows Task Scheduler to run a bcp command automatically. The bulk copy program (bcp) utility imports or exports data between SQL Server tables and files. For example, to import a CSV file into a table in SQL Server, you can use a command like this:

bcp.exe dbo.MyTable in "C:\Some Folder\Data.csv" -c -t, -S MYPC\SQLEXPRESS -d MyDatabase -U LoginName -P StrongP@ssw0rd

Where:

  • dbo.MyTable is the schema and table name that the data should be imported into.
  • in sets the direction (put data into the database, or get data out of it).
  • "C:\Some Folder\Data.csv" is the path to the file holding the data to be imported.
  • -c tells bcp to treat the data as character data, and -t, sets the field terminator to a comma; without a format option like -c, bcp prompts interactively for each column, which won't work in a scheduled task.
  • MYPC\SQLEXPRESS is the computer and SQL Server instance name.
  • MyDatabase is the name of the database that contains dbo.MyTable.
  • LoginName and StrongP@ssw0rd are the credentials used to connect to the server (or use -T instead of -U and -P to connect with Windows Authentication).

Then create a new scheduled task (Start -> Task Scheduler -> Create Basic Task) and set a schedule according to your requirements (e.g. daily at 3:00 AM) to run the command above.
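
As a sketch (the batch file path and task name here are assumptions to adapt), you could save the command into a batch file and register it with schtasks instead of using the wizard:

rem C:\Scripts\ImportCsv.bat
bcp.exe dbo.MyTable in "C:\Some Folder\Data.csv" -c -t, -S MYPC\SQLEXPRESS -d MyDatabase -U LoginName -P StrongP@ssw0rd

rem Register the batch file to run daily at 3:00 AM
schtasks /Create /SC DAILY /ST 03:00 /TN "ImportCsv" /TR "C:\Scripts\ImportCsv.bat"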

Import CSV file into SQL Server

Basic SQL Server CSV import

1) The CSV file data may have , (comma) in between (e.g. in a description field), so how can the import handle this data?

Solution

If you're using , (comma) as a delimiter, then there is no way to differentiate between a comma as a field terminator and a comma in your data. I would use a different FIELDTERMINATOR, like ||, which handles commas (and single slashes) in the data perfectly.
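
A sketch of what that would look like (the table and file names are reused from the error-file example further below):

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '||',  --Fields in the file are separated by ||
    ROWTERMINATOR = '\n',
    TABLOCK
)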

2) If the client creates the CSV from Excel, then values that contain a comma are enclosed within " ... " (double quotes) [as in the example below], so how can the import handle this?

Solution

If you're using BULK INSERT, there is no way to handle the double quotes; the data will be inserted into the rows with the double quotes included. After inserting the data into the table, you can replace those double quotes with ''.

UPDATE table
SET columnhavingdoublequotes = REPLACE(columnhavingdoublequotes, '"', '')
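
As an aside, on SQL Server 2017 and later (not the 2014 edition discussed above), BULK INSERT can parse quoted fields natively via FORMAT = 'CSV', which avoids the cleanup step:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH (FORMAT = 'CSV', FIRSTROW = 2);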

3) How do we track whether some rows have bad data that the import skips? (Does the import skip rows that are not importable?)

Solution

Rows that can't be loaded into the table because of invalid data or format can be handled using the ERRORFILE property. Specify an error file name, and the rows that have errors will be written to that file. The code should look like:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  --CSV field delimiter
    ROWTERMINATOR = '\n',   --Used to shift control to the next row
    ERRORFILE = 'C:\CSVData\SchoolsErrorRows.csv',
    TABLOCK
)
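
One caveat from the BULK INSERT documentation: the file named in ERRORFILE must not already exist when the statement runs, or the import fails, so delete or rename it between runs.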

Having a data upload get picked up by SQL Server?

The best way would be to use SSIS, as Brad suggested.

But if you are not familiar with SSIS packages, then you can opt for a simple SQL Server job to upload data from CSV files periodically. Please have a look at the links below:

Schedule import csv to SQL Server 2014 Express edition

https://answers.laserfiche.com/questions/74591/Auto-Import-csv-into-SQL-Table

Upload CSV file to SQL Server

First off, you don't need any programming for this. You can upload CSV files directly into a SQL Server database with the SQL Server management tools. However, if you really need to do it through programming, just read below.

Personally, I think this approach is the most efficient and easiest way to do it through programming.

In general, you can achieve it in two steps:

The 1st step is to read the CSV file and hold the records in a DataTable.

The 2nd step is to store the retrieved DataTable in a SQL Server table as a bulk entry.

This is a function that returns the CSV file data as a DataTable. Call it, keep the result in memory, and you can do whatever you want with it.

// Requires: using System.Data; and using Microsoft.VisualBasic.FileIO;
// (TextFieldParser lives in the Microsoft.VisualBasic assembly)
private static DataTable GetDataTableFromCSVFile(string csv_file_path)
{
    DataTable csvData = new DataTable();
    try
    {
        using (TextFieldParser csvReader = new TextFieldParser(csv_file_path))
        {
            csvReader.SetDelimiters(new string[] { "," });
            csvReader.HasFieldsEnclosedInQuotes = true;

            // The first line holds the column names
            string[] colFields = csvReader.ReadFields();
            foreach (string column in colFields)
            {
                DataColumn datecolumn = new DataColumn(column);
                datecolumn.AllowDBNull = true;
                csvData.Columns.Add(datecolumn);
            }

            while (!csvReader.EndOfData)
            {
                string[] fieldData = csvReader.ReadFields();
                // Making empty values null
                for (int i = 0; i < fieldData.Length; i++)
                {
                    if (fieldData[i] == "")
                    {
                        fieldData[i] = null;
                    }
                }
                csvData.Rows.Add(fieldData);
            }
        }
    }
    catch (Exception ex)
    {
        // The exception is swallowed here, so callers must check for null
        return null;
    }
    return csvData;
}

SqlBulkCopy - use this function to insert the retrieved DataTable into a SQL Server table.

static void InsertDataIntoSQLServerUsingSQLBulkCopy(DataTable csvFileData)
{
    using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=yourDB;Integrated Security=SSPI;"))
    {
        dbConnection.Open();
        using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
        {
            s.DestinationTableName = "Your table name";
            // Map each source column to the identically named destination column
            foreach (var column in csvFileData.Columns)
                s.ColumnMappings.Add(column.ToString(), column.ToString());
            s.WriteToServer(csvFileData);
        }
    }
}
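
Putting the two together might look like this (the file path is a placeholder; the destination table and connection string in the functions above must match your environment):

DataTable csvData = GetDataTableFromCSVFile(@"C:\CSVData\Schools.csv");
if (csvData != null)
{
    InsertDataIntoSQLServerUsingSQLBulkCopy(csvData);
}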


Import CSV into SQL Server (including automatic table creation)

SQL Server Management Studio provides an Import/Export wizard, which has an option to automatically create tables.

You can access it by right-clicking on the database in Object Explorer and selecting Tasks -> Import Data...

From there the wizard should be self-explanatory and easy to navigate: you choose your CSV as the source, pick the desired destination, configure the columns, and run the package.

If you need detailed guidance, there are plenty of guides online, here is a nice one:
http://www.mssqltips.com/sqlservertutorial/203/simple-way-to-import-data-into-sql-server/

Best practices for importing large CSV files

I had this exact same problem about two weeks ago. I wrote some .NET code to do row-by-row inserts, and by my calculations, with the amount of data I had, it would have taken around a week to do it that way.

So instead I used a StringBuilder to create one huge query and sent it to my relational system all at once. It went from taking a week to taking 5 minutes. Now I don't know what relational system you are using, but with enormous queries you'll probably have to tweak your max_allowed_packet parameter or its equivalent.
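
A minimal sketch of that batching idea against SQL Server, assuming a two-column destination table named MyTable and rows already parsed from the CSV (the table, columns, and helper are illustrative, not the original author's code):

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

static void BatchedInsert(SqlConnection conn, IEnumerable<string[]> rows)
{
    var sb = new StringBuilder();
    int rowsInBatch = 0;
    foreach (string[] f in rows)
    {
        sb.Append(rowsInBatch == 0 ? "INSERT INTO MyTable (Col1, Col2) VALUES " : ",");
        // Double up single quotes to escape them inside T-SQL string literals
        sb.AppendFormat("('{0}','{1}')", f[0].Replace("'", "''"), f[1].Replace("'", "''"));
        if (++rowsInBatch == 1000)  // SQL Server allows at most 1000 rows per VALUES list
        {
            ExecuteBatch(conn, sb);
            rowsInBatch = 0;
        }
    }
    if (rowsInBatch > 0)
        ExecuteBatch(conn, sb);
}

static void ExecuteBatch(SqlConnection conn, StringBuilder sb)
{
    using (var cmd = new SqlCommand(sb.ToString(), conn))
        cmd.ExecuteNonQuery();
    sb.Clear();
}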

Writing a CSV file into a SQL Server database using Python

Consider building the query dynamically to ensure the number of placeholders matches your table and CSV file format. Then it's just a matter of ensuring your table and CSV file are correct, instead of checking that you typed enough ? placeholders in your code.

The following example assumes

  1. CSV file contains column names in the first line
  2. Connection is already built
  3. File name is test.csv
  4. Table name is MyTable
  5. Python 3
...
with open('test.csv', 'r') as f:
    reader = csv.reader(f)
    columns = next(reader)
    query = 'insert into MyTable({0}) values ({1})'
    query = query.format(','.join(columns), ','.join('?' * len(columns)))
    cursor = connection.cursor()
    for data in reader:
        cursor.execute(query, data)
    cursor.commit()

If column names are not included in the file:

...
with open('test.csv', 'r') as f:
    reader = csv.reader(f)
    data = next(reader)
    query = 'insert into MyTable values ({0})'
    query = query.format(','.join('?' * len(data)))
    cursor = connection.cursor()
    cursor.execute(query, data)
    for data in reader:
        cursor.execute(query, data)
    cursor.commit()
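
If the file is large, calling execute once per row can be slow. pyodbc's executemany, combined with its fast_executemany option (available in recent pyodbc versions), batches the round trips; a sketch under the same assumptions as the first example:

...
with open('test.csv', 'r') as f:
    reader = csv.reader(f)
    columns = next(reader)
    query = 'insert into MyTable({0}) values ({1})'
    query = query.format(','.join(columns), ','.join('?' * len(columns)))
    cursor = connection.cursor()
    cursor.fast_executemany = True  # pyodbc-specific speed-up for executemany
    cursor.executemany(query, list(reader))
    cursor.commit()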

Import Excel Spreadsheet Data to an EXISTING sql table?

Saudate, I ran across this while looking for a different problem. You most definitely can use the SQL Server Import wizard to import data into a new table. Of course, you do not wish to leave that table in the database, so my suggestion is that you import into a new table, then script the data in the query manager to insert into the existing table. You can add a line to drop the temp table created by the import wizard as the last step, upon successful completion of the script.

I believe your original issue is in fact related to 64-bit SQL Server being used with 32-bit Excel: those drivers don't play well together. I ran into a very similar issue when first using 64-bit Excel.


