How to Import a SQL Data File into SQL Server

How do I import a sql data file into SQL Server?

If you are talking about an actual database file (an .mdf file), you would attach it.

.sql files are typically run using SQL Server Management Studio (SSMS). They are basically saved SQL statements, so they could contain anything. You don't "import" them; more precisely, you "execute" them, even though the script may well insert data.

Also, to expand on Jamie F's answer, don't run a SQL file against your database unless you know what it is doing. SQL scripts can be as dangerous as an unchecked executable.

Importing SQL Server database from a .sql file

Create an empty database, or if you already have one, target that one.

Open an elevated command prompt and run:

sqlcmd -S SERVERNAME -d MYDATABASE -U USERNAME -P PASSWORD -i C:\path\mysqlfile.sql -o C:\path\results.txt

In short:

  • -S: your server name (or localhost)
  • -d: the database you are targeting
  • -U: the username
  • -P: the password
  • -i: the path to your .sql file
  • -o: where the log file will be saved, so if you have problems during the import you can debug them

How to import data from .txt file to populate a table in SQL Server

Using OPENROWSET

You can read text files using the OPENROWSET option (first you have to enable ad hoc distributed queries):
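Ad hoc distributed queries are disabled by default; they can be enabled with sp_configure like this:

```sql
-- Enable ad hoc distributed queries (required before these OPENROWSET examples will work)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
```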

Using Microsoft Text Driver

SELECT * FROM OPENROWSET('MSDASQL',
'Driver={Microsoft Text Driver (*.txt; *.csv)};
DefaultDir=C:\Docs\csv\;',
'SELECT * FROM PPE.txt')

Using OLEDB provider

SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Text;Database=C:\Docs\csv\;IMEX=1;',
    'SELECT * FROM PPE.txt') t

Using BULK INSERT

You can import the text file data into a staging table and then update your target table from it:

BULK INSERT dbo.StagingTable
FROM 'C:\PPE.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
)
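Once the staging table is loaded, the target table can be updated from it; a minimal sketch, using hypothetical target and column names (TargetTable, KeyColumn, SomeColumn are assumptions to adjust to your schema):

```sql
-- Hypothetical names: dbo.TargetTable, KeyColumn, SomeColumn
UPDATE t
SET    t.SomeColumn = s.SomeColumn
FROM   dbo.TargetTable AS t
INNER JOIN dbo.StagingTable AS s
        ON s.KeyColumn = t.KeyColumn;
```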

Import CSV file into SQL Server

Based on SQL Server CSV Import

1) The CSV data may contain commas inside a field (e.g. in a description), so how can the import handle that data?

Solution

If you're using a comma as the delimiter, then there is no way to differentiate between a comma as a field terminator and a comma in your data. I would use a different FIELDTERMINATOR, such as ||, which handles commas (and single slashes) in the data perfectly.
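A sketch of what that code could look like, reusing the staging-table pattern from above (the file path is an assumption):

```sql
-- Hypothetical path; note the two-character field terminator
BULK INSERT dbo.StagingTable
FROM 'C:\CSVData\data.txt'
WITH
(
    FIELDTERMINATOR = '||',
    ROWTERMINATOR = '\n'
)
```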

2) If the client creates the CSV from Excel, then fields that contain a comma are enclosed in " ... " (double quotes) [as in the example below], so how can the import handle this?

Solution

If you're using BULK INSERT, there is no direct way to handle the double quotes; the data will be inserted into the rows with the double quotes included. After inserting the data into the table, you can replace those double quotes with an empty string:

UPDATE YourTable
SET columnhavingdoublequotes = REPLACE(columnhavingdoublequotes, '"', '')
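Alternatively, on SQL Server 2017 and later, BULK INSERT can parse quoted CSV fields directly through the FORMAT and FIELDQUOTE options, so the cleanup step isn't needed:

```sql
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FORMAT = 'CSV',      -- RFC 4180-style parsing of quoted fields
    FIELDQUOTE = '"',    -- this is also the default quote character
    FIRSTROW = 2
)
```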

3) How do we track rows with bad data, which the import skips? (Does the import skip rows that are not importable?)

Solution

Rows that aren't loaded into the table because of invalid data or format can be handled using the ERRORFILE option: specify an error file name, and the rows that fail will be written to that file. The code should look like this:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
ERRORFILE = 'C:\CSVDATA\SchoolsErrorRows.csv',
TABLOCK
)

Import text file into SQL server table using query or stored procedure

It is better to do this type of task with SSIS, but sometimes SSIS brings extra dependencies for the developers, so they try to do it using SQL Server alone.

You can do this with the following steps. It will dynamically read your text file from a specific location and also move it to another location after reading.

Suppose you have the following DB and table.

CREATE DATABASE [Codefirst]
GO

USE [Codefirst]
GO

CREATE TABLE [dbo].[tbl_NewVendorData](
[AccountNumber] [nvarchar](15) NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[ModificationDate] [datetime] NOT NULL,
PRIMARY KEY CLUSTERED
(
[AccountNumber] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

And suppose you have the following data in a text file that needs to be inserted into the table; after reading it, you also need to move the file to another location. Just copy and paste the following data into your text file.

AC001,Raju Ahmed,1989-11-18 00:00:00.000;
AC002,Robin Hossain,1991-12-08 00:00:00.000;
AC003,Raju Ahmed 2,1989-11-18 00:00:00.000;
AC004,Raju Ahmed 3,1989-11-10 00:00:00.000;
AC005,Raju Ahmed 4,1989-11-12 00:00:00.000;
AC006,Raju Ahmed 5,1989-11-19 00:00:00.000;
AC007,Raju Ahmed 6,1989-11-02 00:00:00.000;
AC008,Raju Ahmed 7,1989-11-10 00:00:00.000;
AC009,Raju Ahmed 8,1989-05-18 00:00:00.000;
AC010,Raju Ahmed 9,1989-10-13 00:00:00.000;
AC011,Raju Ahmed 10,1989-01-16 00:00:00.000;
AC012,Raju Ahmed 11,1989-01-14 00:00:00.000;
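The load-and-move steps themselves could be sketched as follows. The file path and archive folder are assumptions, and moving the file relies on xp_cmdshell being enabled (which has security implications, so consider that carefully):

```sql
-- Hypothetical file locations; each sample row ends with a semicolon before the line break
BULK INSERT [dbo].[tbl_NewVendorData]
FROM 'C:\Import\NewVendorData.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = ';\n'
)

-- Move the processed file to an archive folder (requires xp_cmdshell to be enabled)
EXEC master..xp_cmdshell 'move C:\Import\NewVendorData.txt C:\Import\Archive\'
```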

How to import .sql file into SQL Server Express

You can use Management Studio Express Edition. You can download the latest version from Microsoft; it will work against SQL Server Express 2005, 2008 and 2008 R2.

If you don't want to install SSMSE then you can use sqlcmd at a command prompt, e.g. something like this (assuming Windows auth and an instance called "SQLEXPRESS"):

sqlcmd -S .\SQLEXPRESS -E -i "C:\path\file.sql"

How to import data into SQL Server without using import and export wizard

Use Dynamic SQL and loop through the files.

DECLARE @intFlag INT
SET @intFlag = 1

DECLARE @fullpath1 varchar(1000)
DECLARE @cmd1 nvarchar(1000)

WHILE (@intFlag <= 48)
BEGIN

    PRINT @intFlag

    -- Build the file path; the file name contains the date in yyyymmdd format
    SELECT @fullpath1 = '''\\your_path_here\' + CONVERT(varchar, GETDATE() - @intFlag, 112) + '_.txt'''

    -- Build and execute the dynamic BULK INSERT statement
    SELECT @cmd1 = 'bulk insert [dbo].[Daily] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 2, ROWTERMINATOR=''0x0a'')'
    EXEC (@cmd1)

    SET @intFlag = @intFlag + 1   -- without this, the loop never terminates
END

I used this technique several years ago, for a daily data load, and it worked perfectly fine. In this example, the file name has the date in it, so I'm looping through all files, identifying each by the date.

Importing JSON data from a JSON file into SQL Server

Try an OUTER APPLY, and note the use of AS JSON for Rows. I'm just bringing the first attribute back.

-- Sample JSON shape assumed for illustration
DECLARE @JSON nvarchar(max) = N'{"Rows":[{"a":"first","b":"second"}]}'

SELECT a
FROM OPENJSON(@JSON)
WITH (Rows nvarchar(max) AS JSON) Rows
OUTER APPLY OPENJSON(Rows.Rows)
WITH (a nvarchar(254) '$.a') MyObject

