Copy Data Between Two Server Instances

SQL - Linked Server

If both servers are SQL Server, you can set up a linked server - I would suggest using an SQL login for security there.
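Setting up the linked server and mapping logins might look like this, run on server1 (server name, login name, and password below are placeholders - substitute your own):

```sql
-- Run on server1; registers server2 as a linked server.
EXEC sp_addlinkedserver @server = N'server2', @srvproduct = N'SQL Server';

-- Map all local logins to a dedicated SQL login on server2.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'server2',
    @useself    = 'FALSE',
    @locallogin = NULL,
    @rmtuser    = N'linked_user',   -- placeholder SQL login on server2
    @rmtpassword = N'***';          -- placeholder password
```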

Then you can simply perform

insert into server2.database1.dbo.table1 
select * from server1.database1.dbo.table1 where col1 = 'X'

If you run the query in SQL Server Management Studio connected to server1, with the current database set to database1, you won't need the prefix

server1.database1.dbo.

Also, the linked server would be configured on server1, to connect to server2 (rather than the other way around).

If you have the correct OLE DB drivers, this method can also work between different types of RDBMS (i.e. non-SQL Server ones).

OPENQUERY

Note: Beware of relying on linked servers too heavily, especially for filtering and for joins across servers: they can require the remote data to be read in full before any conditions are applied. Linked servers bring many complications - even version differences between the two servers can cause headaches - so read up before you embark.

I recommend the OPENQUERY command for SQL Server to work around such limitations. Here's an example, but you should find help specific to your needs through further research:

insert into server2.database1.dbo.table1 
select * from OPENQUERY(server1, 'select * from database1.dbo.table1 where col1 = ''X''');

The above code is more efficient, filtering the data on the source server (and using available indexes), before pumping the data through, saving bandwidth/time/resources of both the source and destination servers.

(Also note that the doubled single quote '' is an escape sequence that produces a single quote inside the string literal.)

SQL - Temporarily on the same server

Still within the SQL query domain: if you can temporarily move the server2 database onto server1, you won't need the linked server at all. The database would need to be renamed while co-located on server1 (note the underscore), which would enable:

insert into server2_database1.dbo.table1 
select * from database1.dbo.table1

Co-location can be achieved in various ways; I suggest shrinking the database files before proceeding with either:

  1. Backup/Restore - back up on server2, restore on server1 (under a different name), perform the insert as described above but without the server1/server2 prefixes. Then reverse: back up on server1, restore on server2.
  2. Detach/Attach - rename the database, detach on server2, (compress,) copy the files to server1, (decompress,) attach on server1, perform the insert. Then reverse.

In either case, SQL Server version differences can be a barrier: if server1 runs a lower version, both the backup/restore and detach/attach methods will likely fail. This can be worked around by moving the server1 database to server2 instead, which may or may not be more suitable.
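A rough sketch of the backup/restore round trip (file paths are placeholders, and the logical file names database1/database1_log are assumptions - check yours with RESTORE FILELISTONLY):

```sql
-- On server2: back up database1.
BACKUP DATABASE database1 TO DISK = N'C:\temp\database1.bak';

-- On server1: restore it under a new name, relocating the physical files.
RESTORE DATABASE server2_database1
FROM DISK = N'C:\temp\database1.bak'
WITH MOVE N'database1'     TO N'C:\data\server2_database1.mdf',
     MOVE N'database1_log' TO N'C:\data\server2_database1_log.ldf';
```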

Other Methods

Non-SQL/T-SQL methods may be suitable when environmental factors don't favor the approaches above. And given the correct access (OLE DB drivers, etc.), these methods can also work between different types of RDBMS (i.e. non-SQL Server ones) and other data sources (such as XML, flat files, Excel spreadsheets...).

  • SSIS explicitly, with Business Intelligence Development Studio - a direct data pump, or via a delimited-file intermediary.
  • SSIS implicitly, through SQL Server Management Studio: right-click database1 on server1 > Tasks > Export Data, then complete the wizard. May work direct to server2, or via a flat-file intermediary.
  • .NET programming with SqlBulkCopy (I believe the SSIS data pump uses such an object). I can go into more detail about this if it interests you.

E.g. SqlBulkCopy (C# sketch):

SqlConnection c = new SqlConnection("connectionStringForServer1Database1Here");
SqlConnection c2 = new SqlConnection("connectionStringForServer2Database1Here");
c.Open();
SqlCommand cm = new SqlCommand("select * from table1;", c);
using (SqlDataReader reader = cm.ExecuteReader())
{
    c2.Open();
    // bulk copy targets the destination connection (c2), not the source
    using (SqlBulkCopy bc = new SqlBulkCopy(c2))
    {
        bc.DestinationTableName = "table1";
        bc.WriteToServer(reader);
    }
}

Pretty cool huh? If speed/efficiency is a concern, SqlBulkCopy-based approaches (such as SSIS) are the best.

Update - Modifying the destination table

If you need to update the destination table, I recommend that you:

  1. Write to a staging table on the destination database: either a temporary table, or a proper table which you truncate before and after the process. The latter is preferable, but the former may be your only choice if you don't have CREATE TABLE rights. You can perform the transfer using any of the above options.
  2. Run a MERGE INTO command as per your requirements from the staging table to the destination table. This can Insert, Update and Delete as required very efficiently.

The whole process could be enhanced with a sliding window (changes since last checked), applying only recently changed source rows to the destination. This complicates the process, so accomplish the simpler version first; after completing a sliding-window version, you could still run the full update periodically to verify that the sliding window isn't missing anything.
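A minimal MERGE sketch from a staging table to the destination (table, key, and column names below are placeholders):

```sql
MERGE INTO dbo.table1 AS target
USING dbo.table1_staging AS source
    ON target.id = source.id
WHEN MATCHED THEN
    UPDATE SET target.col1 = source.col1
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, col1) VALUES (source.id, source.col1)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;   -- remove destination rows absent from staging, if that's desired
```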

How can I copy data records between two instances of a SQL Server database

If your production SQL Server and test SQL Server can talk, you can just do it with a SQL insert statement.

First, run the following on your test server:

EXEC sp_addlinkedserver 'PRODUCTION_SERVER_NAME'

Then just create the insert statement:

INSERT INTO [PRODUCTION_SERVER_NAME].DATABASE_NAME.dbo.TABLE_NAME (Names_of_Columns_to_be_inserted)
SELECT Names_of_Columns_to_be_inserted
FROM TABLE_NAME

Export table data from one SQL Server to another

Try this:

  1. create your table on the target server using your scripts from the Script Table As / Create Script step

  2. on the target server, you can then issue a T-SQL statement:

    INSERT INTO dbo.YourTableNameHere
    SELECT *
    FROM [SourceServer].[SourceDatabase].dbo.YourTableNameHere

This should work just fine.

Copy data between two SQL Server databases in C#

Why write code to do this?

The single fastest and easiest way is just to use SQL Server's bcp.exe utility (bcp: Bulk Copy Program).

  • Export the data from the source server.
  • Zip it or tar it if it needs it.
  • FTP it over to where it needs to go, if you need to move it to another box.
  • Import it into the destination server.
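The export/import steps might look like this with bcp (server, database, and file names are placeholders; -T uses Windows authentication, -n keeps SQL Server's native data format):

```shell
# Export from the source server
bcp database1.dbo.table1 out table1.dat -S SourceServer -T -n

# Import into the destination server
bcp database1.dbo.table1 in table1.dat -S DestServer -T -n
```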

You can accomplish the same thing via SQL Server Management Studio in a number of different ways. Once you've defined the task, it can be saved and it can be scheduled.

You can use SQL Server's PowerShell objects to do this as well.

If you're set on doing it in C#:

  • write your select query to get the data you want from the source server.
  • execute that and populate a temp file with the output.
  • execute SQL Server's bulk insert statement against the destination server to insert the data.

Note: For any of these techniques, you'll need to deal with identity columns if the target table has them. You'll also need to deal with key collisions. It is sometimes easier to bulk load the data into a perma-temp (staging) table first, and then apply the prerequisite transforms and manipulations to get it to where it needs to go.
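For example, if the target table has an identity column, preserving the source values while skipping key collisions during the final insert from a staging table might look like this (all names below are placeholders):

```sql
SET IDENTITY_INSERT dbo.TargetTable ON;

INSERT INTO dbo.TargetTable (Id, Col1)
SELECT s.Id, s.Col1
FROM dbo.StagingTable AS s
WHERE NOT EXISTS   -- skip rows whose key already exists in the target
    (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Id = s.Id);

SET IDENTITY_INSERT dbo.TargetTable OFF;
```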

Copy data of each table from server A to server B dynamically using SSIS

You can use C# SMO objects from a Script Task to do the transfer for a dynamic table list. The SSIS loop won't be necessary. The SSIS object variable (GlobalListOfTables) will need to be included in the ReadOnlyVariables field on the Script Task. Make sure to add Microsoft.SqlServer.SmoExtended and Microsoft.SqlServer.ConnectionInfo references to the Script Task in addition to those listed below.

using System.Data;
using Microsoft.SqlServer.Management.Smo;
using System.Collections.Generic;
using System.Data.OleDb;

string databaseName = "DatabaseName";
List<string> tableNames = new List<string>();
DataTable dt = new DataTable();
OleDbDataAdapter dataAdapter = new OleDbDataAdapter();

//get table names from the SSIS object variable (GlobalListOfTables)
dataAdapter.Fill(dt, Dts.Variables["User::GlobalListOfTables"].Value);

//populate list
foreach (DataRow dr in dt.Rows)
{
    tableNames.Add(dr[0].ToString());
}

//create source server object
Server srcServ = new Server(@"SourceServerName");
srcServ.ConnectionContext.LoginSecure = true;
srcServ.ConnectionContext.StatementTimeout = 600;
srcServ.ConnectionContext.Connect();

//define source database as smo object
Database sourceDatabase = srcServ.Databases["SourceDatabaseName"];

Transfer transfer = new Transfer();
transfer.Database = sourceDatabase;

//set destination server and database
transfer.DestinationServer = @"DestinationServerName";
transfer.DestinationDatabase = databaseName;

//overwrite objects if they exist
transfer.DropDestinationObjectsFirst = true;

transfer.CopyAllObjects = false;
transfer.CopySchema = true;

//include data
transfer.CopyData = true;

foreach (Table t in sourceDatabase.Tables)
{
    //include only the tables named in the SSIS object variable, skipping system objects
    if (tableNames.Contains(t.Name) && !t.IsSystemObject)
    {
        transfer.ObjectList.Add(t);
    }
}
//transfer objects
transfer.TransferData();

How can I copy data records between two instances of a SQL Server database

Here's how I did it. Thanks to the other respondents for the inspiration. The code that builds the mappings is not necessary if the schemas of the two tables are identical.

public void CopyTables(string sourceConnectionString, string destConnectionString)
{
    string sql = "Select * From SourceTable";
    using (SqlConnection sourceConn = new SqlConnection(sourceConnectionString))
    using (SqlCommand sourceCmd = new SqlCommand(sql, sourceConn)) {
        sourceConn.Open();

        using (SqlDataReader reader = sourceCmd.ExecuteReader())
        using (SqlBulkCopy copier = new SqlBulkCopy(destConnectionString)) {
            copier.DestinationTableName = "DestinationTable";
            copier.BulkCopyTimeout = 300;

            // map every source column to the destination column of the same name
            DataTable schema = reader.GetSchemaTable();
            copier.ColumnMappings.Clear();
            foreach (DataRow row in schema.Rows) {
                copier.ColumnMappings.Add(row["ColumnName"].ToString(), row["ColumnName"].ToString());
            }

            copier.WriteToServer(reader);
        }
    }
}

What is the fastest way to transfer view data between two Azure SQL Databases?

Did you think about using Data Factory?

It offers fast performance when transferring big data. Reference: Copy performance and scalability achievable using ADF.

You can learn from this tutorial: Copy and transform data in Azure SQL Database by using Azure Data Factory.

This article outlines how to use Copy Activity in Azure Data Factory to copy data from and to Azure SQL Database, and use Data Flow to transform data in Azure SQL Database.

Data Factory has many components that can help you copy data from many views to another Azure SQL database in a single pipeline.

Hope this helps.


