How to Copy Huge Table Data into Another Table in SQL Server

How to copy huge table data into another table in SQL Server

If you are copying into a new table, the quickest way is probably the SELECT INTO statement you have in your question, unless your rows are very large.
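
For reference, a minimal sketch of that approach with hypothetical table names; SELECT ... INTO creates the destination table as part of the statement and can be minimally logged under the simple or bulk-logged recovery model:

-- Hypothetical names; creates dbo.NewTable from scratch as it copies
SELECT *
INTO dbo.NewTable
FROM dbo.OldTable;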

If your rows are very large, you may want to use the bulk-insert facilities in SQL Server; from C# they are exposed through the SqlBulkCopy class.

Or you can first export that data to a text file, then bulk-copy (bcp) it into the destination. This has the additional benefit of letting you ignore keys, indexes, etc.
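
A hedged sketch of that round trip, assuming hypothetical server, database, and table names and a trusted (Windows) connection:

rem Export in native format (-n), then re-import in batches of 50,000 rows (-b)
bcp MyDb.dbo.SourceTable out C:\exports\SourceTable.dat -S MyServer -T -n
bcp MyDb.dbo.TargetTable in C:\exports\SourceTable.dat -S MyServer -T -n -b 50000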

Also try the Import/Export wizard that comes with SQL Server Management Studio. I'm not sure whether it will be as fast as a straight bulk copy, but it allows you to skip the intermediate step of writing out a flat file and copy directly table to table, which might be a bit faster than your SELECT INTO statement.

How to copy a large number of rows from one table to another in the same database?

First of all, disable the index on TableB before inserting the rows. You can do it using T-SQL:

ALTER INDEX IX_Index_Name ON dbo.TableB DISABLE;  

Make sure to disable all the constraints (foreign keys, check constraints, unique indexes) on your destination table.

Re-enable (and rebuild) them after the load is complete.
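
A sketch of that disable/re-enable round trip, reusing the dbo.TableB name from above (note: only disable nonclustered indexes; disabling the clustered index makes the table unreadable):

ALTER TABLE dbo.TableB NOCHECK CONSTRAINT ALL;   -- stop checking FKs and CHECK constraints

-- ... perform the bulk load here ...

ALTER TABLE dbo.TableB WITH CHECK CHECK CONSTRAINT ALL;   -- re-enable and re-validate
ALTER INDEX ALL ON dbo.TableB REBUILD;                    -- rebuild any disabled indexes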

Now, there are a couple of approaches to solve the problem:

  1. You have to be OK with a slight chance of data loss: use the INSERT INTO ... SELECT ... FROM ... syntax you have, but switch your database to the bulk-logged recovery model first (read up on the implications before switching; see the recovery-model sketch after this list). This won't help if you're already in bulk-logged or simple recovery.
  2. With exporting the data first: you can use the BCP utility to export and re-import the data; it supports loading data in batches (see the bcp example earlier in this article).
  3. Fancy, with exporting the data first: with SQL Server 2012+ you can export the data into a binary file (using the BCP utility) and load it with the BULK INSERT statement, setting the ROWS_PER_BATCH option (see the BULK INSERT sketch after this list).
  4. Old-school "I don't give a damn" method: to prevent the log from
    filling up, perform the inserts in batches of rows rather than
    everything at once. If your database is running in the full
    recovery model, keep log backups running, and consider increasing
    the frequency of that job.

    To batch-load your rows you will need a WHILE loop (don't use them
    in day-to-day code, only for batch loads); something like the
    following will work if you have a numeric identifier column in the
    dbo.TableA table:

    DECLARE @RowsToLoad BIGINT;
    DECLARE @RowsPerBatch INT = 5000;
    DECLARE @LeftBoundary BIGINT = 0;
    DECLARE @RightBoundary BIGINT = @RowsPerBatch;

    -- The highest identifier value determines when the loop can stop.
    SELECT @RowsToLoad = MAX(IdentifierColumn) FROM dbo.TableA;

    WHILE @LeftBoundary < @RowsToLoad
    BEGIN
        -- Copy one window of rows per iteration so each transaction
        -- stays small and the log can be reused between batches.
        INSERT INTO dbo.TableB (Column1, Column2)
        SELECT
            tA.Column1,
            tA.Column2
        FROM
            dbo.TableA AS tA
        WHERE
            tA.IdentifierColumn > @LeftBoundary
            AND tA.IdentifierColumn <= @RightBoundary;

        SET @LeftBoundary = @LeftBoundary + @RowsPerBatch;
        SET @RightBoundary = @RightBoundary + @RowsPerBatch;
    END

    For this to work effectively you really want to consider creating an
    index on dbo.TableA (IdentifierColumn) just for the time you're
    running the load.
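
For option 1, a minimal recovery-model sketch, assuming a hypothetical database name; bulk-logged lets bulk operations be minimally logged, which is why point-in-time recovery is affected:

ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED;

-- ... run the INSERT INTO ... SELECT ... load here ...

ALTER DATABASE MyDatabase SET RECOVERY FULL;
-- Take a log backup right after switching back so point-in-time
-- recovery is available again.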
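
For option 3, a hedged BULK INSERT sketch with a hypothetical file path; ROWS_PER_BATCH is a hint about the total number of rows in the file that helps the server plan the load:

BULK INSERT dbo.TableB
FROM 'C:\exports\TableA.dat'
WITH (
    DATAFILETYPE = 'native',    -- the file was exported with bcp -n
    ROWS_PER_BATCH = 50000      -- hypothetical estimate of total rows in the file
);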

Best method to move a large SQL Server table from one database to another?

If you are on SQL Server, I would tend to use the Import/Export Wizard as the quick and easy method. It fails gracefully if there are issues.

1) Create the table in the destination database

2) Right click on the destination database, then choose Tasks -> Import Data

3) Connect to the Source server when prompted and then keep following the prompts

Hope it helps

Bulk insert in SQL Server database from one table to another

BULK INSERT imports from an external data file. If you already have the data in a SQL Server table, then you should do something like:

INSERT INTO NewTable (field1, field2, field3)
SELECT field1, field2, field3 FROM OldTable

DO NOT point BULK INSERT at your SQL Server database file. The .tbl file referenced in your example code refers to a text file with delimited fields.

Copy data from a large table on a test server to a table on production server in SSIS

There are many approaches to transferring data between two servers, and SSIS is not always the preferred one. Note that 50 million rows is not always considered a large data set; it depends on the server's resources, the column data types, and other factors.

The simplest way to import/export data is to use the SSMS Import/Export wizard. Another approach is to use BCP, as @Nick.McDermaid mentioned in the comments.

If you have limited physical resources, and you need to do this using SSIS, you can try loading data in batches as explained in the following article:

  • SQL OFFSET FETCH Feature: Loading Large Volumes of Data Using Limited Resources with SSIS
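
For reference, a minimal sketch (with hypothetical table and column names) of the paged source query that approach is built on; each iteration advances @Offset by @PageSize until no more rows come back:

DECLARE @PageSize INT = 100000;
DECLARE @Offset BIGINT = 0;

SELECT Column1, Column2
FROM dbo.SourceTable
ORDER BY IdentifierColumn    -- OFFSET/FETCH requires an ORDER BY
OFFSET @Offset ROWS
FETCH NEXT @PageSize ROWS ONLY;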

