My values in money column are changed when bulk insert is used
I would recommend using decimal/numeric
instead of money,
as money is not precise. You can check Should you choose the MONEY or DECIMAL(x,y) datatypes in SQL Server?
Do note that the MONEY
data type has rounding errors. Performance-wise, decimal is also better. Check the blog post by Aaron Bertrand: Performance / Storage Comparisons : MONEY vs. DECIMAL
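The precision loss is easy to demonstrate. MONEY stores at most four decimal places, so anything finer is rounded on assignment (values here are purely illustrative):

```sql
DECLARE @m money         = 1.23456789,
        @d decimal(19,8) = 1.23456789;

-- @m is rounded to 1.2346; @d keeps the full 1.23456789.
SELECT @m AS money_value, @d AS decimal_value;
```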
Change Datatype of Column in SQL Server
Well, the main issue seems to be the large amount of data saved in the table. Both of your attempts look fine, but I must say they will definitely take time given the data volume.
Each time you alter a column's data type, SQL Server tries to convert the existing data into the target type. Running that conversion on a large amount of data can cause long execution times.
Also, I wonder: do you have any triggers on the table?
Finally, I would suggest the following steps. Give them a try at least:
- Remove any primary keys/indexes/constraints pointing to the old column, and disable any triggers (if there are any).
- Add a new nullable column with the new data type (even if it is meant to be NOT NULL) to the table.
- Run an update query on the table that sets the new column's value to the old column's value. You can update in chunks, e.g. batches of 1,000 or 100,000 records, and you can apply conditions to the query for better results.
- Once every row has the new column populated, change the column from NULL to NOT NULL in the designer (if it is meant to be NOT NULL).
- Drop the old column, then run a SELECT query and verify your changes.
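The steps above can be sketched in T-SQL. The table and column names (dbo.Payments, Amount) are hypothetical placeholders:

```sql
-- 1. Drop/disable constraints, indexes and triggers on the old column first.

-- 2. Add a new nullable column with the target type.
ALTER TABLE dbo.Payments ADD Amount_new decimal(19,4) NULL;

-- 3. Copy the data over in batches to limit log growth and locking.
WHILE 1 = 1
BEGIN
    UPDATE TOP (100000) dbo.Payments
    SET Amount_new = Amount
    WHERE Amount_new IS NULL AND Amount IS NOT NULL;
    IF @@ROWCOUNT = 0 BREAK;
END;

-- 4. Make the column NOT NULL once every row is populated (if required).
ALTER TABLE dbo.Payments ALTER COLUMN Amount_new decimal(19,4) NOT NULL;

-- 5. Drop the old column and rename the new one into its place.
ALTER TABLE dbo.Payments DROP COLUMN Amount;
EXEC sp_rename 'dbo.Payments.Amount_new', 'Amount', 'COLUMN';
```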
One last point: your database transaction log is also full. It can be shrunk, but with some precautions; there is a very good example of how to reset your transaction log, which is worth a look too.
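A minimal sketch of shrinking the log, assuming a database named MyDb with a log file named MyDb_log (placeholders) and the FULL recovery model, where the log must be backed up before it can be truncated:

```sql
-- Back the log up first so the inactive portion can be reused.
BACKUP LOG MyDb TO DISK = 'C:\Backups\MyDb_log.trn';

USE MyDb;
DBCC SHRINKFILE (MyDb_log, 1024);  -- shrink the log file to ~1 GB
```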
Hope This Helps. :)
How to bulk insert geography into a new sql server 2008 table
Well, after several days of headaches I have come to the conclusion that there is no direct answer; not even the mighty ESRI has any clue. Thankfully, I did come up with a different solution. In my table definition I created an NVARCHAR(MAX) column to hold the WKT (well-known text) of my geography and added that WKT to my CSV file; then, after the bulk insert, I run a table-wide update statement to convert the WKT to the actual geography type. Also, adjust the CSV file to use a separator other than a comma, because the WKT itself contains commas.
try
{
    // Build a sample polygon geography (SRID 4326).
    SqlGeographyBuilder sql_geography_builder = new SqlGeographyBuilder();
    sql_geography_builder.SetSrid(4326);
    sql_geography_builder.BeginGeography(OpenGisGeographyType.Polygon);
    sql_geography_builder.BeginFigure(-84.576064, 39.414853);
    sql_geography_builder.AddLine(-84.576496, 39.414800);
    sql_geography_builder.AddLine(-84.576522, 39.414932);
    sql_geography_builder.AddLine(-84.576528, 39.414964);
    sql_geography_builder.AddLine(-84.576095, 39.415015);
    sql_geography_builder.AddLine(-84.576064, 39.414853); // close the ring
    sql_geography_builder.EndFigure();
    sql_geography_builder.EndGeography();
    SqlGeography sql_geography = sql_geography_builder.ConstructedGeography;

    // Write the CSV; '?' is the field terminator because the WKT contains commas.
    using (StreamWriter stream_writer = new StreamWriter("C:\\PROJECTS\\AMP\\test.csv"))
    {
        stream_writer.WriteLine("1?123 TEST AVE?" + sql_geography.ToString() + "?");
        stream_writer.WriteLine("2?456 TEST AVE?" + sql_geography.ToString() + "?");
        stream_writer.WriteLine("9?789 TEST AVE?" + sql_geography.ToString() + "?");
    }

    // Bulk insert the CSV, then convert the WKT column into the real geography column.
    using (SqlConnection sql_connection = new SqlConnection(STRING_SQL_CONNECTION))
    using (SqlCommand sql_command = new SqlCommand())
    {
        sql_connection.Open();
        sql_command.Connection = sql_connection;
        sql_command.CommandTimeout = 0;
        sql_command.CommandType = CommandType.Text;
        sql_command.CommandText = "BULK INSERT [SPATIAL_TEST].[dbo].[Table_1] " +
            "FROM 'C:\\PROJECTS\\AMP\\test.csv' " +
            "WITH (FIELDTERMINATOR = '?', ROWTERMINATOR = '\n'); " +
            "UPDATE [SPATIAL_TEST].[dbo].[Table_1] " +
            "SET [geo] = geography::STPolyFromText([geo_string], 4326);";
        sql_command.ExecuteNonQuery();
    }

    MessageBox.Show("DONE");
}
catch (Exception ex)
{ MessageBox.Show(ex.Message); }
SQL Server 2008: BULK INSERT csv - is it possible to choose fields?
Alternatively, you can bulk insert into a view.
See also this question.
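A minimal sketch with hypothetical names: create a view exposing only the columns that actually appear in the file, and BULK INSERT into the view; the columns the view omits fall back to their defaults.

```sql
-- Hypothetical table; the CSV has only Id and Name.
CREATE TABLE dbo.Person
(
    Id        int,
    Name      varchar(50),
    CreatedAt datetime NOT NULL DEFAULT GETDATE()  -- not in the file
);
GO
CREATE VIEW dbo.Person_Import AS
SELECT Id, Name FROM dbo.Person;
GO
BULK INSERT dbo.Person_Import
FROM 'C:\data\people.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```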
Change huge table PK column data type
In addition to KLE's suggestion, the following queries might help:
To disable all constraints on the tables that reference oldTable, execute the output of the following query:
SELECT 'ALTER TABLE ' + OBJECT_NAME(fk.parent_object_id) + ' NOCHECK CONSTRAINT ' + fk.name
FROM sys.foreign_keys fk
INNER JOIN sys.foreign_key_columns AS fkc ON fk.OBJECT_ID = fkc.constraint_object_id
WHERE OBJECT_NAME (fk.referenced_object_id) = 'oldTable'
To move all the data into the new table, converting the field along the way, try this:
INSERT INTO newTable
SELECT CONVERT(BIGINT, ID) AS ID, COL1, COL2, ..., COLN
FROM oldTable
To drop the old table:
DROP TABLE oldTable
To rename the new table to the old name:
EXEC sp_rename 'newTable', 'oldTable';
To re-enable all the constraints on the tables that reference oldTable, execute the output of the following query:
SELECT 'ALTER TABLE ' + OBJECT_NAME(fk.parent_object_id) + ' CHECK CONSTRAINT ' + fk.name
FROM sys.foreign_keys fk
INNER JOIN sys.foreign_key_columns AS fkc ON fk.OBJECT_ID = fkc.constraint_object_id
WHERE OBJECT_NAME (fk.referenced_object_id) = 'oldTable'
Hope it helps...
Method to bulk modify triggers in SQL Server database
OK, I just tested this by jamming your string into a few triggers (as a comment, of course) and then running it. I am not advocating this as the correct way to do it; this link covers the correct way to do dynamic SQL: https://dba.stackexchange.com/questions/165149/exec-vs-sp-executesql-performance
However, this does work, and it will help you understand how you would piece these things together to get to that point.
Note: any formatting differences between your triggers may cause this to miss some, so you'll want to verify that on your own.
DECLARE @string VARCHAR(8000) = 'select @HostName = HostName, @ProgramName = Program_Name
from master..sysprocesses where SPID = @@SPID'
    , @Counter INT = 1
    , @Max INT
    , @Sql VARCHAR(MAX)
;
IF OBJECT_ID('TempDB..#TrigUpdate') IS NOT NULL DROP TABLE #TrigUpdate;
CREATE TABLE #TrigUpdate
(
SqlVar VARCHAR(MAX)
, RowID INT
)
;
INSERT INTO #TrigUpdate
SELECT REPLACE(REPLACE(t.definition, @string, ''), 'CREATE TRIGGER', 'ALTER TRIGGER')
, Row_Number() OVER (ORDER BY t.Definition ASC) AS RowID
FROM sys.objects o
INNER JOIN sys.sql_modules t on o.object_id =t.object_id
WHERE o.type_desc='SQL_TRIGGER'
AND CHARINDEX(@string, t.definition,1)>0
;
SET @Max = (SELECT COUNT(*) FROM #TrigUpdate);
WHILE @Counter<=@Max
BEGIN
SET @Sql = (SELECT SqlVar FROM #TrigUpdate WHERE RowID = @Counter);
EXEC(@Sql);
SET @Counter=@Counter+1;
END