SQL Server Trigger

MS SQL Trigger for ETL vs Performance

First of all, I would suggest encoding the table name as a small numeric value rather than a character string (30 tables fit comfortably in a tinyint).

Second, you need to understand how big the payload you are going to write is, and how it will be written:

  1. If you choose a correct clustered index (the date column), the server just needs to append data row by row in sequence. That is a trivially easy job, even if you insert all 200k rows at once.

  2. If you encode the table name as a tinyint, then it basically has to write:

    • 1 byte (table name) + PK size (hopefully numeric, so <= 8 bytes) + 8 bytes datetime, so approximately 17 bytes on the data page, plus indexes if any, plus the log file. This is very lightweight and again will put no "real" pressure on SQL Server.
  3. The trigger itself adds a small overhead, but with the number of rows you are talking about, it is negligible.

I've seen systems do similar things on a far larger scale with close to zero effect on the workload, so I would say it's a safe bet. The only problem with this approach is that it will not work in some cases (e.g., output to temp tables from DML statements). But if you do not have those kinds of blockers, go for it.
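As a sketch of the narrow log table described above (the table name ChangeLog and its columns here are illustrative assumptions, not from the original question):

```sql
-- A minimal sketch of the ~17-bytes-per-row audit table.
-- Table and column names are assumptions for illustration.
CREATE TABLE dbo.ChangeLog
(
    TableId   tinyint  NOT NULL,  -- 1 byte: maps to one of the ~30 source tables
    RowPK     bigint   NOT NULL,  -- <= 8 bytes: the changed row's numeric PK
    ChangedAt datetime NOT NULL   -- 8 bytes
        CONSTRAINT DF_ChangeLog_ChangedAt DEFAULT (GETDATE())
);

-- Clustering on the date column means new rows land at the end of the
-- clustered index, so inserts become sequential appends.
CREATE CLUSTERED INDEX CX_ChangeLog_ChangedAt
    ON dbo.ChangeLog (ChangedAt);
```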

I hope it helps.

Trigger - is it necessary BEGIN / COMMIT TRAN

All DML statements are executed within a transaction. The DML within the trigger will use the transaction context of the statement that fired the trigger so all modifications, inside the trigger and out, are a single atomic operation.
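A short sketch of what that atomicity means in practice (table and trigger names are made up for illustration):

```sql
-- The trigger's DML shares the transaction of the statement that fired it,
-- so no explicit BEGIN TRAN / COMMIT TRAN is needed inside the trigger.
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER INSERT
AS
BEGIN
    -- Runs in the same transaction as the INSERT that fired the trigger.
    INSERT INTO dbo.OrdersAudit (OrderID, AuditedAt)
    SELECT OrderID, GETDATE()
    FROM inserted;

    -- If this ROLLBACK ran, it would undo the audit row AND the original
    -- INSERT together: one atomic operation.
    -- ROLLBACK TRANSACTION;
END
```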

SQL Server: Trigger vs Database Trigger

The difference is that a database trigger responds to DDL commands (CREATE/ALTER/DROP), i.e. the object-definition language.
A table trigger responds to DML commands (INSERT/UPDATE/DELETE), i.e. the data-manipulation language.
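A minimal sketch of each kind, side by side (all names here are illustrative assumptions):

```sql
-- Database-scoped DDL trigger: fires on schema changes.
CREATE TRIGGER trg_BlockTableDrops
ON DATABASE
FOR DROP_TABLE
AS
BEGIN
    PRINT 'DROP TABLE is not allowed here.';
    ROLLBACK;  -- undoes the attempted DROP
END
GO

-- Table-scoped DML trigger: fires on data changes.
CREATE TRIGGER trg_Customers_Stamp
ON dbo.Customers
AFTER UPDATE
AS
BEGIN
    UPDATE c
    SET c.ModifiedAt = GETDATE()
    FROM dbo.Customers c
    JOIN inserted i ON i.CustomerID = c.CustomerID;
END
GO
```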

SQL Server trigger: Delete From Table AFTER DELETE

Try the code below:

CREATE TRIGGER delCarFromA ON cars
FOR DELETE
AS
    -- "deleted" holds the rows removed by the statement that fired the trigger
    DELETE FROM dbo.announc
    WHERE aid IN (SELECT deleted.aid FROM deleted)
GO

SQL Server: log database changes through generic trigger

Some errors with your current code include:

  • An error when 0 rows are affected: both inserted and deleted are then empty, and because you did not handle a NULL command, attempting to insert a NULL command into ChangeLog raises an error.

  • Your cursor would string all affected rows together in a strange fashion; even if you got it to work, if 4 rows were affected you would end up with 1 row in ChangeLog whose column_old_values held something like (col1, col1, col1, col1, col2, col2, col2, col2).

  • Your cursor needs dynamic SQL to use dynamic column names, but dynamic SQL runs in a different scope from your trigger code, so you would have to copy the trigger-scoped inserted and deleted tables into real tables before the dynamic SQL could see them.

  • Your dynamic SQL is trying to use variables that don't exist in that other scope. It's much easier to debug dynamic SQL if you build it in a string and PRINT the string for review before attempting to EXEC it.
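A sketch of those last two fixes together, as they would appear inside the trigger body (the column name is a hypothetical placeholder):

```sql
-- 1) Copy the trigger-scoped pseudo-tables into temp tables; temp tables
--    created here remain visible inside the dynamic SQL's inner scope.
SELECT * INTO #inserted FROM inserted;
SELECT * INTO #deleted  FROM deleted;

-- 2) Build the dynamic SQL in a string and print it before executing.
DECLARE @column_name sysname = N'SomeColumn';  -- placeholder column name
DECLARE @sql nvarchar(max) =
    N'SELECT ' + QUOTENAME(@column_name) + N' FROM #inserted;';

PRINT @sql;                 -- review the generated statement first
-- EXEC sp_executesql @sql; -- then run it once it looks right
```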

EDIT:

How about this option, which doesn't rely on knowing the columns but does rely on knowing each table's PK beforehand? PKs shouldn't change nearly as often as other columns, and the performance of this approach is vastly superior to the cursor you were trying to use. This is a sample of one I implemented on a table we weren't sure was still being used by one of our few dozen users, and I needed to track it for a year.

-- Create trigger for table to log changes
ALTER TRIGGER AUDIT_MyTableName
ON bookings
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Grab trx type
    DECLARE @command char(6)
    SET @command =
        CASE
            WHEN EXISTS (SELECT 1 FROM inserted) AND EXISTS (SELECT 1 FROM deleted) THEN 'UPDATE'
            WHEN EXISTS (SELECT 1 FROM inserted) THEN 'INSERT'
            WHEN EXISTS (SELECT 1 FROM deleted) THEN 'DELETE'
            ELSE '0 ROWS' -- if no rows affected, trigger does NOT record an entry
        END

    IF @command = 'INSERT'

        -- Add audit entry
        INSERT INTO ChangeLog (COMMAND, CHANGED_DATE, TABLE_NAME, /*COLUMN_NAMES,*/ COLUMN_OLD_VALUES, COLUMN_NEW_VALUES, USERNAME)
        SELECT
            Command           = @command,
            ChangeDate        = GETDATE(),
            TableName         = 'bookings',
            --ColNames        = @column_names,
            Column_OLD_Values = NULL,
            Column_NEW_Values = (SELECT inserted.* FOR XML PATH('')),
            Username          = SUSER_SNAME()
        FROM inserted

    ELSE IF @command = 'DELETE'

        -- Add audit entry
        INSERT INTO ChangeLog (COMMAND, CHANGED_DATE, TABLE_NAME, /*COLUMN_NAMES,*/ COLUMN_OLD_VALUES, COLUMN_NEW_VALUES, USERNAME)
        SELECT
            Command           = @command,
            ChangeDate        = GETDATE(),
            TableName         = 'bookings',
            --ColNames        = @column_names,
            Column_OLD_Values = (SELECT deleted.* FOR XML PATH('')),
            Column_NEW_Values = NULL,
            Username          = SUSER_SNAME()
        FROM deleted

    ELSE -- is UPDATE

        -- Add audit entry
        INSERT INTO ChangeLog (COMMAND, CHANGED_DATE, TABLE_NAME, /*COLUMN_NAMES,*/ COLUMN_OLD_VALUES, COLUMN_NEW_VALUES, USERNAME)
        SELECT
            Command           = @command,
            ChangeDate        = GETDATE(),
            TableName         = 'bookings',
            --ColNames        = @column_names,
            Column_OLD_Values = (SELECT deleted.* FOR XML PATH('')),
            Column_NEW_Values = (SELECT inserted.* FOR XML PATH('')),
            Username          = SUSER_SNAME()
        FROM inserted
        INNER JOIN deleted ON inserted.bookingID = deleted.bookingID -- join on whatever the PK is
END

The result is perfectly functional for whatever you need:

(screenshot: resulting ChangeLog rows viewed in SSMS)

If you're willing to change the column types for COLUMN_OLD_VALUES and COLUMN_NEW_VALUES to XML, you can simply add , type right after each for xml path('') and the XML is click-able and easy to read in SSMS.

Column_OLD_Values = (SELECT deleted.* FOR XML PATH(''), TYPE),
Column_NEW_Values = (SELECT inserted.* FOR XML PATH(''), TYPE),

SQL Server trigger execution in batch updating

Triggers in SQL Server always execute once per triggering statement - there is no "for each row" (row-level) trigger option in SQL Server.

When you mass-update your table, the trigger receives all the updated rows at once in the inserted and deleted pseudo-tables and needs to deal with them accordingly - as a set of data, not a single row.
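A sketch of what "deal with them as a set" looks like (table and column names are illustrative assumptions):

```sql
-- Set-based trigger: correctly handles every row of a mass update at once.
CREATE TRIGGER trg_Stock_Audit ON dbo.Stock
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- One INSERT covers all affected rows, whether 1 or 200,000.
    INSERT INTO dbo.StockAudit (StockID, OldQty, NewQty, ChangedAt)
    SELECT d.StockID, d.Qty, i.Qty, GETDATE()
    FROM inserted i
    JOIN deleted d ON d.StockID = i.StockID;  -- join on the PK

    -- Anti-pattern to avoid: "SELECT @id = StockID FROM inserted" silently
    -- captures only one arbitrary row when many rows are updated.
END
```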


