Create Trigger to Log SQL That Affected Table

My Solution

I added a trigger on the table in question that logged information I narrowed down via timestamps from sys.dm_exec_sql_text and sys.dm_exec_query_stats. This quickly nailed down what I was looking for. It turned out there were a few triggers I didn't know about that were updating data after an UPDATE.

SELECT
    qStats.last_execution_time AS [ExecutedAt],
    qTxt.[text] AS [Query],
    qTxt.number
FROM
    sys.dm_exec_query_stats AS qStats
CROSS APPLY
    sys.dm_exec_sql_text(qStats.sql_handle) AS qTxt
WHERE
    qTxt.[dbid] = @DbId
    AND qTxt.[text] LIKE '%UPDATE%'
    AND qStats.last_execution_time BETWEEN @StartExecutionSearchTime AND @EndExecutionSearchTime
ORDER BY
    qStats.last_execution_time DESC
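For reference, the placeholder variables might be set up like this (DB_ID() with no argument returns the current database's id; the one-hour window is just an example):

```sql
-- Example setup for the placeholders used in the query above
DECLARE @DbId int = DB_ID();  -- current database
DECLARE @StartExecutionSearchTime datetime = DATEADD(HOUR, -1, GETDATE());
DECLARE @EndExecutionSearchTime   datetime = GETDATE();
```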

Create daily log table using triggers

You can do it via a trigger, but you cannot recreate the stage table each time, because every time you recreate it (with SELECT ... INTO) you lose the trigger. Try this pattern:

create table t21  (i1 int)                  -- source table
create table t21s (i1 int)                  -- stage table
create table t2log (i1 int, when1 datetime) -- log table
go

create trigger t_t21s on t21s after insert
as
set nocount on
insert into t2log (i1, when1)
select inserted.i1, getdate()
from inserted;
go

insert into t21 values (5)

-- every day (or whenever you want to refill the staging table):
truncate table t21s
insert into t21s (i1)   -- fills the stage table without destroying the trigger
select i1 from t21

select * from t21s  -- see what is in stage

select * from t2log -- see what is in log

SQL Server: log database changes through generic trigger

Some errors with your current code include:

  • An error when 0 rows are affected: the inserted and deleted tables are both empty, your code does not handle the resulting NULL command, and inserting a NULL command into ChangeLog fails.

  • Your cursor would string all affected rows together in a strange fashion: even if you got it to work, if 4 rows were affected you would end up with 1 row in ChangeLog whose column_old_values held something like (col1, col1, col1, col1, col2, col2, col2, col2).

  • Your cursor would need dynamic SQL to use dynamic column names, but dynamic SQL is in a different scope compared to your code, so you need to make a copy of the inserted and deleted trigger-scope tables to use dynamic SQL.

  • Your dynamic SQL is trying to use variables that don't exist in the different scope. It's a lot easier to debug dynamic SQL if you put it into a string, then print the string for review before attempting to EXEC it.
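The last point is worth a small pattern of its own; a minimal sketch (dbo.SomeTable is a made-up name):

```sql
-- Build the dynamic statement into a variable first...
DECLARE @sql nvarchar(max) =
    N'SELECT COUNT(*) FROM dbo.SomeTable;';  -- dbo.SomeTable is hypothetical

-- ...print it for review...
PRINT @sql;

-- ...and only EXEC it once the printed text looks right:
-- EXEC sp_executesql @sql;
```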

EDIT:

How about this option? It doesn't rely on knowing the columns, but it does rely on knowing the table's PK beforehand. The PK shouldn't change nearly as often as the other columns, and the performance is vastly superior to what you were trying to do. This is a sample from one I implemented on a table we weren't sure was still being used by one of our few dozen users, and I needed to track it over a year.

-- Create trigger for the table to log changes
ALTER TRIGGER AUDIT_MyTableName
ON bookings
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Grab trx type
    DECLARE @command char(6)
    SET @command =
        CASE
            WHEN EXISTS (SELECT 1 FROM inserted) AND EXISTS (SELECT 1 FROM deleted) THEN 'UPDATE'
            WHEN EXISTS (SELECT 1 FROM inserted) THEN 'INSERT'
            WHEN EXISTS (SELECT 1 FROM deleted) THEN 'DELETE'
            ELSE '0 ROWS' -- if no rows affected, trigger does NOT record an entry
        END

    IF @command = 'INSERT'
        -- Add audit entry
        INSERT INTO ChangeLog (COMMAND, CHANGED_DATE, TABLE_NAME, /*COLUMN_NAMES,*/ COLUMN_OLD_VALUES, COLUMN_NEW_VALUES, USERNAME)
        SELECT
            Command           = @command,
            ChangeDate        = GETDATE(),
            TableName         = 'bookings',
            --ColNames        = @column_names,
            Column_OLD_Values = NULL,
            Column_NEW_Values = (SELECT inserted.* FOR XML PATH('')),
            Username          = SUSER_SNAME()
        FROM inserted

    ELSE IF @command = 'DELETE'
        -- Add audit entry
        INSERT INTO ChangeLog (COMMAND, CHANGED_DATE, TABLE_NAME, /*COLUMN_NAMES,*/ COLUMN_OLD_VALUES, COLUMN_NEW_VALUES, USERNAME)
        SELECT
            Command           = @command,
            ChangeDate        = GETDATE(),
            TableName         = 'bookings',
            --ColNames        = @column_names,
            Column_OLD_Values = (SELECT deleted.* FOR XML PATH('')),
            Column_NEW_Values = NULL,
            Username          = SUSER_SNAME()
        FROM deleted

    ELSE -- is UPDATE
        -- Add audit entry
        INSERT INTO ChangeLog (COMMAND, CHANGED_DATE, TABLE_NAME, /*COLUMN_NAMES,*/ COLUMN_OLD_VALUES, COLUMN_NEW_VALUES, USERNAME)
        SELECT
            Command           = @command,
            ChangeDate        = GETDATE(),
            TableName         = 'bookings',
            --ColNames        = @column_names,
            Column_OLD_Values = (SELECT deleted.* FOR XML PATH('')),
            Column_NEW_Values = (SELECT inserted.* FOR XML PATH('')),
            Username          = SUSER_SNAME()
        FROM inserted
        INNER JOIN deleted ON inserted.bookingID = deleted.bookingID -- join on whatever the PK is
END

The result is perfectly functional for whatever you need:

(Screenshot: the resulting ChangeLog rows shown in SSMS.)

If you're willing to change the column types of COLUMN_OLD_VALUES and COLUMN_NEW_VALUES to XML, you can simply add ", type" right after each "for xml path('')", and the XML becomes clickable and easy to read in SSMS.

Column_OLD_Values   = (SELECT deleted.* for xml path(''), type), 
Column_NEW_Values = (SELECT inserted.* for xml path(''), type),

Trigger to log inserted/updated/deleted values SQL Server 2012

Just posting this because it's what solved my problem. As user @SeanLange said in the comments to my post, the answer was to use an "audit", which I didn't know existed.

Googling it led me to this Stack Overflow answer, where the first link is a procedure that creates triggers and "shadow" tables doing roughly what I needed (it doesn't merge all values into one column, but it does the job).

Log changes to database table with trigger

Triggers are bad; I'd stay away from them.

If you are trying to troubleshoot something, attach SQL Server Profiler to the database with specific filter conditions. It will log every query that runs, for your inspection.

Another option is to change the calling program to log its queries. This is a very common practice.

How to log changed values in Update trigger

Here's one way using COLUMNS_UPDATED. The trigger does not depend on column names, so you can add or remove columns without a problem. I have added some comments in the query.

create trigger audit on Items
after update
as
begin
    set nocount on;

    create table #updatedCols (Id int identity(1, 1), updateCol nvarchar(200))

    -- find all columns that were updated and write them to a temp table
    insert into #updatedCols (updateCol)
    select
        column_name
    from
        information_schema.columns
    where
        table_name = 'Items'
        and convert(varbinary, reverse(columns_updated())) & power(convert(bigint, 2), ordinal_position - 1) > 0

    -- temp tables are used because the inserted and deleted tables are not available in dynamic SQL
    select * into #tempInserted from inserted
    select * into #tempDeleted from deleted

    declare @cnt int = 1
    declare @rowCnt int
    declare @columnName varchar(1000)
    declare @sql nvarchar(4000)

    select @rowCnt = count(*) from #updatedCols

    -- execute an insert statement for each updated column
    while @cnt <= @rowCnt
    begin
        select @columnName = updateCol from #updatedCols where id = @cnt

        -- note the explicit nvarchar(max) in the casts below:
        -- a bare "cast(... as varchar)" would silently truncate values to 30 characters
        set @sql = N'
            insert into [Events] ([RecordId], [EventTypeId], [EventDate], [ColumnName], [OriginalValue], [NewValue], [TenantId], [AppUserId], [TableName])
            select
                i.Id, 2, GetUTCDate(), ''' + @columnName + ''', d.' + @columnName + ', i.' + @columnName + ', i.TenantId, i.UpdatedBy, ''Item''
            from
                #tempInserted i
                join #tempDeleted d on i.Id = d.Id
                    and isnull(cast(i.' + @columnName + ' as nvarchar(max)), '''') <> isnull(cast(d.' + @columnName + ' as nvarchar(max)), '''')
        '
        exec sp_executesql @sql
        set @cnt = @cnt + 1
    end
end

I have changed the data type of the TableName column of the Events table to nvarchar.
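For anyone puzzled by the bitmask line in the trigger: COLUMNS_UPDATED() returns one bit per column in ordinal order, with the first byte covering columns 1-8, which is why it is REVERSEd before the bigint conversion. A standalone sketch of the same test (the literal 0x05 is a made-up mask meaning "columns 1 and 3 were updated"):

```sql
-- Sketch: the same bit test outside a trigger; 0x05 = binary 101,
-- i.e. bits 1 and 3 are set, so columns 1 and 3 report as updated
DECLARE @mask varbinary(8) = 0x05;

SELECT n AS ordinal_position,
       CASE WHEN CONVERT(bigint, @mask) & POWER(CONVERT(bigint, 2), n - 1) > 0
            THEN 1 ELSE 0 END AS was_updated
FROM (VALUES (1), (2), (3), (4)) AS cols(n);
-- ordinal positions 1 and 3 come back with was_updated = 1
```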


