How to log in T-SQL
I solved this by writing a SQLCLR-procedure as Eric Z Beard suggested. The assembly must be signed with a strong name key file.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static int Debug(string s)
    {
        System.Diagnostics.Debug.WriteLine(s);
        return 0;
    }
}
Created a key and a login:
USE [master]
CREATE ASYMMETRIC KEY DebugProcKey FROM EXECUTABLE FILE =
'C:\..\SqlServerProject1\bin\Debug\SqlServerProject1.dll'
CREATE LOGIN DebugProcLogin FROM ASYMMETRIC KEY DebugProcKey
GRANT UNSAFE ASSEMBLY TO DebugProcLogin
Imported it into SQL Server:
USE [mydb]
CREATE ASSEMBLY SqlServerProject1 FROM
'C:\..\SqlServerProject1\bin\Debug\SqlServerProject1.dll'
WITH PERMISSION_SET = unsafe
CREATE FUNCTION dbo.Debug( @message as nvarchar(200) )
RETURNS int
AS EXTERNAL NAME SqlServerProject1.[StoredProcedures].Debug
Then I was able to log in T-SQL procedures using
exec Debug @message = 'Hello World'
Get login name in SQL Server
execute as login = 'LCF\jmp'
select distinct name
from sys.login_token
where principal_id > 0
and type = 'WINDOWS GROUP';
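If all you need is the current login's name rather than its group memberships, the built-in functions SUSER_SNAME() and ORIGINAL_LOGIN() may be simpler:

```sql
SELECT SUSER_SNAME()    AS current_login,   -- login for the current security context
       ORIGINAL_LOGIN() AS original_login;  -- login that opened the connection,
                                            -- unaffected by EXECUTE AS
```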
Where are SQL Server connection attempts logged?
You can enable connection logging. For SQL Server 2008, you can enable Login Auditing: in SQL Server Management Studio, open the server's Properties dialog, go to the Security page, and under Login auditing select "Both failed and successful logins".
Make sure to restart the SQL Server service.
Once you've done that, connection attempts will be logged in SQL Server's error log. The physical location of the log files depends on your installation; by default it is the instance's MSSQL\Log directory.
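Once auditing is on, you can search the error log from T-SQL. xp_readerrorlog is undocumented, so treat the parameters below (log file number, log type, search string) as a sketch:

```sql
-- 0 = current log file, 1 = SQL Server error log (2 would be the SQL Agent log)
EXEC master.dbo.xp_readerrorlog 0, 1, N'Login failed';
```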
T-SQL Create User/Login not allowing connection
Make sure mixed mode authentication is enabled.
http://msdn.microsoft.com/en-us/library/ms188670(v=sql.105).aspx
You can easily check the authentication mode using
exec master.sys.xp_loginconfig 'login mode'
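Alternatively, SERVERPROPERTY reports the same thing: 'IsIntegratedSecurityOnly' returns 1 for Windows Authentication only and 0 for mixed mode:

```sql
SELECT CASE SERVERPROPERTY('IsIntegratedSecurityOnly')
           WHEN 1 THEN 'Windows Authentication only'
           WHEN 0 THEN 'Mixed mode (SQL Server and Windows Authentication)'
       END AS auth_mode;
```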
TSQL: TRY CATCH error handling with logging errors into a table
You have to be very careful with logging from CATCH blocks. First and foremost, you must check XACT_STATE() and honor it. If XACT_STATE() is -1 (an uncommittable, "doomed" transaction), you cannot do any transactional operation, so the INSERT will fail: you must first roll back, then insert. But you cannot simply roll back, because you may be in XACT_STATE() 0 (no transaction), in which case the rollback would fail. And if XACT_STATE() is 1, you are still inside the original transaction, so your INSERT may be rolled back later and you'll lose all track of this error ever occurring.
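A minimal sketch of a CATCH block that honors XACT_STATE() before logging (the dbo.ErrorLog table is hypothetical; capture ERROR_MESSAGE() before any rollback, since a rollback does not clear it but later statements can):

```sql
BEGIN CATCH
    DECLARE @msg nvarchar(2048) = ERROR_MESSAGE();

    IF XACT_STATE() = -1
        ROLLBACK TRANSACTION;   -- doomed transaction: must roll back before any writes

    -- Now XACT_STATE() is 0 or 1, so the INSERT is allowed. Note that in
    -- state 1 this row can still be undone by an outer rollback.
    INSERT INTO dbo.ErrorLog (LoggedAt, Message)
    VALUES (SYSUTCDATETIME(), @msg);
END CATCH;
```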
Another approach to consider is to generate a user-defined profiler event using sp_trace_generateevent and have a system trace monitoring your user event ID. This works in any XACT_STATE() and has the advantage of keeping the record even if the encompassing transaction rolls back later.
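As a sketch, sp_trace_generateevent takes a user-configurable event ID in the range 82-91 plus optional info/data payloads (the message text here is illustrative):

```sql
-- Fires UserConfigurable:0 (event ID 82); a trace subscribed to that
-- event ID will record this even if the current transaction rolls back.
EXEC sp_trace_generateevent
    @eventid  = 82,
    @userinfo = N'Error logged from CATCH block in dbo.MyProc';
```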
As for "I should mention that I always set XACT_ABORT": stop doing this. Read Exception handling and nested transactions for a good SP pattern vis-a-vis error handling and transactions.
T-SQL Transaction Logging Issue
First, when you say

"Each night I use SSIS to pull in a large amount of records from our master database into my database"

I presume this does not literally mean the system master database. If it does: there should not be user objects in it, and it should not be processing or storing any user data. If you are doing that, migrate it to a user database.
Second, the short answer to the logging problem is this: the default recovery model for a database is Full. As noted in the comments, this means that no log records are overwritten until a log backup occurs. This model enables point-in-time recovery and makes sense for a transactional system. So there are a few options:
- If this is a reporting system, it usually makes sense to put the database into the simple recovery model and do nightly backups, especially if the data only changes once per day.
- If you do require up-to-the-minute point-in-time recovery, log backups should be performed every 15 minutes. I'm guessing that the management and retention of these backups will make no sense for you, and you should use option 1.
- When processing data in an SSIS data flow, each buffer is committed one at a time. If you have not fiddled with the defaults, this means you are committing at most 10k rows per commit, so everything is already getting nicely chunked up. The problem, then, is not that the batch sizes are too great; it is that you are in the wrong recovery model or that you are not backing up your logs often enough.
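For option 1, switching the recovery model and taking the nightly backup looks like this (database name and backup path are illustrative):

```sql
ALTER DATABASE ReportingDb SET RECOVERY SIMPLE;

-- Nightly full backup; WITH INIT overwrites the previous backup set
BACKUP DATABASE ReportingDb
TO DISK = N'D:\Backups\ReportingDb.bak'
WITH INIT;
```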