Most Executed Stored Procedure

Use:

SELECT TOP 10 
qt.TEXT AS 'SP Name',
SUBSTRING(qt.text, (qs.statement_start_offset/2) + 1, CASE WHEN qs.statement_end_offset = -1 THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) ELSE (qs.statement_end_offset - qs.statement_start_offset)/2 + 1 END) AS actual_query,
qs.execution_count AS 'Execution Count',
qs.total_worker_time/qs.execution_count AS 'AvgWorkerTime',
qs.total_worker_time AS 'TotalWorkerTime',
qs.total_physical_reads AS 'PhysicalReads',
qs.creation_time 'CreationTime',
qs.execution_count/NULLIF(DATEDIFF(SECOND, qs.creation_time, GETDATE()), 0) AS 'Calls/Second'
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
WHERE qt.dbid = DB_ID('YourDatabaseName') -- avoids the deprecated sys.sysdatabases view
ORDER BY qs.execution_count DESC

Reference: SQL SERVER – 2005 – Find Highest / Most Used Stored Procedure

List of executed Stored Procedures and supplied parameters

Please try the following:

SELECT *
FROM sys.procedures pr
inner join sys.parameters p
on p.object_id = pr.object_id

You can add AND pr.name = 'MyProcName' to the WHERE clause to filter to a single procedure.
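Putting that together (a sketch; 'MyProcName' is a placeholder for your procedure's name):

```sql
-- List the parameters of a single procedure; 'MyProcName' is hypothetical
SELECT pr.name AS procedure_name,
       p.name AS parameter_name,
       TYPE_NAME(p.user_type_id) AS parameter_type
FROM sys.procedures pr
INNER JOIN sys.parameters p
    ON p.object_id = pr.object_id
WHERE pr.name = 'MyProcName';
```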

If you also want the last execution time, use the following:

SELECT last_execution_time, *
FROM sys.procedures pr
INNER JOIN sys.parameters p
    ON p.object_id = pr.object_id
INNER JOIN sys.objects b
    ON pr.object_id = b.object_id
LEFT JOIN sys.dm_exec_procedure_stats a
    ON a.object_id = pr.object_id

Quickest way to identify most used Stored Procedure variation in SQL Server 2005

This will give you the top 50 most used procs and the statements in the procs, from here: Display the 50 most used stored procedures in SQL Server

SELECT TOP 50 *
FROM (SELECT COALESCE(OBJECT_NAME(s2.objectid), 'Ad-Hoc') AS ProcName,
        execution_count, s2.objectid,
        (SELECT TOP 1 SUBSTRING(s2.TEXT, statement_start_offset / 2 + 1,
            ((CASE WHEN statement_end_offset = -1
                THEN (LEN(CONVERT(NVARCHAR(MAX), s2.TEXT)) * 2)
                ELSE statement_end_offset END) - statement_start_offset) / 2 + 1)) AS sql_statement,
        last_execution_time
    FROM sys.dm_exec_query_stats AS s1
    CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2) x
WHERE sql_statement NOT LIKE 'SELECT * FROM(SELECT coalesce(object_name(s2.objectid)%'
    AND OBJECTPROPERTYEX(x.objectid, 'IsProcedure') = 1
    AND EXISTS (SELECT 1 FROM sys.procedures s
        WHERE s.is_ms_shipped = 0
            AND s.name = x.ProcName)
ORDER BY execution_count DESC

Visit that link if you only need the proc names, but I think this is the better query since it also gives you the statements inside the procs.

Stored Procedure is taking time in execution

I am guessing that you get a bad execution plan when the procedure is called from code. When you run the stored procedure yourself, you get a different plan, because by default SSMS uses different SET options (notably ARITHABORT), so the plan cache key differs.

See http://www.sommarskog.se/query-plan-mysteries.html for possible fixes and a full explanation.

If your procedure is parameter-sensitive, it may make sense to compile on every execution by adding OPTION (RECOMPILE) to the affected statement.
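As a sketch (the procedure, table, and column names here are made up):

```sql
-- Hypothetical example: OPTION (RECOMPILE) makes SQL Server build a fresh
-- plan for the current parameter value instead of reusing a sniffed one
CREATE PROCEDURE dbo.GetOrdersByStatus @Status int
AS
BEGIN
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE Status = @Status
    OPTION (RECOMPILE);
END;
```

The trade-off is a compile on every call, so reserve this for statements where the plan genuinely needs to vary with the parameter.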

SQL: Stored procedure takes 20/25 seconds to run while the exact same select is almost instant running in a query window

Thanks for updating.

You can always try the index suggestions - it should be very simple to benchmark if it helps or not. My immediate thought is that neither will help, but practising how to benchmark in a controlled manner would be beneficial. Reading this question, we can tell that you've already tried numerous different things, and it's not really clear which changes helped the procedure behave more like the standalone, which changes improved both the stored procedure and the standalone and which made no difference but were kept anyway.

You haven't provided the actual execution plan that you want this to behave like, but let's see how much we can improve it.

Your time is mainly going towards the nested loops of the index seek of USER_ACCESS_MARKETS. This takes so long because it runs 539K times when SQL Server thought it would only execute 55 times. This bad estimate may have started with the join between ORDERS_WINDOW_CATALOGUES and CATALOGUES: the optimizer believes this will produce 3 rows but it really returns 119 (the difference snowballs as joins are added). There are quite a few filters at play in this join. Two are in the main join conditions, which look to be the clustered index (and should be okay); then there are the c.IS_HISTORIC <> 1 and c.FK_SITE = @FK_SITE_2 filters, which are probably responsible for the low cardinality estimate. My guess is that c.FK_SITE = @FK_SITE_2 is the most important here; I would suggest you look at what this join looks like in your standalone version, where sniffing of the FK_SITE_2 variable will be different.

To address this cardinality, you could create a temp table for CATALOGUES, like what you've already done with markets_catalogues. This would work if there is an efficient way of finding the rows that match these other filters. It gives SQL Server a chance to get estimates after populating the temp table, correcting its estimate for the FK_SITE_2 filter.

SELECT * INTO #CATALOGUES_tmp FROM CATALOGUES WHERE FK_SITE = @FK_SITE_2 and IS_HISTORIC <> 1

Instead of addressing the cardinality estimates (there look to be plenty of sources of issues here and you will need to address plenty), you could just target the nested loops of USER_ACCESS_MARKETS by using a temp table so that it has to be hash joined:

SELECT * INTO #USER_ACCESS_MARKETS_tmp FROM USER_ACCESS_MARKETS WHERE PFK_ENTERPRISE = @PFK_ENTERPRISE_2 and (PFK_USER = @PK_USER_2 OR @PK_USER_2 IS NULL)

Or hint the join: tell SQL Server it should just hash join to USER_ACCESS_MARKETS:

inner hash join USER_ACCESS_MARKETS uam on
m.PFK_ENTERPRISE = uam.PFK_ENTERPRISE AND m.PK_MARKET = uam.PFK_MARKET

Those temp tables might cause other cardinality misestimates to start influencing the plan more so you'll need to take stock with the actual execution plan with each modification - see where the time is now going, see where the estimates are coming from.

Further optimization could come from noticing that it spends about a second aggregating 540K rows back down to 119. This suggests that changing the joins to semi-joins, or forcing a DISTINCT earlier to cut the unnecessary rows coming through, could help. If you could arrange for the slow nested loop join to USER_ACCESS_MARKETS to run only a fraction of the time, that would also bring huge benefits (and it could then remain a nested loop).

I've subqueried part of the statement and placed a DISTINCT in just before the join to USER_ACCESS_MARKETS. It's hard to predict how much impact this will have, as I don't know anything about the PFK_MARKET column in markets_catalogues (the only additional column you care about from this join), but there's potential:

ALTER PROCEDURE [dbo].[BL_GET_OW_AND_CATALOGUES_BY_SITE_FOR_ACTUAL_POSITION] 
@PFK_ENTERPRISE int,
@FK_SITE int,
@PK_USER int
WITH RECOMPILE
AS
SET ARITHABORT ON;

DECLARE @PFK_ENTERPRISE_2 int = @PFK_ENTERPRISE
DECLARE @FK_SITE_2 int = @FK_SITE
DECLARE @PK_USER_2 int = @PK_USER


SELECT sq.PK_ORDER_WINDOW, sq.OW_DESCRIPTION
FROM (
SELECT DISTINCT ow.PK_ORDER_WINDOW, ow.OW_DESCRIPTION, m.PFK_ENTERPRISE, m.PK_MARKET
FROM ORDERS_WINDOW ow
inner join ORDERS_WINDOW_CATALOGUES owc on ow.PFK_ENTERPRISE = owc.PFK_ENTERPRISE
and ow.PK_ORDER_WINDOW = owc.PFK_ORDER_WINDOW
inner join CATALOGUES c on owc.PFK_ENTERPRISE = c.PFK_ENTERPRISE
and owc.PFK_CATALOGUE = c.PK_CATALOGUE
inner join (SELECT * FROM markets_catalogues WHERE pfk_enterprise = @PFK_ENTERPRISE_2 and is_Active = 1 and FK_CATALOGUE_SETUP > 0) mc on
owc.PFK_ENTERPRISE = mc.PFK_ENTERPRISE and owc.PFK_CATALOGUE = mc.PFK_CATALOGUE
inner join MARKET m on
m.PFK_ENTERPRISE = mc.PFK_ENTERPRISE and m.PK_MARKET = mc.PFK_MARKET
WHERE (ow.PFK_ENTERPRISE = @PFK_ENTERPRISE_2) AND (ow.FK_ORDER_WINDOW_STATUS IN (1,2,3,5,6))
AND (c.IS_MAIN_CATALOG_RELATED = 1 OR c.FK_CATALOG_RELATED = 0)
AND c.FK_SITE = @FK_SITE_2
AND (c.IS_HISTORIC <> 1)
) sq
inner join USER_ACCESS_MARKETS uam on
sq.PFK_ENTERPRISE = uam.PFK_ENTERPRISE AND sq.PK_MARKET = uam.PFK_MARKET
AND (uam.PFK_USER = @PK_USER_2 OR @PK_USER_2 IS NULL)
inner join USERS u ON
uam.PFK_ENTERPRISE = u.PFK_ENTERPRISE AND uam.PFK_USER = u.PK_USER
group by sq.PK_ORDER_WINDOW, sq.OW_DESCRIPTION
ORDER BY sq.OW_DESCRIPTION ASC

I've also gone back to the version of the query without the temp table for markets_catalogues, you should try with and without putting the data in the temp table first.

I wonder if any of these plans look like what you are getting with the standalone statement.

Is it possible to grant a stored procedure execution rights that the user executing it does not have

As I mentioned in the comments, I would suggest using some basic logging.
Firstly, let's set up the tables that would be needed with minimal columns:

CREATE TABLE dbo.ExecutionLimit (ProcedureSchema sysname NOT NULL,
ProcedureName sysname NOT NULL,
UserName sysname NOT NULL,
ExecutionLimit int NOT NULL CONSTRAINT CK_ExecutionLimitMin CHECK (ExecutionLimit > 0),
ExecutionTimeFrame int NOT NULL CONSTRAINT CK_ExecutionTimeFrameMin CHECK (ExecutionTimeFrame > 0), --This is in minutes, but you could use something else
CONSTRAINT PK_ExecutionLimit_ProcedureUser PRIMARY KEY CLUSTERED(ProcedureSchema, ProcedureName, UserName));
GO

CREATE TABLE dbo.ProcedureExecution (ProcedureSchema sysname NOT NULL,
ProcedureName sysname NOT NULL,
UserName sysname NOT NULL,
ExecutionTime datetime2(1) NOT NULL CONSTRAINT DF_ExecutionTime DEFAULT SYSDATETIME());
CREATE CLUSTERED INDEX CI_ProcedureExecution ON dbo.ProcedureExecution (ExecutionTime,ProcedureSchema,ProcedureName,UserName);

Indexing is going to be important here if you want a performant solution, as well as some kind of cleanup process if you need it.

Then I'm going to create a couple of USERs and give one of them an execution limit (note that the procedure isn't created yet, but that's fine here):

CREATE USER SomeUser WITHOUT LOGIN;
CREATE USER AnotherUser WITHOUT LOGIN;
GO
INSERT INTO dbo.ExecutionLimit (ProcedureSchema,ProcedureName,UserName,ExecutionLimit,ExecutionTimeFrame)
VALUES(N'dbo',N'SomeProcedure',N'SomeUser',5, 60); --No more than 5 times in an hour

So SomeUser can only run the procedure 5 times within an hour interval, but AnotherUser can run it as often as they want (due to having no entry).

Now for the procedure. Here, you'll want to use an EXISTS to check whether too many executions have been done within the timeframe. As I mentioned, if too many executions have occurred then I would THROW an error; I just use a generic one here, but you may want more complex logic. Note that I use ROWLOCK to stop multiple simultaneous executions pushing over the limit; if this isn't likely to occur, you can remove the hint.

Then, after the check, I INSERT a row into the log, and COMMIT, so that the ROWLOCK is released. Then your procedure code can go afterwards.

CREATE PROC dbo.SomeProcedure AS
BEGIN

SET NOCOUNT ON;
SET XACT_ABORT ON;
BEGIN TRANSACTION;

IF EXISTS (SELECT 1
FROM dbo.ExecutionLimit EL
--Using ROWLOCK to stop simultaneous executions, this is optional
JOIN dbo.ProcedureExecution PE WITH (ROWLOCK) ON EL.ProcedureSchema = PE.ProcedureSchema
AND EL.ProcedureName = PE.ProcedureName
AND EL.UserName = PE.UserName
AND DATEADD(MINUTE,-EL.ExecutionTimeFrame,SYSDATETIME()) <= PE.ExecutionTime
WHERE EL.UserName = USER_NAME()
AND EL.ProcedureSchema = N'dbo'
AND EL.ProcedureName = N'SomeProcedure'
GROUP BY EL.ExecutionLimit --Needs to be, or will error
HAVING COUNT(PE.ExecutionTime) >= EL.ExecutionLimit) BEGIN

DECLARE @Message nvarchar(2047);
SELECT @Message = FORMATMESSAGE(N'The maximum number of executions (%i) within your allotted timeframe (%i minutes) has been reached. Please try again later.', EL.ExecutionLimit, EL.ExecutionTimeFrame)
FROM dbo.ExecutionLimit EL
WHERE EL.UserName = USER_NAME()
AND EL.ProcedureSchema = N'dbo'
AND EL.ProcedureName = N'SomeProcedure';

THROW 62462, @Message, 16;
END;

INSERT INTO dbo.ProcedureExecution (ProcedureSchema, ProcedureName, UserName)
VALUES(N'dbo',N'SomeProcedure',USER_NAME());
COMMIT;

--Do the stuff
PRINT N'Congratulations! You have run the procedure! :)'; --Obviously this wouldn't be in there.

END;
GO

You can then test (and clean up) the setup with the following:

GRANT EXECUTE ON dbo.SomeProcedure TO SomeUser,AnotherUser;

GO

EXECUTE AS USER = 'SomeUser';
GO

EXECUTE dbo.SomeProcedure;

GO 6

REVERT;
GO

EXECUTE AS USER = 'AnotherUser';
GO

EXECUTE dbo.SomeProcedure;

GO 6

REVERT;
GO

DROP TABLE dbo.ExecutionLimit;
DROP TABLE dbo.ProcedureExecution;
DROP PROC dbo.SomeProcedure;

GO
DROP USER SomeUser;
DROP USER AnotherUser;

If this is something you need in a lot of procedures (and the design I have here allows this) you might find it better to use a procedure to check, and THROW the error:

CREATE PROC dbo.CheckExecutions @Username sysname, @ProcedureSchema sysname, @ProcedureName sysname AS
BEGIN

SET NOCOUNT ON;
SET XACT_ABORT ON;
BEGIN TRANSACTION;

IF EXISTS (SELECT 1
FROM dbo.ExecutionLimit EL
--Using ROWLOCK to stop simultaneous executions, this is optional
JOIN dbo.ProcedureExecution PE WITH (ROWLOCK) ON EL.ProcedureSchema = PE.ProcedureSchema
AND EL.ProcedureName = PE.ProcedureName
AND EL.UserName = PE.UserName
AND DATEADD(MINUTE,-EL.ExecutionTimeFrame,SYSDATETIME()) <= PE.ExecutionTime
WHERE EL.UserName = @Username
AND EL.ProcedureSchema = @ProcedureSchema
AND EL.ProcedureName = @ProcedureName
GROUP BY EL.ExecutionLimit --Needs to be, or will error
HAVING COUNT(PE.ExecutionTime) >= EL.ExecutionLimit) BEGIN

DECLARE @Message nvarchar(2047);
SELECT @Message = FORMATMESSAGE(N'The maximum number of executions (%i) within your allotted timeframe (%i minutes) has been reached on the procedure ''%s.%s''. Please try again later.', EL.ExecutionLimit, EL.ExecutionTimeFrame, EL.ProcedureSchema, EL.ProcedureName)
FROM dbo.ExecutionLimit EL
WHERE EL.UserName = @Username
AND EL.ProcedureSchema = @ProcedureSchema
AND EL.ProcedureName = @ProcedureName;

THROW 62462, @Message, 16;
END;

INSERT INTO dbo.ProcedureExecution (UserName, ProcedureSchema, ProcedureName)
VALUES(@UserName, @ProcedureSchema, @ProcedureName);
COMMIT;
END
GO

CREATE PROC dbo.SomeProcedure AS
BEGIN

SET NOCOUNT ON;
SET XACT_ABORT ON;
DECLARE @UserName sysname = USER_NAME();
EXEC dbo.CheckExecutions @UserName, N'dbo', N'SomeProcedure';

--Do the stuff
PRINT N'Congratulations! You have run the procedure! :)'; --Obviously this wouldn't be in there.

END;
GO

Executing stored procedure takes too long than executing TSQL

It's caused by a suboptimal plan being used.
You mention that the stored procedure has parameters; I've had similar issues due to 'parameter sniffing'.

The quickest check to see if this is the issue is to copy the input parameters into local variables inside the SP, then use only the local variables.

This stops, for example, optimisation for certain parameter values at the expense of others.

I've seen this before in a stored procedure with int parameters, where certain parameter values changed the control flow (as well as how the queries were executed).
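The local-variable workaround described above can be sketched like this (the procedure, table, and column names are made up):

```sql
-- Hypothetical sketch of the local-variable workaround: the optimizer
-- cannot sniff @LocalStatus at compile time, so it uses an average-density
-- estimate instead of a plan tailored to the first caller's value
CREATE PROCEDURE dbo.SearchOrders @Status int
AS
BEGIN
    DECLARE @LocalStatus int = @Status;

    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE Status = @LocalStatus;
END;
```

Note this cuts both ways: you lose the bad sniffed plan, but also any benefit a value-specific plan would have given.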

Different execution count for queries in a stored procedure

The Execution_Count field is defined as:

"Number of times that the plan has been executed since it was last compiled."

That would suggest that on the fourth run some of the plans were recompiled. I would suspect this happened because the original plans fell out of the cache.

See https://msdn.microsoft.com/en-us/library/ms189741.aspx
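One way to see when each cached statement's plan was last compiled, and how many executions it has accumulated since then, is to look at creation_time in sys.dm_exec_query_stats (a sketch; add your own filters):

```sql
-- A recent creation_time paired with a low execution_count suggests
-- the statement's plan was recently recompiled or re-cached
SELECT qs.creation_time,
       qs.execution_count,
       st.text AS batch_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.creation_time DESC;
```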


