Option (Recompile) Is Always Faster; Why

OPTION (RECOMPILE) is Always Faster; Why?

There are times that using OPTION(RECOMPILE) makes sense. In my experience, the only time this is a viable option is when you are using dynamic SQL. Before you explore whether this makes sense in your situation, I would recommend updating your statistics. This can be done by running the following:

EXEC sp_updatestats

And then recreating your execution plan. This will ensure that when your execution plan is created, it is using the latest information.
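If a database-wide sp_updatestats is heavier than you need, the same idea can be applied more narrowly; a minimal sketch, assuming hypothetical object names (dbo.Orders, dbo.usp_OrdersByCustomer):

-- Refresh statistics for one table with a full scan:
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Mark a procedure (or table) so its cached plans are recompiled on next use:
EXEC sp_recompile N'dbo.usp_OrdersByCustomer';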

Adding OPTION(RECOMPILE) rebuilds the execution plan every time your query executes. I have never heard that described as "creating a new lookup strategy", but maybe we are just using different terms for the same thing.

When a stored procedure is created (I suspect you are calling ad-hoc SQL from .NET, but if you are using a parameterized query then it ends up being treated much like a stored procedure call), SQL Server attempts to determine the most effective execution plan for the query based on the data in your database and the parameters passed in (parameter sniffing), and then caches this plan. This means that if you create the query when there are 10 records in your database and then execute it when there are 100,000,000 records, the cached execution plan may no longer be the most effective.
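A minimal sketch of how that plays out, assuming a hypothetical dbo.Orders table and procedure (neither is from the original question):

-- Hypothetical procedure; the plan is compiled for whichever @CustomerId is seen first.
CREATE OR ALTER PROCEDURE dbo.usp_OrdersByCustomer @CustomerId INT
AS
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
GO

EXEC dbo.usp_OrdersByCustomer @CustomerId = 1;   -- plan sniffed and cached for a rare value
EXEC dbo.usp_OrdersByCustomer @CustomerId = 42;  -- reuses that plan, even if 42 matches millions of rows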

In summary - I don't see any reason that OPTION(RECOMPILE) would be a benefit here. I suspect you just need to update your statistics and your execution plan. Rebuilding statistics can be an essential part of DBA work depending on your situation. If you are still having problems after updating your stats, I would suggest posting both execution plans.

And to answer your question - yes, I would say it is highly unusual for your best option to be recompiling the execution plan every time you execute the query.

option recompile faster when using local variables

Thanks to the responses I found the part of the documentation that explains why queries with local variables perform better when executed with "option (recompile)".

From the official documentation for recompile:

But in addition to using the procedure's current parameter values, the RECOMPILE query hint also uses the values of any local variables inside the stored procedure when you compile the statement.

That seems to be the difference from normal execution, where the values of local variables are unknown to the optimizer and are effectively ignored.

Ed Pollack on sqlshack.com

Because the value of local variables is not known until runtime, the query optimizer needs to make a very rough estimate of row counts prior to execution.

This would explain the observed behaviour.
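A small sketch of the difference, using a hypothetical dbo.Orders table and StatusId column:

DECLARE @Status INT = 5;

-- Compiled with @Status unknown: the optimizer falls back to an average-density estimate.
SELECT OrderId FROM dbo.Orders WHERE StatusId = @Status;

-- Compiled at execution time with @Status = 5 embedded: estimates come from the actual value.
SELECT OrderId FROM dbo.Orders WHERE StatusId = @Status OPTION (RECOMPILE);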

The only ways to work around this:

  • literal values (not a good option)
  • stored procedures
  • use OPTION (RECOMPILE)

If you use EF Core, the only option is injecting the recompile hint for certain slow queries.

My personal problem:

I took slow queries from "Azure Performance Insight" and used local variables to test them. Because of the "feature" described above, this resulted in a different query plan than expected. EF Core uses a parameterized query, which performs like a stored procedure. So in the end the covering index was actually being used on prod the whole time, just not by my badly estimated local test query.
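To test closer to what EF Core actually sends, the parameterized form can be replayed with sp_executesql instead of local variables; a sketch with placeholder table, column, and parameter names:

-- Local-variable test: the value is unknown at compile time, so the plan can differ from prod.
DECLARE @CustomerId INT = 42;
SELECT OrderId FROM dbo.Orders WHERE CustomerId = @CustomerId;

-- Parameterized replay, which is sniffed and cached much like the EF Core query:
EXEC sys.sp_executesql
    N'SELECT OrderId FROM dbo.Orders WHERE CustomerId = @CustomerId',
    N'@CustomerId INT',
    @CustomerId = 42;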

Additional sources:

Jes Schultz - Parameter Sniffing in SQL Server

Gail Shaw - How to Confuse the SQL Server Query Optimizer

Option Recompile makes query fast - good or bad?

Rather than answer the question you asked, here's what you should do:

Update your statistics:

EXEC sp_updatestats

If that doesn't work, rebuild indexes.

If that doesn't work, look at OPTIMIZE FOR.
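Sketches of those last two steps, with hypothetical index, table, and parameter names:

-- Rebuild a specific index (this also refreshes its statistics with a full scan):
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REBUILD;

-- Compile the plan for a representative value rather than the sniffed one:
DECLARE @CustomerId INT = 42;
SELECT OrderId
FROM dbo.Orders
WHERE CustomerId = @CustomerId
OPTION (OPTIMIZE FOR (@CustomerId = 1000));
-- ...or OPTION (OPTIMIZE FOR (@CustomerId UNKNOWN)) to optimise for an "average" value.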

Running SQL query with options(recompile) always faster

Option(recompile) generates a new plan based on the current statistics and the actual parameter values, at the expense of compiling a new plan on every execution. If your statistics are up to date then, 99 times out of 100, you will get an optimal plan, rather than a cached query plan that might not be appropriate for the particular set of parameters you passed in compared to the ones the plan was originally compiled for (this is termed parameter sniffing).

"Also what would be the drawbacks of leaving option(recompile) for
production code?"

That's fine for a reporting query, but would be a bad idea for a query running many times per second, since the cost of recompiling the plan each time could likely outweigh the cost of actually executing the query!

Also, be aware that using Option(recompile) has the drawback that the procedure won't show up in the relevant DMVs.
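For context, a minimal sketch of the kind of plan-cache DMV query such statements drop out of, since their plans are not cached:

-- Statements recompiled on every execution accumulate no history here.
SELECT TOP (10)
       qs.execution_count,
       qs.total_worker_time,
       st.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;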

Rule of thumb on when to use WITH RECOMPILE option

As others have said, you don't want to simply include WITH RECOMPILE in every stored proc as a matter of habit. By doing so, you'd be eliminating one of the primary benefits of stored procedures: the fact that it saves the query plan.

Why is that potentially a big deal? Computing a query plan is a lot more intensive than compiling regular procedural code. Because the syntax of a SQL statement only specifies what you want, and not (generally) how to get it, that allows the database a wide degree of flexibility when creating the physical plan (that is, the step-by-step instructions to actually gather and modify data). There are lots of "tricks" the database query pre-processor can do and choices it can make - what order to join the tables, which indexes to use, whether to apply WHERE clauses before or after joins, etc.

For a simple SELECT statement, it might not make a difference, but for any non-trivial query, the database is going to spend some serious time (measured in milliseconds, as opposed to the usual microseconds) to come up with an optimal plan. For really complex queries, it can't even guarantee an optimal plan, it has to just use heuristics to come up with a pretty good plan. So by forcing it to recompile every time, you're telling it that it has to go through that process over and over again, even if the plan it got before was perfectly good.
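In SQL Server you can see that compile cost directly by asking the server to report parse/compile time; a quick sketch (the query itself is just a placeholder):

SET STATISTICS TIME ON;
-- The Messages output includes a separate "SQL Server parse and compile time" line,
-- so the compilation cost is visible on every run.
SELECT COUNT(*) FROM sys.objects;
SET STATISTICS TIME OFF;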

Depending on the vendor, there should be automatic triggers for recompiling query plans - for example, if the statistics on a table change significantly (like, the histogram of values in a certain column starts out evenly distributed but over time becomes highly skewed), then the DB should notice that and recompile the plan. But generally speaking, the implementers of a database are going to be smarter about that on the whole than you are.
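In SQL Server specifically, you can check how much a table has drifted since its statistics were last updated; a sketch with a hypothetical table name:

-- Rows modified against each statistics object on dbo.Orders since its last update.
SELECT s.name, sp.last_updated, sp.rows, sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.Orders');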

As with anything performance related, don't take shots in the dark; figure out where the bottlenecks are that are costing 90% of your performance, and solve them first.

Option (recompile) speeds up query execution

This doesn't look like parameter sniffing.

The CompiledValue and RunTimeValue are the same for all parameters, even in the bad.sqlplan.

Rather, it looks as though you are benefiting from the Parameter Embedding Optimization.

The query text is cut off in the plan, but I can see places where you are comparing parameters with literals. Perhaps this is a catch-all query.

Sample Image

When you use OPTION (RECOMPILE) the plan compiled only has to work for the passed parameter values. SQL Server can look at the value of the parameters passed and effectively replace the highlighted expressions with TRUE or FALSE.

This potentially allows it to simplify out whole branches of the plan if they are not relevant to the parameter values being passed.

A simple example of this would be

EXEC sys.sp_executesql
N'SELECT * FROM master..spt_values WHERE @Param <> 0 OPTION (RECOMPILE)',
N'@Param INT',
@Param = 0;

The plan is compiled when @Param = 0 and only has to be correct for this value, so @Param <> 0 can be evaluated at compile time as false and the plan doesn't access the table at all.

Sample Image

Without the OPTION (RECOMPILE) you see this

EXEC sys.sp_executesql
N'SELECT * FROM master..spt_values WHERE @Param <> 0',
N'@Param INT',
@Param = 0;

Sample Image

Even though the sniffed parameter value and the runtime parameter value are the same the plan cannot be optimised to the same extent as it will be cached and still need to work if passed a different parameter value.
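For reference, a catch-all pattern like the one hinted at in the truncated query text might look roughly like this (hypothetical table and columns); it is exactly the shape that benefits from parameter embedding:

EXEC sys.sp_executesql
    N'SELECT OrderId, OrderDate
      FROM dbo.Orders
      WHERE (@CustomerId IS NULL OR CustomerId = @CustomerId)
        AND (@Status     IS NULL OR StatusId   = @Status)
      OPTION (RECOMPILE)',
    N'@CustomerId INT, @Status INT',
    @CustomerId = 42, @Status = NULL;
-- With RECOMPILE, the @Status branch folds away at compile time,
-- so only the CustomerId predicate shapes the plan.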

Can option WITH RECOMPILE be used to reduce timeouts from blocking?

OPTION WITH RECOMPILE forces SQL Server to recompile an execution plan even if there is an existing plan cached in memory.

If the underlying data changes dramatically and very quickly, the cached execution plan becomes less efficient. Therefore, in such a situation, using the WITH RECOMPILE option executes the procedure much faster than it would with the already compiled execution plan.
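For reference, a sketch of the two places the hint can go (object names are hypothetical):

-- Recompile on every execution of the procedure:
CREATE OR ALTER PROCEDURE dbo.usp_GetOrders @CustomerId INT
WITH RECOMPILE
AS
    SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId;
GO

-- ...or recompile for a single call only:
EXEC dbo.usp_GetOrders @CustomerId = 42 WITH RECOMPILE;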

My guess is that the developer/person experienced some long delays when he executed the stored procedure, and when he used the WITH RECOMPILE option it executed faster. So maybe he thought executing this procedure without the recompile option causes blocking. :) Funny story, but I think this is what happened.

Why does Thunderbird recompile faster after the first compile?

I have not compiled Thunderbird for some time, but it appears that there is an option to remove the existing build tree by running 'mach clobber' before you run 'mach build'. That should ensure that you lose all current build data and start fresh.

https://wiki.mozilla.org/MozillaBuild


