Friday, November 13, 2009

SQL Server Recompile Bad Query Plans Based on Thresholds

Parameter sniffing is just a way of life with SQL Server (any RDBMS, for that matter), and the options for handling it are fairly limited. One solution is to hint the query plan toward the parameter values you expect. This works as long as your dataset and traffic are fairly static, or grow in a linear fashion. For example, if 95% of your calls are for Washington State, where 95% of your data resides, then hinting the query for Washington State is not such a bad idea. However, if your dataset shifts to represent a more balanced US customer base, the hint in place might run into issues. Even worse, say the ratio of data to traffic doesn't change, but there is a press release in NY: 95% of your data is still in Washington, yet 90% of your traffic is looking at NY data, and the hint optimizing for Washington will blow you up.
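As a sketch of that hinting approach (the sproc, table, and column names here are hypothetical), an OPTIMIZE FOR hint pins the plan to the Washington case no matter what value gets sniffed:

create procedure dbo.GetOrdersByState
    @State char(2)
as
begin
    select OrderId, OrderDate
    from dbo.Orders
    where State = @State
    --Compile the plan as if @State were always 'WA',
    --regardless of the value sniffed on first execution
    option (optimize for (@State = 'WA'))
end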

The more common approach is recompiling your queries; this is the most used tool in a DBA's toolbox. The issue here is that a bad plan is too often caught only after plenty of damage is done. Depending on the robustness of your application and database monitoring, a bad query can run for hours or days. The workaround is to write your code using the with recompile option, which recompiles a sproc on every execution. This approach works great if your sproc is called fewer than 4 times a minute; for frequently called sprocs, however, it can cause more harm than good, because the recompile will often take more resources and time than just reusing the plan, which should be good 99% of the time. To address this issue I've written the script below.
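For reference, the sproc-level option looks like this (the same hypothetical sproc as above); the trade-off is exactly the compile-per-execution cost just described:

create procedure dbo.GetOrdersByState
    @State char(2)
with recompile --throw the plan away and build a new one on every execution
as
begin
    select OrderId, OrderDate
    from dbo.Orders
    where State = @State
end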


Script Explained

This leverages the sys.dm_exec_query_stats and sys.dm_exec_sql_text DMVs, looking at the maximum number of reads performed. If that threshold is hit, it runs a recompile and then alerts the DBAs that a recompile has occurred. Using reads is specific to an issue we've seen; worker time could be a better indicator of a bad query (see the variant sketched after the script). We've scheduled it to run every 15 minutes through SQL Agent, along the lines of the job sketch below. One last thing: it uses a global temp table rather than a local temp table, which works around some context-switching issues with the mail sproc: http://www.dbafieldnotes.com/2009/11/sql-server-using-spsenddbmail-and-temp.html
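Here is a sketch of that 15-minute SQL Agent setup. The job name, database name, and the assumption that the script below is wrapped in a sproc called usp_RecompileOnThreshold are all hypothetical:

use msdb
exec dbo.sp_add_job @job_name = 'Recompile threshold check'
exec dbo.sp_add_jobstep
    @job_name = 'Recompile threshold check',
    @step_name = 'Run threshold check',
    @subsystem = 'TSQL',
    @database_name = 'YourDatabase', --assumption: wherever the script lives
    @command = 'exec dbo.usp_RecompileOnThreshold' --assumption: script wrapped in a sproc
exec dbo.sp_add_jobschedule
    @job_name = 'Recompile threshold check',
    @name = 'Every 15 minutes',
    @freq_type = 4, --daily
    @freq_interval = 1,
    @freq_subday_type = 4, --subday units of minutes
    @freq_subday_interval = 15
exec dbo.sp_add_jobserver @job_name = 'Recompile threshold check' --target the local server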


--Check existence: drop the global temp table if a prior run left it behind
if exists (select * from tempdb.sys.tables where name = '##QStats')
begin
    drop table ##QStats
end

--Declare variables
declare @objectname varchar(100)
declare @ReadThreshold int
declare @Reads int
declare @Sub varchar(255)
declare @bod varchar(500)
declare @QueryVar varchar(1000)

--Fill in specifics
set @objectname = 'sprocname'
set @ReadThreshold = 20000

--Some nice stats to send in the email
set @QueryVar =
'select plan_generation_num, creation_time, last_execution_time, execution_count, total_worker_time/execution_count as AVGWorkerTimeInMicrosecs, total_logical_reads/execution_count as AVGReads, total_logical_writes/execution_count as AVGWrites, total_elapsed_time/execution_count as AvgDurationInMicrosecs, total_clr_time/execution_count as AvgCLR from ##QStats'

--Put data into a global temp table since it is used again by the email query
select * into ##QStats
from sys.dm_exec_query_stats qs
cross apply sys.dm_exec_sql_text(qs.sql_handle)
where objectid = object_id(@objectname)

--Get the maximum logical reads recorded for any execution of the plan
select @Reads = max(max_logical_reads) from ##QStats

if @Reads > @ReadThreshold
begin

    --Build the subject and body here, after @Reads is populated
    --(setting @bod earlier would concatenate a NULL and wipe out the whole string)
    set @Sub = 'Sproc Recompiled for ' + @objectname
    set @bod = 'A recompile occurred on ' + @objectname + ' because the maximum reads were ' + convert(varchar(10), @Reads) + ' while the threshold is set to ' + convert(varchar(10), @ReadThreshold) + '. Other stats are attached.'

    --Run Recompile
    exec sp_recompile @objectname

    --Mail DBA's with info
    exec msdb.dbo.sp_send_dbmail
        @profile_name = 'DBMailProfile',
        @recipients = 'test@test.com',
        @subject = @Sub,
        @body = @bod,
        @importance = 'High',
        @query = @QueryVar,
        @attach_query_result_as_file = 1

end
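If worker time is the better signal for your workload, the reads check above can be swapped for average CPU per execution. A sketch, with an arbitrary example threshold (total_worker_time is reported in microseconds):

declare @WorkerTime bigint
declare @WorkerTimeThreshold bigint
set @WorkerTimeThreshold = 500000 --microseconds per execution; arbitrary example value

--Average CPU per execution rather than the worst-case reads
select @WorkerTime = max(total_worker_time / execution_count) from ##QStats

if @WorkerTime > @WorkerTimeThreshold
begin
    exec sp_recompile @objectname
end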



