In response to the composite indexes post, a reader asked how to capture the XML plans using Profiler so that the plans can be evaluated later on.
First, a word of caution: please do not run SQL Profiler directly on the production server. It can generate a lot of load, and it is always advisable to run it from a separate machine. Having said that, let's take a look at the bare minimum events and columns we can use to capture the XML plans in a production environment. We will create a new Profiler template for this. Open up SQL Profiler and select File\Templates\New Template. Name the new template XML_PLAN_CAPTURE. Then go to the Events tab and, under the Performance section, select "Showplan XML". This event by itself is good enough, but we should also select these events to help with the analysis:
- SQL:BatchStarting (under the TSQL section)
- SQL:BatchCompleted (under the TSQL section)
- RPC:Completed (under the Stored Procedures section)
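If you later want to run the same capture as a server-side trace (which is lighter on a production box than the Profiler GUI), the events above map to trace event IDs that can be scripted with the sp_trace_* stored procedures. Below is a minimal sketch; the file path is a placeholder, and the event/column IDs shown can be verified against the sys.trace_events and sys.trace_columns catalog views:

```sql
-- Sketch of a server-side trace capturing the same events as the
-- XML_PLAN_CAPTURE template. The output path is a placeholder.
DECLARE @TraceID INT
DECLARE @maxfilesize BIGINT
DECLARE @on BIT
SET @maxfilesize = 512   -- MB per trace file
SET @on = 1

-- 2 = TRACE_FILE_ROLLOVER; SQL Server appends .trc to the file name
EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\Traces\xml_plan_capture', @maxfilesize

-- Event IDs: 122 = Showplan XML, 13 = SQL:BatchStarting,
--            12 = SQL:BatchCompleted, 10 = RPC:Completed
-- Column IDs: 1 = TextData, 12 = SPID, 14 = StartTime
EXEC sp_trace_setevent @TraceID, 122, 1,  @on   -- TextData holds the XML plan
EXEC sp_trace_setevent @TraceID, 122, 12, @on
EXEC sp_trace_setevent @TraceID, 122, 14, @on
EXEC sp_trace_setevent @TraceID, 13,  1,  @on
EXEC sp_trace_setevent @TraceID, 13,  12, @on
EXEC sp_trace_setevent @TraceID, 12,  1,  @on
EXEC sp_trace_setevent @TraceID, 12,  12, @on
EXEC sp_trace_setevent @TraceID, 10,  1,  @on
EXEC sp_trace_setevent @TraceID, 10,  12, @on

EXEC sp_trace_setstatus @TraceID, 1   -- 1 = start the trace
SELECT @TraceID AS TraceID            -- keep this ID to stop/close the trace later
```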
You can then set up your filters and organize the columns. One thing we always like to do is move EventClass under Groups so that the data is grouped by EventClass. Depending on what you are trying to analyze, you can also group by DatabaseName, ApplicationName, StartTime, SPID, and EventClass. That way, for a given database and application, in sequential order of StartTime, all the data for a given SPID is captured grouped by the different events. That can make analysis a bit simpler.
As far as filters go, you can set SPID greater than 50 to eliminate system events, and under ApplicationName exclude Profiler (%Profiler%), Management Studio (%Management%), and any other applications you do not wish to trace. In addition, you can add a LoginName filter if you have separate logins for connecting to the instance, as well as a DatabaseName filter if you wish to trace only a particular database. Additional filters are very specific to the situation, such as capturing only those statements with a duration of more than, say, 3 seconds (do note that in the Profiler GUI the duration filter is in milliseconds, so choose the number accordingly).
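The same filters can be expressed server-side via sp_trace_setfilter. A sketch, assuming @TraceID is the ID of a trace already created with sp_trace_create (the ID value below is a placeholder):

```sql
DECLARE @TraceID INT
SET @TraceID = 2   -- placeholder: use the ID returned by sp_trace_create

-- sp_trace_setfilter: @traceid, @columnid, @logical_operator (0 = AND, 1 = OR),
--                     @comparison_operator (2 = greater than, 7 = NOT LIKE), @value
-- Column IDs: 12 = SPID, 10 = ApplicationName, 13 = Duration
EXEC sp_trace_setfilter @TraceID, 12, 0, 2, 50             -- SPID > 50
EXEC sp_trace_setfilter @TraceID, 10, 0, 7, N'%Profiler%'  -- exclude Profiler itself
EXEC sp_trace_setfilter @TraceID, 10, 0, 7, N'%Management%'

-- Note: server-side traces on SQL Server 2005 and later filter Duration in
-- microseconds (the Profiler GUI shows milliseconds), so 3 seconds = 3000000.
DECLARE @duration BIGINT
SET @duration = 3000000
EXEC sp_trace_setfilter @TraceID, 13, 0, 2, @duration      -- Duration > 3 seconds
```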
Once done, save it as a template and it will be available the next time you want to use it. You can also export it as a *.tdf file and send it to team members for their own use – they just need to double-click it to register it.
Let's run that trace now. We ran these statements on our healthcare database:
SELECT TOP 10 * FROM dbo.pat
SELECT TOP 10 * FROM dbo.insurance

SET NOCOUNT ON
BEGIN TRY
    BEGIN TRANSACTION
    UPDATE dbo.insurance
    SET ins_address2 = 'test'
    WHERE ins_number = 2
    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION
    SELECT
        ERROR_NUMBER() AS ErrorNumber,
        ERROR_SEVERITY() AS ErrorSeverity,
        ERROR_STATE() AS ErrorState,
        ERROR_PROCEDURE() AS ErrorProcedure,
        ERROR_LINE() AS ErrorLine,
        ERROR_MESSAGE() AS ErrorMessage;
END CATCH
You can then right-click a specific XML execution plan, extract the event data, and save it with a *.sqlplan extension so that the execution plan can later be viewed graphically or as plain XML.
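If the trace was saved to a file, you do not have to extract the plans one at a time through the GUI; the fn_trace_gettable function can read the trace file back as a table. A sketch, with a placeholder path (122 is the Showplan XML event class):

```sql
-- Read the captured XML plans out of a saved trace file.
-- The path below is a placeholder; point it at your trace output file.
SELECT
    t.StartTime,
    t.SPID,
    t.DatabaseName,
    CONVERT(XML, t.TextData) AS ShowplanXML  -- clickable as a graphical plan in SSMS
FROM fn_trace_gettable(N'D:\Traces\xml_plan_capture.trc', DEFAULT) AS t
WHERE t.EventClass = 122   -- Showplan XML
ORDER BY t.StartTime
```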
One thing we forgot to mention before: when setting up a trace using this template, or any trace that includes an XML event, you will see a third tab at the top called "Events Extraction Settings". Since we were tracing XML events, we have the option of saving the execution plans to a separate file with the *.sqlplan extension. See below:
You can then run the trace, and you will see that all the execution plans have been captured in that output file. If you double-click that file, it opens in SQL Server Management Studio as a graphical execution plan. See below:
So, using these simple steps, you can capture the XML showplans, see what the runtime execution plans were, and analyze what went wrong and which SQL statements need to be tuned. You can also schedule the SQL Profiler trace by using the server-side stored procedures via a script – we will cover that, and how to schedule such scripts to capture the relevant data, in an upcoming post.