[EDIT 09/12] I have published another blog post in the series. Download the ready-made charts from Automated Reports for SQL Server Perfmon data.
If you are a DBA, database consultant, or similar, you may come across one or more of these questions:
i. How is my SQL Server instance currently performing (in terms of I/O, memory, and CPU versus the workload)?
ii. Has it reached its capacity? Can it take on additional workload/databases while still providing optimal performance?
iii. I'm setting up a new server; how should I determine the storage requirements and size the hardware for a given workload?
iv. How do I determine whether new hardware meets the storage requirements for a SQL Server workload in my environment?
All of these questions fall largely under capacity planning and/or trend analysis for SQL Server. To answer them, you need to collect and analyse data over a period of time. Once you have the data, you can answer the first two questions; for the remaining two, the data serves as a baseline.
For this purpose, I prefer to collect data from various performance monitor counters over a period of time, e.g. 1-2 days or a week. I was looking for a way to automate this data collection/analysis and present it as simply as possible. This post describes the methodology I use.
Overview of the steps:
I. Collect the performance counter data from the server.
II. Load the data into a SQL Server database.
III. Import the data from the SQL Server database into PowerPivot for SQL Server 2008 R2.
IV. Draw the pivot charts.
1. The first step is to set up a user-defined Data Collector Set in Windows Performance Monitor (Perfmon). Download the sample template here. It includes various performance counters according to my requirements; you may add/remove counters, change the output file path, the sampling interval, etc., as per your needs. Note: this is an XML template that works on Windows 7/2008 onwards.
2. Once you have collected the Perfmon data, you may have several files with the .blg extension, depending on how you configured the data collector set. The next step is to combine the multiple .blg files into a single .blg file using relog.exe.
For the rest of the steps, I'd use my client machine (not the SQL Server box itself). Let's say you have two .blg files, DataCollector_1.blg and DataCollector_2.blg, located at D:\PerfLogs\CP_Perfmon_Collector. Open a command prompt, change the directory to D:\PerfLogs\CP_Perfmon_Collector, and run the following command to combine the two files into a single file called DataCollector_Combined.blg:
D:\PerfLogs\CP_Perfmon_Collector>relog.exe DataCollector_1.blg DataCollector_2.blg -f bin -o DataCollector_Combined.blg
3. Now load this Perfmon data into a SQL Server database using the following steps:
i. Create a database named Perfmon_DB.
ii. Create a SQL Server ODBC System DSN named Perfmon_DSN and point it at Perfmon_DB as its default database. To do this:
a. Go to the Run prompt, type odbcad32.exe, and hit Enter. This opens the ODBC Data Source Administrator.
b. Go to the System DSN tab, click Add, and select SQL Server as the driver. Give it a name, e.g. Perfmon_DSN, fill in the rest of the details (SQL Server host and instance name), keep the defaults on the remaining screens, and click Finish. Make sure you change the default database to the database created above, i.e. Perfmon_DB in our case.
c. The newly created DSN should now be listed under ODBC Data Source Administrator > System DSN > System Data Sources.
iii. Import the combined .blg file into the SQL Server database, i.e. Perfmon_DB. Open a command prompt, navigate to the directory where the combined .blg file is located, and run the following command:
relog "D:\PerfLogs\CP_Perfmon_Collector\DataCollector_Combined.blg" -f SQL -o SQL:Perfmon_DSN!logfile
iv. Verify that the data imported successfully. Three tables should now be populated in the database Perfmon_DB: CounterData, CounterDetails, and DisplayToID.
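As a quick sanity check (a sketch, assuming the default schema relog creates for SQL output), you can count the imported rows and inspect the log's time range:

```sql
USE Perfmon_DB;

-- How many samples and distinct counters were imported
SELECT COUNT(*) AS SampleRows   FROM CounterData;
SELECT COUNT(*) AS CounterCount FROM CounterDetails;

-- One row per imported log, with its start and stop times
SELECT DisplayString, LogStartTime, LogStopTime
FROM DisplayToID;
```

If the row counts are zero, recheck the DSN's default database and rerun the relog import.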
Note: I’ve used SQL Server 2008 R2 for this purpose.
4. You may now analyse the Perfmon data by querying the above tables (see Querying Perfmon data from SQL for some sample queries). Move on to the next steps if you are interested in using PowerPivot to generate those beautiful reports for analysis.
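For instance, a quick ad-hoc query might summarise one counter across the whole collection (the Processor counter names below are just an example; substitute counters from your own collector set):

```sql
-- Average and peak values for one counter, per counter instance
SELECT cd.ObjectName, cd.CounterName, cd.InstanceName,
       AVG(d.CounterValue) AS AvgValue,
       MAX(d.CounterValue) AS MaxValue
FROM CounterData AS d
INNER JOIN CounterDetails AS cd
        ON d.CounterID = cd.CounterID
WHERE cd.ObjectName  = 'Processor'
  AND cd.CounterName = '% Processor Time'
GROUP BY cd.ObjectName, cd.CounterName, cd.InstanceName
ORDER BY cd.InstanceName;
```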
5. Download and install PowerPivot for SQL 2008 R2
6. Fire up Excel and launch the PowerPivot window from the PowerPivot menu.
7. In the PowerPivot window, under the Home tab, click From Database > From SQL Server and fill in the server\instance and database details to create a connection. In the next window, choose Write a query that will specify the data to import, click Next, and paste this query:
SELECT
    YEAR(CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))) AS [Year],
    MONTH(CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))) AS [Month],
    DAY(CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))) AS [Day],
    DATEPART(HH, CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))) AS [Hour],
    CounterDetails.ObjectName AS [ObjectName],
    CounterDetails.InstanceName AS [InstanceName],
    CounterDetails.CounterName AS [CounterName],
    AVG(CounterData.CounterValue) AS [CounterValue]
FROM CounterData
INNER JOIN CounterDetails ON CounterData.CounterID = CounterDetails.CounterID
INNER JOIN DisplayToID ON CounterData.GUID = DisplayToID.GUID
GROUP BY
    YEAR(CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))),
    MONTH(CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))),
    DAY(CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))),
    DATEPART(HH, CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))),
    CounterDetails.ObjectName,
    CounterDetails.InstanceName,
    CounterDetails.CounterName
Note: this query groups the data by hour. You may modify it according to your needs; since I'm interested in long-term analysis, I prefer to group on an hourly basis. Grouping also speeds up the import into PowerPivot and reduces its memory footprint. Remember, it runs an aggregate (OLAP-style) query behind the scenes, so it needs the corresponding system resources.
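For example, if you wanted finer, per-minute granularity instead, a sketch of the change would be to add a Minute column to the SELECT list (the same expression must also be repeated in the GROUP BY list):

```sql
-- Additional column for minute-level grouping (also add it to GROUP BY)
DATEPART(MI, CONVERT(DATETIME, CAST(CounterDateTime AS VARCHAR(19)))) AS [Minute]
```

Bear in mind that finer granularity means more rows to import and more memory used by PowerPivot.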
8. Validate the query and click Finish. It may take a little while to import the data into PowerPivot. Save the file.
9. Now go back to the Excel workbook and choose PowerPivot > PivotChart > New Worksheet.
a. You should see a blank chart along with the PowerPivot field list.
b. In the PowerPivot field list, under Query, I selected everything except Year and Month, since I had four days of data from a single month and didn't want Year/Month in the slicers; you may choose them depending on your requirements. Next, drag and drop the fields from Query into the field areas at the bottom. Once done, the PowerPivot field list should look something like this:
Under ∑ Values, you will notice it automatically takes the Sum of the CounterValue. We need the average value here. To get it, right-click on Sum of CounterValue > Edit > Edit Measure, specify Average of CounterValue under Custom Name, and select Average.
c. Now, on the chart window, you should see the horizontal/vertical slicers along with the report filters. Example:
d. Adjust the vertical and horizontal slicers and filter fields like Day, Hour, InstanceName, CounterName, etc., to prepare different charts using the steps mentioned above (step 9).
10. Now, whenever you refresh the data in your Perfmon database, just do a Refresh All under the Data menu in this Excel workbook and all the data/charts will refresh to show the updated data. This Excel workbook may serve as a template. If you want the charts to display data from a different database containing Perfmon data, you can use this trick:
a. Create a copy of the Excel workbook.
b. Launch the PowerPivot window from the PowerPivot menu.
c. In the PowerPivot window, go to the Design tab and modify the existing connection or the query under Table Properties as required.
Here are some sample charts you may draw using the above method, just to give you a glimpse of the power of PowerPivot.
I think Power View may further extend the reporting functionality here, though I am yet to test it.
Hope you find this useful. If you have ideas or suggestions, please feel free to post a comment!