Jefferson Elias

How to perform a performance test against a SQL Server instance

September 14, 2018 by Jefferson Elias

Introduction

Intended audience

This document is intended for application developers and database administrators who plan to evaluate performance for any computer hosting a SQL Server instance.

Context

Let’s say a new physical server has been delivered and we have to test its performance with regard to its future role as a database server. For newcomers, a lot of questions emerge:

  • What has to be tested?
  • What solutions can be used?
  • What information do they provide?
  • How to use them?
  • Is there a tool that does all the tests I need to get a good insight into server performance or is it better to use two or more tools in conjunction?

Most of these questions should lead us to the following steps:

  1. Definition of the scope of the test
  2. Analysis and selection of the tools available to achieve test purposes
  3. If not already done, creation of a performance report template
  4. Usage of this report template to build actual performance tests report for the server
  5. Actual performance test with results documentation in report
  6. Analysis of performance test results (compared to a baseline)
  7. Feedback on report template for continuous improvement

Example of performance testing tools

There are a number of tools that we can use to perform a performance test:

  • General purpose testing:
    1. PassMark PerformanceTest
    2. CPU-z
    3. CrystalDiskMark
    4. Phoronix Test Suite
  • Database-related testing:
    1. SQLIO (sqlio.exe)
    2. HammerDB
    3. OStress

In conjunction with these tools, it’s worth using a database monitoring tool so that we can collect performance statistics and keep a history of them. Here are some of the tools that can be used:

  • sp_BlitzFirst®
  • Windows Performance Monitor (perfmon)
  • ApexSQL Monitor
  • Idera Diagnostic Manager

Here are the choices made for the purpose of this article:

  1. PassMark PerformanceTest to get a first overview of overall server performance
  2. Benchmarking with HammerDB and collecting performance statistics with sp_BlitzFirst®.

Getting server performance overview with PassMark PerformanceTest

The latest version of this tool is version 9; it can be downloaded for testing (and purchased) on its dedicated web page.

The installation is pretty straightforward, so it won’t be reviewed here.

Once PerformanceTest is installed, we can launch it. We will get the following window.

As we can see, there are multiple options available. We will just click on “Run Benchmark” to run all the tests. We are prompted with the following message, to which we can answer “Yes”.

And the benchmark begins…

Once the tests are done, you will get a score that you can compare to others or between computers in order to evaluate performance.

Note

In this example, the 3D Graphics Mark is grayed out because I ran the test on a virtual machine without 3D acceleration.

If we look at the detailed test data, we can see that there is a test dedicated to database operations:

Performance testing with HammerDB

HammerDB Installation Process

Go to https://www.hammerdb.com/download.html to download the latest version of HammerDB.

The installation process is straightforward and won’t be discussed.

SQL Server Performance statistics collection

Collection stored procedure

While HammerDB is running, we will collect performance statistics using the sp_BlitzFirst® stored procedure from the open source SQL Server First Responder Kit.

So, the first step is to download the latest version of this stored procedure and install it on the target SQL Server instance that we will be testing.

Preferably, this procedure should be installed in a database reserved for DBA usage; let’s call it [DBAMGMT].

We can check that the stored procedure actually works with the following statement:
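The original statement is not reproduced here; below is a minimal sketch of such a check, assuming sp_BlitzFirst was installed in [DBAMGMT] and that the output table names (BlitzFirst_Results and friends) are our own choice rather than anything mandated by the kit:

```sql
-- Smoke test: run sp_BlitzFirst once over a 10-second sample and
-- persist the results to collection tables in [DBAMGMT].
-- The output table names below are a suggested convention.
EXEC [DBAMGMT].dbo.sp_BlitzFirst
    @Seconds                     = 10,
    @OutputDatabaseName          = N'DBAMGMT',
    @OutputSchemaName            = N'dbo',
    @OutputTableName             = N'BlitzFirst_Results',
    @OutputTableNameFileStats    = N'BlitzFirst_FileStats',
    @OutputTableNamePerfmonStats = N'BlitzFirst_PerfmonStats',
    @OutputTableNameWaitStats    = N'BlitzFirst_WaitStats';
```

If the procedure is installed correctly, this creates the output tables on first run and appends one snapshot of results to each of them.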

Automating collection with a SQL Agent Job

This statement will be used to define a SQL Server Agent Job that will run every 10 seconds during the whole performance test.

Here is the code to create such a job:
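The original script is not reproduced here; the following is a minimal sketch built on the msdb job stored procedures. The job name, the [DBAMGMT] database and the output table names are assumptions to adjust to your environment; as noted below, the job is created disabled and scheduled to run every 10 seconds:

```sql
USE [msdb];
GO

-- Drop the job if it already exists
IF EXISTS (SELECT 1 FROM msdb.dbo.sysjobs
           WHERE [name] = N'DBA - Collect BlitzFirst Stats')
    EXEC msdb.dbo.sp_delete_job @job_name = N'DBA - Collect BlitzFirst Stats';

-- Create the job disabled; the tester enables it for the duration of a test
EXEC msdb.dbo.sp_add_job
    @job_name = N'DBA - Collect BlitzFirst Stats',
    @enabled  = 0;

-- Single step: take one sp_BlitzFirst snapshot into the collection tables
EXEC msdb.dbo.sp_add_jobstep
    @job_name      = N'DBA - Collect BlitzFirst Stats',
    @step_name     = N'Run sp_BlitzFirst',
    @subsystem     = N'TSQL',
    @database_name = N'DBAMGMT',
    @command       = N'EXEC dbo.sp_BlitzFirst
        @OutputDatabaseName          = N''DBAMGMT'',
        @OutputSchemaName            = N''dbo'',
        @OutputTableName             = N''BlitzFirst_Results'',
        @OutputTableNamePerfmonStats = N''BlitzFirst_PerfmonStats'',
        @OutputTableNameWaitStats    = N''BlitzFirst_WaitStats'';';

-- Schedule: every 10 seconds, all day, every day
EXEC msdb.dbo.sp_add_jobschedule
    @job_name             = N'DBA - Collect BlitzFirst Stats',
    @name                 = N'Every 10 seconds',
    @freq_type            = 4,   -- daily
    @freq_interval        = 1,
    @freq_subday_type     = 2,   -- sub-day unit: seconds
    @freq_subday_interval = 10;

-- Attach the job to the local server
EXEC msdb.dbo.sp_add_jobserver
    @job_name = N'DBA - Collect BlitzFirst Stats';
```

Ten seconds is the minimum sub-day interval SQL Server Agent accepts, which matches the collection frequency chosen for this test.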

Note

The previous script will drop an existing SQL Agent Job with the same name. It will also create the job disabled; it is up to the tester to enable and disable the job for their tests.

The database names need to be set according to your environment.

Some useful queries

You will find below some useful queries against the collection tables that will allow the generation of report charts. The database names and date ranges have to be adjusted before execution.

An Excel worksheet model is available to generate charts for some statistics collected during the benchmark test. Its name is “MDL-0020 – sp_BlitzFirst graphs”.

Page Life Expectancy
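The original query is not reproduced here; a minimal sketch follows, assuming the Perfmon collection table was named [DBAMGMT].dbo.BlitzFirst_PerfmonStats when setting up sp_BlitzFirst output (the column names match the kit’s Perfmon output table; the dates are placeholders for the test window):

```sql
-- Page Life Expectancy over the duration of a test, one point per
-- collection snapshot, ready to paste into a chart.
SELECT   CheckDate,
         cntr_value AS PageLifeExpectancy
FROM     [DBAMGMT].dbo.BlitzFirst_PerfmonStats
WHERE    counter_name = N'Page life expectancy'
  AND    CheckDate BETWEEN '2018-09-14 10:00' AND '2018-09-14 12:00'  -- test window
ORDER BY CheckDate;
```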

Batch requests / sec
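Again a sketch under the same table-name assumption. “Batch Requests/sec” is a cumulative counter, so a per-interval rate has to be derived from the difference between consecutive snapshots (LAG requires SQL Server 2012 or later):

```sql
-- Batch requests per second, computed as the delta between
-- consecutive snapshots divided by the elapsed seconds.
WITH snaps AS (
    SELECT  CheckDate,
            cntr_value,
            LAG(cntr_value) OVER (ORDER BY CheckDate) AS prev_value,
            LAG(CheckDate)  OVER (ORDER BY CheckDate) AS prev_date
    FROM    [DBAMGMT].dbo.BlitzFirst_PerfmonStats
    WHERE   counter_name = N'Batch Requests/sec'
)
SELECT   CheckDate,
         (cntr_value - prev_value) * 1.0
             / NULLIF(DATEDIFF(SECOND, prev_date, CheckDate), 0) AS BatchRequestsPerSec
FROM     snaps
WHERE    prev_value IS NOT NULL
ORDER BY CheckDate;
```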

Server wait total
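A sketch of the total-wait query, assuming the wait-stats collection table was named [DBAMGMT].dbo.BlitzFirst_WaitStats (dates are placeholders for the test window):

```sql
-- Total wait time across all wait types for each snapshot
-- (values are cumulative since instance start), useful for an
-- overall wait trend chart.
SELECT   CheckDate,
         SUM(wait_time_ms) AS total_wait_time_ms
FROM     [DBAMGMT].dbo.BlitzFirst_WaitStats
WHERE    CheckDate BETWEEN '2018-09-14 10:00' AND '2018-09-14 12:00'  -- test window
GROUP BY CheckDate
ORDER BY CheckDate;
</imports>
```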

Server wait by category
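Under the same assumption, the per-wait-type breakdown can be sketched as follows. Since wait times are cumulative, the time spent during the test for each wait type is the difference between its last and first sampled values:

```sql
-- Wait time accumulated during the test window, broken down by
-- wait type (max - min of the cumulative counter per type).
SELECT   wait_type,
         MAX(wait_time_ms) - MIN(wait_time_ms) AS wait_time_ms_during_test
FROM     [DBAMGMT].dbo.BlitzFirst_WaitStats
WHERE    CheckDate BETWEEN '2018-09-14 10:00' AND '2018-09-14 12:00'  -- test window
GROUP BY wait_type
HAVING   MAX(wait_time_ms) - MIN(wait_time_ms) > 0
ORDER BY wait_time_ms_during_test DESC;
```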

Test preparation

The HammerDB tool will create a database containing the schema and database objects used for benchmarking. This schema represents the activity of a firm with one or more warehouses.

If it’s the first time we are running this tool against SQL Server, we should create some test databases with different numbers of warehouses.

Steps to create a test database

Let’s first review how to create a test database to use as a target for HammerDB.

To do so, we will first create an empty database (with HammerDB) called [TPCC_x_WH], where x represents the number of warehouses we will create. The approximate (but unconfirmed) formula for database size is 100 MB per warehouse.

Let’s start HammerDB.

We will double-click on SQL Server, select the TPC-C benchmarking option (for an OLTP simulation), then click on the “OK” button.

We will get a confirmation message:

And the display is now set for SQL Server. We can expand the “SQL Server > TPC-C” tree node, go down to “Schema Build” and double-click on “Options”.

We will be prompted with a dialog in which we can set the actual target computer and databases, as well as the credentials to be used for testing.

Here is an example of one warehouse:

Once you hit “OK”, double-click on the “Build” leaf. You will be asked for confirmation.

Once we click on the “Yes” button, we will have to wait for HammerDB to complete the creation of its testing schema:

Once the creation is complete, you will see the following message (green box).

Hit the red stop button to stop the thread used to create the database.

Now the database is ready. As we won’t stop at this point of performance testing, it’s worth taking a backup of this database.

Databases to create for benchmark

Four databases should be created and backed up on the lowest version of SQL Server, with the following numbers of warehouses.

| Number of warehouses | Experienced size on disk | Experienced backup size (compressed) |
|----------------------|--------------------------|--------------------------------------|
| 1                    | 150 MB                   | 53 MB                                |
| 5                    | 416 MB                   | 238 MB                               |
| 100                  | 8,100 MB                 | 4.5 GB                               |
| 500                  | 39.75 GB                 | 22.6 GB                              |

Note

It’s strongly recommended to take backups of these databases as the creation process is really time-consuming.

Test Definition

Here are the steps to follow to run a test against a given [TPCC_x_WH] database. We will assume you have left HammerDB open.

Test case script generation

Expand “Driver Script” node and double-click on the “Options” leaf.

Actions to be done:

  1. Edit server, database and credential settings
  2. Select “Timed Test Driver Script”
  3. Keep the default ramp-up settings.
  4. Adjust the test duration to 10 minutes
  5. Keep “Use All warehouses” box unchecked
  6. Check “Time Profile” (Have a look at this documentation page for further details)
  7. Click on the “OK” button

The settings dialog should look as follows after step 6:

Note

We could also check the “Checkpoint when complete” option if we use the AutoPilot mode, which we will discuss in the following subsection.

Now that the options are set, we can generate the script that will be used by HammerDB. This is done by clicking on the “Load” leaf of the tree hierarchy in the left panel:

Virtual users creation

Now it’s time to tell HammerDB how many users will be using the database defined in the “Driver Script” options.

Virtual users are threads that HammerDB creates during the test. Their number can be configured in the “Virtual Users” node.

Most of the time, we will run more than one test, where only the number of virtual users varies. There are two ways to do this. The first is to run one test per virtual user count. The other is to use the “Autopilot” mode of HammerDB.

To enable the Autopilot mode, we will expand the “Autopilot” node and double-click on “Options”. This opens up a dialog in which we will check the “Autopilot enabled” option:

As we have a ramp-up of 2 minutes and a test duration of 10 minutes, we will set the minutes per test value to 15 to make sure each test has time to complete.

We will also replace the default virtual user sequence with the following:

3 5 9 17 25 37 49 65 97 129 197 257

Note

Feel free to create your own virtual user sequence based on the requirements of the application that will be using SQL Server.

For instance, an alternative could be:

3 11 101 201 501 1001 1501 2001

With the first sequence, the dialog will look like this:

We can click on the “OK” button to close the dialog.

Before digging into the execution of AutoPilot mode, there is a step not to forget: the activation of the performance statistics collection job.

This can be done with the following T-SQL statement:
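The original statement is not reproduced here; a minimal sketch follows, assuming the collection job was named ‘DBA - Collect BlitzFirst Stats’ when it was created (substitute whatever name you chose):

```sql
-- Enable the performance statistics collection job before the run
EXEC msdb.dbo.sp_update_job
    @job_name = N'DBA - Collect BlitzFirst Stats',
    @enabled  = 1;

-- ...and disable it again once the benchmark completes:
-- EXEC msdb.dbo.sp_update_job
--     @job_name = N'DBA - Collect BlitzFirst Stats',
--     @enabled  = 0;
```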

Test execution

Now, everything is ready and we can double-click on the Autopilot button:

It will open an “Autopilot” tab in the central pane:

As soon as a test is finished, there will be a “TEST RESULT” line showing the observed number of transactions per minute and the observed number of new orders per minute.

We could keep track of these two numbers associated with the number of configured virtual users in order to create charts like this one:

You could also create charts with SQL Server statistics like “Page Life Expectancy” or “Average Wait Time (ms)”. As we can find the time window during which a test occurred, it’s pretty easy to restrict the data sets to just this time window.

See more

Check out ApexSQL Plan to view and analyze SQL Server query execution plans for free, including plan comparison, stored procedure performance profiling, missing index details, lazy profiling, wait times, and plan execution history.





Jefferson Elias


Living in Belgium, I obtained a master’s degree in Computer Sciences in 2011 at the University of Liege.

I'm one of the rare guys out there who started to work as a DBA immediately after graduation. I have worked at the university hospital of Liege since 2011. Initially involved in Oracle Database administration (which is still under my charge), I had the opportunity to learn and manage SQL Server instances in 2013. Since then, I've learned a lot about SQL Server administration and development.

I like the job of DBA because you need general knowledge in every field of IT. That's the reason why I won't stop learning (and sharing what I learn).
