Simple SQL Server automated user auditing

January 25, 2016 by Kimberly Killian

As a DBA, I am often asked to provide lists of all active user IDs or groups for a specific server or database. In addition to this, I am also asked to provide a list of failed logins. These requests are frequent enough that I decided to automate the process of gathering this data and deliver it in two nicely formatted HTML emails. I am not going to lie: manager types love these reports, and I am all about anything that makes managers love my work! I'm also all about automating anything that makes my job easier. Call me lazy or call me prepared, but I hate having to do something over and over that I could easily throw into an SSIS package or Agent Job and just schedule it to run for me. This entire process consists of a SQL Server Integration Services (SSIS) package, 5 tables (3 on the main server plus 2 staging tables) and a SQL Agent Job producing 2 reports (Failed Logins and Active SQL Server Users). The SSIS package, along with all of the queries and scripts, is attached at the end of the article.

I have created a database that I use for my DBA activities, called DBA_Reports, which is located on what I will call my main server. I also have a database called DBA_Monitoring on every server that I monitor, which I use for staging and gathering data. This one is optional; just be sure to make the changes in the scripts if you choose to use only a single database for collecting data. Other than these two databases, you will also need the following on your main server:

  • Database Mail (DBMail) enabled on your server – this is absolutely key to this process. Also make note of your mail profile name, as you will need it for the reports.
  • Depending on how you are set up and what servers you are monitoring, you may need to create a linked server for each server, or use a ProxyID.
  • Create yourself as an operator so you can receive notifications if your job fails.

Personally, I like to add a notification to every job I create in case it fails. This is optional.
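As a rough sketch of that prerequisite, the operator and job-failure notification can be wired up like this (the operator name and e-mail address here are placeholders, not from the article, and the attached scripts remain authoritative):

```sql
-- Hypothetical example: create an operator, then have an existing job
-- e-mail that operator when the job fails.
USE msdb;
GO
EXEC dbo.sp_add_operator
    @name = N'DBA_Operator',                  -- placeholder operator name
    @email_address = N'dba@yourcompany.com';  -- placeholder address
GO
EXEC dbo.sp_update_job
    @job_name = N'ActiveUsersAndInvalidLogins',
    @notify_level_email = 2,                  -- 2 = notify on failure
    @notify_email_operator_name = N'DBA_Operator';
GO
```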

I’m not going to go into details on how to set these up, as they are readily available from other sources; just consider them prerequisites.

Step 1: Create the main server tables in DBA_Reports

  • Tables
    • MonitoredServerList – this will house the servers to be monitored and should be prepopulated by you with the ServerName of each server to be monitored.
    • AuthorizedUsers – this will house the authorized user information collected from each server monitored
    • FailedLogins – this will house the failed login information collected from each server monitored

On the main server you will need to create all 3 tables. Copy/paste/Run the scripts below to create the tables:
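Since the attached scripts are the authoritative versions, here is only a hypothetical sketch of what minimal versions of the three main-server tables could look like; the column names beyond ServerName are illustrative assumptions:

```sql
-- Illustrative minimal schemas for the DBA_Reports tables (assumed columns).
USE DBA_Reports;
GO
CREATE TABLE dbo.MonitoredServerList (
    ServerName sysname NOT NULL PRIMARY KEY  -- prepopulate with each server to monitor
);
CREATE TABLE dbo.AuthorizedUsers (
    ServerName  sysname      NOT NULL,
    LoginName   sysname      NOT NULL,
    LoginType   nvarchar(60) NULL,
    CreateDate  datetime     NULL,
    CollectedOn datetime     NOT NULL DEFAULT (GETDATE())
);
CREATE TABLE dbo.FailedLogins (
    ServerName  sysname        NOT NULL,
    LogDate     datetime       NULL,
    LogText     nvarchar(4000) NULL,
    CollectedOn datetime       NOT NULL DEFAULT (GETDATE())
);
GO
```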

Step 2: Create the staging tables in DBA_Monitoring

These are similar to the scripts in Step 1; however, there is a slight variation on the FailedLogins table, so please use the scripts below. These tables are used to house the data collected from each specific server monitored. I consider this to be staging data. I like to keep it separate from the main database in case I need to go back into it later, or in case something fails along the way and I want to see it in its raw form on its source server. This is just my preference, so it is optional. You will see later where these are used in the SSIS package. You could always swap these staging tables out for temp tables, of course; again, optional.


  • Tables
    • FailedLogins – this will house the failed login information collected from each server monitored
    • AuthorizedUsers – this will house the authorized user information collected from each server monitored

On each monitored server in the DBA_Monitoring database you will need to create these 2 tables. Copy/paste/Run the scripts below to create the tables:
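Again deferring to the attached scripts, a hypothetical sketch of the two staging tables might look like the following; the FailedLogins variation shown here assumes the raw error-log shape (LogDate, ProcessInfo, Text), which is an assumption on my part:

```sql
-- Illustrative staging schemas; run in DBA_Monitoring on each monitored server.
USE DBA_Monitoring;
GO
CREATE TABLE dbo.FailedLogins (
    LogDate     datetime       NULL,
    ProcessInfo nvarchar(50)   NULL,
    LogText     nvarchar(4000) NULL
);
CREATE TABLE dbo.AuthorizedUsers (
    LoginName  sysname      NOT NULL,
    LoginType  nvarchar(60) NULL,
    CreateDate datetime     NULL
);
GO
```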

Step 3: Create a simple SSIS package.

This is a really simple SSIS package for someone who has experience with SSIS. It consists of 5 SQL queries, 2 connection managers and 2 variables used for connections. The package name is ActiveUsersAndInvalidLogins. The SSIS package is included at the end of this article, along with all of the queries above and below.


Connection Managers:

These are used for connecting to the remote servers.

  1. MainServer – this is the connection where I keep DBA_Reports

  2. MonitoredServers – this is a connection to my remote servers that I am monitoring. In my case I use a proxyID called MonitoredServers on each server that has elevated permissions so it can crawl into each server and pull data. I also have a linked server to each monitored server on my main server. You don’t have to do this if you have an ID that is consistent across all servers. In this example, just create the connection manager and leave the server name blank and the database name blank.

Inside the package

The package consists of 3 nodes. The first node removes old data from the tables on the main server. This step is optional if you want to maintain a history, but then you will need to change the queries in the reports, which are discussed later. I want fresh data daily, so I remove the old data. This task is contained within a sequence container.
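A hypothetical version of that cleanup task could be as simple as this (assuming the two collection tables in DBA_Reports hold only the latest run):

```sql
-- Illustrative cleanup step on the main server: clear yesterday's data
-- so each run starts fresh. Skip this if you want to keep a history.
USE DBA_Reports;
GO
DELETE FROM dbo.AuthorizedUsers;
DELETE FROM dbo.FailedLogins;
GO
```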

The second node populates the variable used inside the foreach loop in the following step with our server names, so that we can pull data from the DBA_Monitoring database on each monitored server.
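The query behind that node is presumably a straight read of the server list (a sketch under that assumption):

```sql
-- Illustrative query for the second node: return the server names that the
-- Foreach Loop will iterate over, mapped row-by-row into User::SRV_Conn.
SELECT ServerName
FROM DBA_Reports.dbo.MonitoredServerList;
```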

Next we have a foreach loop container with 3 queries that are run on each server we are monitoring, saving the data back into our main server. This container has a variable mapping that connects to our User::SRV_Conn variable.

The first query, shown below, is used to clean the staging tables.

Our next query gets the authorized SQL user information from the remote servers and writes the data into my main server.
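The attached package contains the real query; a hedged sketch of the collection step, assuming it reads enabled logins from sys.server_principals into the staging table, might be:

```sql
-- Illustrative collection query, run on each monitored server: capture
-- enabled SQL logins, Windows logins and Windows groups into staging.
INSERT INTO DBA_Monitoring.dbo.AuthorizedUsers (LoginName, LoginType, CreateDate)
SELECT name, type_desc, create_date
FROM sys.server_principals
WHERE type IN ('S', 'U', 'G')   -- SQL login, Windows login, Windows group
  AND is_disabled = 0
  AND name NOT LIKE '##%';      -- skip internal certificate-based logins
```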

The final step retrieves the failed logins from the remote servers and inserts the data into the main server.
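Assuming failed logins are pulled from the SQL Server error log (the usual source for this data; the attached query is authoritative), that step could look roughly like:

```sql
-- Illustrative collection query: pull "Login failed" entries from the
-- current SQL Server error log into the staging table.
INSERT INTO DBA_Monitoring.dbo.FailedLogins (LogDate, ProcessInfo, LogText)
EXEC master.dbo.xp_readerrorlog
    0,                -- 0 = current error log file
    1,                -- 1 = SQL Server log (2 would be the Agent log)
    N'Login failed';  -- first search string
```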

Here is a view of what the entire package looks like:

Step 4: Reports and Agent Job

I have created a SQL Agent job that contains two reports and runs on a schedule. In my case I want to see these daily; however, this is optional and you can make it run on any schedule you want. I like daily specifically for invalid logins, as those will definitely change daily. Although user access does not change daily in most cases, I personally like to run this audit daily as well. I have these reports in a single SQL Agent Job running at the same time, but they could easily be separated and run on different schedules.

In the SQL Agent Job, on the Steps branch, this job has 3 steps. Step 1 calls the SSIS package created in Step 3 of this article; then, after the data is collected, it runs the two reports and emails them to the users listed.

Here is a script for the entire Agent Job, which also includes the 2 report scripts; make sure to change the variables marked <add your info here>. Copy/paste and run this script to create the job and reports.
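To give a feel for how the report steps work, here is a hedged sketch of the core of one report: building an HTML table from the collected data and sending it with Database Mail. The mail profile and recipient are placeholders (the spots marked <add your info here> in the attached script), and the column layout is an assumption:

```sql
-- Illustrative report step: render FailedLogins as an HTML table and e-mail it.
DECLARE @Body nvarchar(max) =
    N'<h3>SQL Server - Failed Login Attempts</h3>'
  + N'<table border="1"><tr><th>Server</th><th>Date</th><th>Message</th></tr>'
  + CAST((SELECT td = ServerName, '', td = LogDate, '', td = LogText, ''
          FROM DBA_Reports.dbo.FailedLogins
          FOR XML PATH('tr'), TYPE) AS nvarchar(max))
  + N'</table>';

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'YourMailProfile',     -- placeholder: your mail profile
    @recipients   = N'dba@yourcompany.com', -- placeholder: report recipients
    @subject      = N'SQL Server - Failed Login Attempts',
    @body         = @Body,
    @body_format  = N'HTML';
```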

Here is an example of each report that is emailed:


Here you have a quick and simple way to collect both failed logins and active SQL Server users in a nice, neatly formatted HTML email.

Included in this package, the following will be created when the scripts are run:

  • Tables:
    • MonitoredServerList
    • AuthorizedUsers
    • FailedLogins
  • Reports:
    • SQL Server – Authorized User IDs
    • SQL Server – Failed Login Attempts
  • SSIS Package:
    • ActiveUsersAndInvalidLogins

About Kimberly Killian

With over 20 years of total IT experience, Kim has a very diverse background. She has worked as a programmer, IT architect, DB engineer, DBA, ETL developer and senior-level technical consultant. For the last 15 years she has focused her technology experience on the world of database solutions, specifically MS SQL Server. She has published numerous articles spotlighting SQL Server and is an expert in monitoring and reporting on SQL Server performance.