Prashanth Jayaram

How to continuously read Transaction log file data directly in a SQL Server database with fn_dblog and fn_dump_dblog

August 23, 2018 by Prashanth Jayaram


In this article, we’ll discuss how to read SQL Server transaction logs. It should provide insight into how the workload is captured in the transaction log files. Retrieving the transaction metadata and history gives us a clear picture of how the system is working, helps identify peak usage hours, and makes it possible to log who is doing what.

In this article, you’ll learn how to:

  1. Read a SQL Server transaction log
  2. Prepare SQL to capture the transaction data
  3. Deploy, configure, and manage data using T-SQL
  4. Schedule a job using SQL Server Agent
  5. And more…

Importance of transaction log

The transaction log is a critical and vital component of the database. Every SQL Server database has a transaction log, and it records all transactions made against the database. Whether you are recovering from a system failure or accidental deletes, auditing activity, performing a point-in-time restore, or setting up high availability, you will need the transaction log to bring your database online.

Read SQL Server transaction log

In this article, we’ll discuss how to read the transaction log to measure the number of insert, update, and delete operations, and how easy it is to find the objects affected by those transactions. Surprisingly, it’s not that complex to measure the entire process, but you do need a database in the FULL recovery model, or SIMPLE if you get in there fast enough before the log clears. So what do you need to do? First things first: we need to look at the log.

In general, the transaction log cannot be read directly because the data is stored in a binary, non-human-readable format. However, the logged transaction details can be read using the undocumented functions fn_dblog() and fn_dump_dblog(). These give us the option to read each logged record in the transaction log files, and they can also be used in conjunction with preparing recovery solutions. Let us get started with the fn_dblog function; make sure to select only the columns you’re interested in. fn_dblog returns an organized transaction data set for viewing the details of the transaction log. All transactions appear in the same order in which they were created, and each transaction is associated with a specific LSN (Log Sequence Number).
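As a minimal starting point, passing NULL for both LSN arguments returns every record in the active portion of the current database’s log. The column names below come from fn_dblog’s undocumented output and may vary between builds:

```sql
-- Run in the context of the database whose log you want to read.
-- NULL, NULL = no LSN filter: return the whole active log.
SELECT [Current LSN],
       Operation,
       Context,
       [Transaction ID],
       [Transaction Name],
       [Begin Time],
       AllocUnitName
FROM   fn_dblog(NULL, NULL);
```

Because the function is undocumented, its output is not guaranteed to stay stable across SQL Server versions; select only the columns you actually need.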


fn_dblog(@startLSN, @endLSN)

@startLSN: the beginning LSN of the transaction range to read

@endLSN: the ending LSN of the transaction range to read

Let us use the following query to analyze the transaction log for a specific transaction.
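A query along these lines can do that; the transaction ID below is purely illustrative, and the columns are from fn_dblog’s undocumented output:

```sql
-- Filter the log down to a single transaction of interest.
-- '0000:000004d4' is a hypothetical [Transaction ID] value.
SELECT [Current LSN],
       Operation,
       [Transaction ID],
       [Begin Time],
       [End Time],
       AllocUnitName,
       SUSER_SNAME([Transaction SID]) AS LoginName
FROM   fn_dblog(NULL, NULL)
WHERE  [Transaction ID] = '0000:000004d4';
```

Note that [Transaction SID] is typically populated only on the LOP_BEGIN_XACT record, so LoginName may be NULL on the other rows.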

Using the fn_dblog() function, you can find the operation type, the begin and end timestamps, and the associated object for a specific transaction. Let’s deep-dive into only those operation types that we are interested in capturing.

The most commonly used operation types are:

  1. LOP_BEGIN_XACT marks the beginning of a transaction
  2. LOP_INSERT_ROWS gives the insert transaction detail
  3. LOP_MODIFY_ROW gives the update transaction detail
  4. LOP_DELETE_ROWS gives the delete transaction detail
  5. LOP_ABORT_XACT gives the detail about a rollback operation
  6. LOP_COMMIT_XACT gives the details about a committed transaction
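To restrict the output to just these operation types, a filter such as the following can be used (a sketch, not the article’s original listing):

```sql
-- Count log records per operation type of interest.
SELECT Operation,
       COUNT(*) AS LogRecordCount
FROM   fn_dblog(NULL, NULL)
WHERE  Operation IN ('LOP_BEGIN_XACT', 'LOP_INSERT_ROWS',
                     'LOP_MODIFY_ROW', 'LOP_DELETE_ROWS',
                     'LOP_ABORT_XACT', 'LOP_COMMIT_XACT')
GROUP BY Operation;
```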

Prepare SQL

In this section, we are going to create a dummy table, run through a few DML operations (insert, delete, and update), and then query the log to see those statements. Let’s dissect the flow into sample SQL for better understanding.

  1. Query fn_dblog for the operation types LOP_INSERT_ROWS, LOP_MODIFY_ROW, and LOP_DELETE_ROWS to see what has happened with all the objects

  2. Join fn_dblog with the system objects to retrieve transaction information pertaining only to user-defined objects
  3. Get how many rows were inserted, updated, or deleted, alongside the end time at which the respective transaction committed
  4. Fetch only those transactions that happened in the last 5 minutes. In this case, the interval is defined as 5 minutes; you can change it as required

  5. Pull the [Transaction SID] of each ‘LOP_BEGIN_XACT’ operation and pass it into the SUSER_SNAME() function to get the account details
  6. Project all these values to get a count of each committed transaction at the object level, broken down by operation type, using the PIVOT operator

Here is the complete T-SQL that captures how many rows were inserted, updated, or deleted, alongside the time at which the respective transaction committed.
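Since the original listing is not reproduced here, the following is a hedged sketch of how those steps can be combined. It joins fn_dblog to sys.partitions on the undocumented PartitionId column to resolve object names, keeps only transactions committed in the last 5 minutes, and pivots the counts per operation type:

```sql
-- Sketch only: fn_dblog columns are undocumented and may vary.
;WITH LogRecords AS
(
    SELECT OBJECT_NAME(p.object_id) AS TableName,
           f.Operation,
           f.[Transaction ID]
    FROM   fn_dblog(NULL, NULL) AS f
    JOIN   sys.partitions       AS p
           ON p.partition_id = f.PartitionId
    WHERE  f.Operation IN ('LOP_INSERT_ROWS', 'LOP_MODIFY_ROW', 'LOP_DELETE_ROWS')
      AND  OBJECTPROPERTY(p.object_id, 'IsUserTable') = 1
),
Committed AS
(
    -- Commit records carry the [End Time] of the transaction.
    SELECT [Transaction ID], [End Time]
    FROM   fn_dblog(NULL, NULL)
    WHERE  Operation = 'LOP_COMMIT_XACT'
      AND  [End Time] >= DATEADD(MINUTE, -5, GETDATE())
)
SELECT TableName,
       ISNULL([LOP_INSERT_ROWS], 0) AS Inserts,
       ISNULL([LOP_MODIFY_ROW],  0) AS Updates,
       ISNULL([LOP_DELETE_ROWS], 0) AS Deletes
FROM
(
    SELECT l.TableName, l.Operation
    FROM   LogRecords AS l
    JOIN   Committed  AS c
           ON c.[Transaction ID] = l.[Transaction ID]
) AS src
PIVOT
(
    COUNT(Operation)
    FOR Operation IN ([LOP_INSERT_ROWS], [LOP_MODIFY_ROW], [LOP_DELETE_ROWS])
) AS pvt;
```

SUSER_SNAME([Transaction SID]) from the LOP_BEGIN_XACT record can be joined in the same way if the login name is also required.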

Analyzing the output, we can see the type of transaction carried out on each user object of the database; by pivoting, we can measure how many rows were inserted, updated, or deleted, alongside the time at which the respective transaction committed.

Schedule a job

Let us simulate continuous auditing by reading the transaction log file with T-SQL, and schedule a job to run at close to 5-minute intervals using SQL Server Agent.

In this section, we’ll outline the steps required to capture the transaction type information and schedule a job to run the stored procedure every 5 minutes. It can also be done without creating a stored procedure.

Let us walk through the entire process using AdventureWorks2016 database.

  1. Create a stored procedure Audit_TransactionDetail. See Appendix for more information
  2. Create a table to log the transaction details

  3. Prepare insert SQL statement

  4. Create a job using SQL Server Agent
    1. Browse job folder
    2. Right-click and choose New Job
    3. Type in the job name. In this case, it’s SQL_LOG_Operation

    4. Go to Steps
    5. In the job step
      1. Click New
      2. Type in the Step name
      3. Select the Database AdventureWorks2016
      4. Enter the T-SQL procedure call

    6. In the Schedules
      1. Click New
      2. Type in the schedule Name
    7. In the Frequency section, choose Daily
    8. In the Daily frequency section, choose to run every 5 minutes

    9. Test and verify the process
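The logging table and the job-step command might look like the following; the table definition and column names are illustrative assumptions, not the article’s originals:

```sql
-- Hypothetical logging table for the captured counts.
CREATE TABLE dbo.SQL_Log_Operation
(
    CapturedAt DATETIME NOT NULL DEFAULT (GETDATE()),
    TableName  SYSNAME  NOT NULL,
    Inserts    INT      NOT NULL DEFAULT (0),
    Updates    INT      NOT NULL DEFAULT (0),
    Deletes    INT      NOT NULL DEFAULT (0)
);
GO

-- T-SQL command entered in the job step (scheduled every 5 minutes):
EXEC AdventureWorks2016.dbo.Audit_TransactionDetail;
```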

I have performed various DML operations to see how they are tracked in the log file, using the following SQL.
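The DML used for such a test might look like this; the table name and values are illustrative, not the article’s originals:

```sql
USE AdventureWorks2016;
GO

-- Hypothetical dummy table plus sample DML to generate log records.
CREATE TABLE dbo.AuditDemo
(
    ID      INT IDENTITY(1, 1) PRIMARY KEY,
    Payload VARCHAR(50) NOT NULL
);
GO

INSERT INTO dbo.AuditDemo (Payload) VALUES ('first row'), ('second row');
UPDATE dbo.AuditDemo SET Payload = 'changed' WHERE ID = 1;
DELETE FROM dbo.AuditDemo WHERE ID = 2;
```

Each statement above should surface in the log as LOP_INSERT_ROWS, LOP_MODIFY_ROW, and LOP_DELETE_ROWS records respectively, wrapped in LOP_BEGIN_XACT/LOP_COMMIT_XACT pairs.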

Wrap up

Thus far, we’ve discussed the basics of the transaction log, the importance of the log file, and how to read the transaction log to measure the transaction rate of each object. The logging table can be queried further for a more granular report. We also discussed how easy it is to find the objects affected by these transactions using the fn_dblog() function. The same technique is very useful for measuring activities such as page splits or objects being dropped.

That’s all for now…


Stored procedure
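The procedure body itself is not reproduced above, so the following is an assumption-laden reconstruction: it aggregates fn_dblog records per user table and writes the counts to a logging table (dbo.SQL_Log_Operation is a hypothetical name):

```sql
-- Sketch of Audit_TransactionDetail, not the article's original body.
CREATE OR ALTER PROCEDURE dbo.Audit_TransactionDetail
AS
BEGIN
    SET NOCOUNT ON;

    -- Aggregate insert/update/delete log records per user table and
    -- persist the counts; PartitionId is an undocumented fn_dblog column.
    INSERT INTO dbo.SQL_Log_Operation (TableName, Inserts, Updates, Deletes)
    SELECT OBJECT_NAME(p.object_id),
           SUM(CASE WHEN f.Operation = 'LOP_INSERT_ROWS' THEN 1 ELSE 0 END),
           SUM(CASE WHEN f.Operation = 'LOP_MODIFY_ROW'  THEN 1 ELSE 0 END),
           SUM(CASE WHEN f.Operation = 'LOP_DELETE_ROWS' THEN 1 ELSE 0 END)
    FROM   fn_dblog(NULL, NULL) AS f
    JOIN   sys.partitions       AS p
           ON p.partition_id = f.PartitionId
    WHERE  OBJECTPROPERTY(p.object_id, 'IsUserTable') = 1
    GROUP BY p.object_id;
END;
```

CREATE OR ALTER requires SQL Server 2016 SP1 or later; on older builds, use separate CREATE/ALTER statements.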


About Prashanth Jayaram

I’m a database technologist with 11+ years of rich, hands-on experience in database technologies. I am a Microsoft Certified Professional with a Master of Computer Applications degree. My specialty lies in designing and implementing high-availability solutions and cross-platform DB migration. The technologies I currently work on are SQL Server, PowerShell, Oracle, and MongoDB. View all posts by Prashanth Jayaram