Business transformation requires solid tools to automate everything from complex integrations to seamless deployments. In today’s data-rich world, nothing is more important than data management, and knowing how to safeguard data and meet compliance requirements is critical and key to business success.
In this article, we’ll discuss one method of utilizing one of the most powerful and beneficial features in SQL Server: the transaction log. The transaction log is not meant to be read, let alone queried, and even when information is extracted, it can be difficult to interpret. So imagine if you could actually query the SQL Server transaction log like any other table and get human-readable information. Imagine the wealth of information about the transactional history and patterns of your database that you could glean.
In this article, we are going to show you how to do just that. We’ll create an example of how to gain valuable aggregate information about a database, including measuring the number of Insert, Update, and Delete operations, and we’ll also implement before-and-after auditing of changed values. We’ll discuss how easy it is to find the objects affected by these transactions and how to simplify the auditing process by extracting the transaction details and storing them in a repository. This data will give you insight into how the workload is being produced and captured in the database.
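As a preview of what this enables, once the transaction meta-data lands in a repository table, an aggregate overview is a simple GROUP BY away. This is only a sketch: the table name matches the repository schema described in this article, but the column names are assumptions and may differ from the schema the export actually creates.

```sql
-- Sketch: operation counts per object from the audit repository.
-- Column names (OBJECT_NAME, OPERATION_TYPE) are assumptions.
SELECT  OBJECT_NAME,
        OPERATION_TYPE,                 -- e.g. INSERT, UPDATE, DELETE
        COUNT(*) AS OperationCount
FROM    dbo.APEXSQL_LOG_OPERATION
WHERE   OPERATION_TYPE IN ('INSERT', 'UPDATE', 'DELETE')
GROUP BY OBJECT_NAME, OPERATION_TYPE
ORDER BY OperationCount DESC;
```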
In the previous article, How to continuously read Transaction log file data directly in a SQL Server database with fn_dblog and fn_dump_dblog, we discussed a similar setup and process for reading the transaction log and logging the details in a repository with fn_dblog. In this article, you’ll see how to automate the process of reading online transaction log data into SQL Server tables by executing a batch file that runs a 3rd party SQL Server transaction log reader.
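For comparison, the undocumented fn_dblog function from the previous article can be queried directly against the active portion of the current database’s log, and its raw output illustrates why a dedicated log reader that produces human-readable results is helpful:

```sql
-- fn_dblog reads the active portion of the current database's transaction log.
-- The two NULL arguments are the start and end LSN (NULL = no bound).
SELECT  [Current LSN],
        Operation,            -- e.g. LOP_INSERT_ROWS, LOP_MODIFY_ROW
        [Transaction ID],
        AllocUnitName         -- allocation unit (object) the record touches
FROM    fn_dblog(NULL, NULL)
WHERE   Operation IN ('LOP_INSERT_ROWS', 'LOP_MODIFY_ROW', 'LOP_DELETE_ROWS');
```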
For this article, we’ll use ApexSQL Log, a SQL Server transaction log reader from ApexSQL. We’ll simulate the process of reading a transaction log using the ApexSQL Log export option along with its continuous auditing feature to log the transaction meta-data. The job can be scheduled to run every 5 minutes (or every night, etc.) using any available job scheduler utility. You can refer to the previous articles if you intend to create a job using PowerShell or SQL Server Agent. In this example, we are going to discuss job scheduling using the Windows Task Scheduler and a batch file.
In this article you’ll learn how:
- to see an aggregate level overview of SQL transaction log data
- to prepare SQL to capture the transaction rate
- to gain hands-on experience in the creation of ApexSQL Log CLI commands
- to deploy, configure and manage audit data synchronization steps using ApexSQL Log Export option
- to schedule a job using the Windows Task Scheduler
- to ensure seamless data export from source to target with audit data
- And more ….
How it works…
In this section, we’ll outline the process of generating ApexSQL Log CLI commands in a Windows batch (BAT) file and then calling the batch file from a Windows Task Scheduler job that runs every 5 minutes.
Let us walk through the entire process using the AdventureWorks 2016 database.
- Backup the source database
- Restore the backup file on the target instance
- Build a batch file using ApexSQL Log
Continuous auditing preserves the LSN chain sequence. This mechanism provides session continuity so that transactions are read consistently and can be replayed to the target while maintaining transactional integrity. Continuous transaction log auditing is supported in every output option.
- In the Export step, select the Export to database option.
- Browse to the path and type in the target database instance details.
Save the batch file to a known file location.
The content of the batch file Log-AuditExport.bat is shown below.
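The generated file is roughly the following shape. Note that the executable path and every switch below are illustrative assumptions only, not verified ApexSQL Log CLI syntax; always run the batch file that the tool itself saves rather than typing one by hand.

```bat
@echo off
REM Log-AuditExport.bat -- saved from ApexSQL Log's batch-file option.
REM NOTE: switch names here are illustrative assumptions; the tool-generated
REM file contains the exact CLI command for your configured export.
"C:\Program Files\ApexSQL\ApexSQLLog\ApexSQLLog.com" /server:SOURCE-SERVER /database:AdventureWorks2016 /trusted /export ...
```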
That’s all… the configuration is complete. Click Cancel to close the window.
In this section, we detail the steps for scheduling a job using the Task Scheduler.
Open Task Scheduler and select Create Task…
Type in the name of the job and select “Run whether user is logged on or not”.
- On the Triggers tab, click New
Enter the Schedule details as per the requirement. In this case, we are gathering audit data every 5 minutes.
- Select the Actions tab and click New
- Select the path of the batch file in the Program/script setting
Next, the process will prompt you to enter credentials. Type in the required details and click OK. Make sure the account you enter has administrative privileges.
Configuration is complete. Now, run the job.
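As an alternative to the GUI steps above, the same scheduled task can be created from an elevated command prompt with the built-in schtasks utility. The task name, batch file path, and account below are illustrative; schtasks will prompt for the password because /RP is given without a value.

```bat
schtasks /Create /TN "ApexSQL Log Audit Export" ^
         /TR "C:\Jobs\Log-AuditExport.bat" ^
         /SC MINUTE /MO 5 ^
         /RU "DOMAIN\AdminUser" /RP ^
         /F
```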
Verifying the results
The ApexSQL Log export feature is used to export the meta-data of each transaction into SQL tables. On executing the ApexSQL Log CLI command, the process inserts the transactions read from the transaction log into the APEXSQL_LOG_OPERATION and APEXSQL_LOG_OPERATION_DETAIL tables.
The APEXSQL_LOG_OPERATION table contains columns such as the operation type, object, schema, user, etc.
The APEXSQL_LOG_OPERATION_DETAIL table contains the LSN, column name and type, and old and new values.
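A before/after audit report then becomes a join between the two repository tables. This is a sketch: the column names (OPERATION_ID, COLUMN_NAME, OLD_VALUE, NEW_VALUE, and so on) are assumptions based on the descriptions above, so check the actual schema the export creates.

```sql
-- Sketch: before/after values for updates, joining the repository tables.
-- All column names are assumptions; verify against the exported schema.
SELECT  o.[SCHEMA], o.[OBJECT], o.OPERATION_TYPE, o.[USER],
        d.COLUMN_NAME, d.OLD_VALUE, d.NEW_VALUE, d.LSN
FROM    dbo.APEXSQL_LOG_OPERATION o
JOIN    dbo.APEXSQL_LOG_OPERATION_DETAIL d
        ON d.OPERATION_ID = o.OPERATION_ID   -- assumed join key
WHERE   o.OPERATION_TYPE = 'UPDATE'
ORDER BY d.LSN;
```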
Thus far, we have seen how to read the transaction log file and log the transaction meta-data using ApexSQL Log. With ApexSQL Log, both the old and the new column values are logged in the table. This gives more flexibility for quick data-mining searches to see the state of the data, and in some cases it is very useful when a backup is invalid.
The other big advantage is logging the data on a local or remote SQL instance. In this way, the data can be secured and used later for reporting, and in many cases, for compliance auditing.
If you’re ready for a test, give it a try by downloading the free trial of ApexSQL Log, setting up a similar scenario for continuous auditing, and letting me know if you have any problems in the comments below.
My specialty lies in designing and implementing high availability solutions and cross-platform database migrations. The technologies I currently work on are SQL Server, PowerShell, Oracle, and MongoDB.
View all posts by Prashanth Jayaram