How to set up SQL Server Log Shipping on Linux

November 13, 2017 by Prashanth Jayaram

Log shipping is a high-availability configuration that most of us are probably familiar with. It is one of the oldest techniques, wherein we ship transaction log backups from a primary database to a secondary database. Log shipping is still a vital feature for applications that use a warm standby for disaster recovery. Many articles discuss the process of configuring log shipping using T-SQL or SSMS.

At a high level, log shipping consists of preparing the secondary database (NORECOVERY or STANDBY) from a backup of the primary database, and then applying the transaction logs from the primary database onto the secondary database using SQL Server Agent jobs and a shared file location. In other words, it is an automated process of restoring transaction log backups from the primary database server onto the secondary database server. This way, the database is available for failover.

We’ve seen multiple models of implementation of log shipping: log shipping on the same server, on to a different server, across domains or workgroups, or to different versions of SQL Server. SQL Server, now being available for Linux, makes us think, “Can we set up log shipping across platforms?”

As it turns out, yes, we can. This article discusses the setup. We’ll use a network file share for the demonstration. Let’s jump right in!

Log shipping comprises the following components:

  • A file share, which serves as the placeholder for the primary database's transaction log backups
  • A backup job on the primary, which backs up the transaction log to the share
  • A copy job on the secondary, which copies the transaction log backup files from the share
  • A restore job on the secondary, which restores the copied transaction log backup files

Prerequisites

  1. The primary database must use the FULL or BULK_LOGGED recovery model
  2. A File Share using the SMB protocol
  3. SQL Server Agent installation on the Linux instance
  4. The sqlcmd command-line utility (or SSMS in SQLCMD mode)
  5. The log-shipping stored procedures require membership in the sysadmin fixed server role.

Check the database's recovery model

Use the below SQL query to check the recovery model of the database.
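
The exact query is not reproduced in this extract; a minimal check against sys.databases along these lines works (SQLShackLogShipping is the database used later in this article):

    SELECT name,
           recovery_model_desc
    FROM sys.databases
    WHERE name = N'SQLShackLogShipping';

    -- If it reports SIMPLE, switch to FULL before configuring log shipping:
    -- ALTER DATABASE SQLShackLogShipping SET RECOVERY FULL;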

Configure a File Share using Samba

We’re using CentOS for the procedure. The procedure may vary based on the Linux flavor/distribution you use.

  1. To find the workstation details, run the net config workstation command at a command prompt on the Windows SQL Server machine, i.e. the primary database server. In this case, the workstation domain value is HQNT.
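
    For reference, the command is run on the Windows side, and the value to note is the "Workstation domain" line in its output:

        net config workstation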

  2. To install Samba, run yum (Yellowdog Updater, Modified), the open-source package manager used on RHEL and CentOS distributions. In this setup, Samba 4.6.2 was installed.
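
    The package names below are the stock CentOS/RHEL Samba packages; adjust them for your distribution:

        sudo yum install -y samba samba-client samba-common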

  3. Edit the Samba configuration file /etc/samba/smb.conf. Before making any changes, it is recommended to back up the configuration file; use the cp command to create a copy, as shown below.
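
    A copy command along these lines is sufficient (the .bak suffix is just a naming convention):

        sudo cp /etc/samba/smb.conf /etc/samba/smb.conf.bak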

    Use any editor to modify /etc/samba/smb.conf; I used vi, but vim or nano works just as well. The relevant content of the configuration file is shown below.
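
    The original file contents are not reproduced in this extract. A minimal configuration, assuming the share is exposed as SQLShare (the name used in the UNC path \\centos\SQLShare referenced by the T-SQL later in this article) and backed by the local directory /var/opt/SQLBackup, would look roughly like this:

        [global]
            workgroup = HQNT
            security = user
            map to guest = Bad User

        [SQLShare]
            path = /var/opt/SQLBackup
            browsable = yes
            writable = yes
            guest ok = yes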

  4. Run the testparm command to validate the Samba configuration. testparm parses the configuration file and reports any syntax errors or unknown parameters.
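
    Running it with no arguments checks the default configuration file:

        testparm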

  5. Now, create the SQLBackup directory with full access so it can serve as the share location.
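
    A minimal sketch, assuming wide-open permissions are acceptable for this demonstration (tighten them in production):

        sudo mkdir -p /var/opt/SQLBackup
        sudo chmod 777 /var/opt/SQLBackup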

  6. Restart the Samba service
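
    On CentOS 7 with systemd, the Samba services are smb and nmb:

        sudo systemctl enable smb nmb
        sudo systemctl restart smb nmb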

  7. Open the Samba service in the firewall and reload the configuration using the firewall-cmd command
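
    Using the predefined samba service definition in firewalld:

        sudo firewall-cmd --permanent --add-service=samba
        sudo firewall-cmd --reload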

  8. SELinux requires the files to be labeled. As we already created the directory /var/opt/SQLBackup, we can label the directory and contents using the chcon tool.
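
    For example:

        sudo chcon -R -t samba_share_t /var/opt/SQLBackup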

  9. Browse the Samba file share using smbtree. It’s the Linux version of Windows Network Neighborhood. The command prints all the nodes and shares within the domain.
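
    For example (-N skips the password prompt):

        smbtree -N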

  10. Check the Samba connection status
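
    The smbstatus command reports the current Samba connections and locked files:

        sudo smbstatus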

  11. Browse the share from the Windows machine
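
    From File Explorer or a command prompt on the Windows server, the UNC path can be tested; the drive letter below is arbitrary, and the share name assumes the configuration example shown earlier:

        net use Z: \\centos\SQLShare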

    Now that we have Samba set up, let’s move on to the next step.

  12. Install SQL Server Agent

    The SQL Server Agent jobs are the backbone of the log shipping architecture; the backup, copy, restore, and alert jobs all run on the primary or secondary database instances. To install SQL Server Agent, execute the commands shown below. After the installation is complete, don't forget to restart the SQL Server service.
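
    At the time of writing (SQL Server 2017), the agent shipped as a separate package; on later builds it is instead enabled with sudo /opt/mssql/bin/mssql-conf set sqlagent.enabled true. The package-based installation looks like this:

        sudo yum install -y mssql-server-agent
        sudo systemctl restart mssql-server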

Set up Log Shipping

Log shipping can be implemented using the SSMS GUI or T-SQL; I feel more comfortable using T-SQL. SQLCMD mode gives us the flexibility to connect to different instances within a single script and execute SQL against whichever connection is currently in scope.

The first task is to prepare the secondary instance by initializing the secondary database. We do this by restoring a full backup of the primary database on the secondary server with the NORECOVERY option. Transaction log backups are then taken on the primary, copied across to the secondary, and applied to the secondary copy of the database.

In this demonstration, HQ6021 is the primary server (running on Windows) and 10.2.6.10 is the secondary instance (running on Linux).

Copy the SQL below and open a SQLCMD-mode session in SSMS to execute the commands. The script creates a database called SQLShackLogShipping on the secondary instance.

The first step in configuring log shipping is to initialize the secondary database by restoring a backup of the primary database onto the secondary instance.

    :CONNECT HQ6021

    BACKUP DATABASE SQLShackLogShipping
    TO DISK = '\\centos\SQLShare\SQLShackLogShippingFull.Bak'
    WITH INIT, FORMAT, COMPRESSION, STATS = 5;
    GO

    BACKUP LOG SQLShackLogShipping
    TO DISK = '\\centos\SQLShare\SQLShackLogShippingLogTRN'
    WITH INIT, FORMAT, COMPRESSION, STATS = 5;
    GO

    :CONNECT 10.2.6.10 -U sa -P PthanVitha@2015

    RESTORE DATABASE SQLShackLogShipping
    FROM DISK = '/var/opt/SQLBackup/SQLShackLogShippingFull.Bak'
    WITH MOVE 'SQLShackLogShipping' TO '/var/opt/mssql/data/SQLShackLogShipping.mdf'
        ,MOVE 'SQLShackLogShipping_Log' TO '/var/opt/mssql/data/SQLShackLogShipping_Log.ldf'
        ,NORECOVERY
        ,STATS = 5;
    GO

    :CONNECT 10.2.6.10 -U sa -P PthanVitha@2015

    RESTORE LOG SQLShackLogShipping
    FROM DISK = '/var/opt/SQLBackup/SQLShackLogShippingLogTRN'
    WITH NORECOVERY, STATS = 5;
    GO

On the primary server,

  1. Add the primary database by executing sp_add_log_shipping_primary_database. This step returns the backup job ID and the primary ID.
  2. Add a backup job schedule using sp_add_schedule and attach it to the backup job with sp_attach_schedule.
  3. Enable the backup job.
    :CONNECT HQ6021

    DECLARE @LSBackupJobId AS UNIQUEIDENTIFIER
    DECLARE @LSPrimaryId AS UNIQUEIDENTIFIER
    DECLARE @SPAddRetCode AS INT

    EXEC @SPAddRetCode = master.dbo.sp_add_log_shipping_primary_database
         @database = N'SQLShackLogShipping'
        ,@backup_directory = N'\\centos\SQLShare\'
        ,@backup_share = N'\\centos\SQLShare\'
        ,@backup_job_name = N'LSBackupSQLShackLogShipping'
        ,@backup_retention_period = 4320
        ,@backup_compression = 2
        ,@backup_threshold = 60
        ,@threshold_alert_enabled = 1
        ,@history_retention_period = 5760
        ,@backup_job_id = @LSBackupJobId OUTPUT
        ,@primary_id = @LSPrimaryId OUTPUT
        ,@overwrite = 1

    IF (@@ERROR = 0 AND @SPAddRetCode = 0)
    BEGIN
        DECLARE @LSBackUpScheduleUID AS UNIQUEIDENTIFIER
        DECLARE @LSBackUpScheduleID AS INT

        EXEC msdb.dbo.sp_add_schedule
             @schedule_name = N'LSBackupSchedule'
            ,@enabled = 1
            ,@freq_type = 4
            ,@freq_interval = 1
            ,@freq_subday_type = 4
            ,@freq_subday_interval = 15
            ,@freq_recurrence_factor = 0
            ,@active_start_date = 20170418
            ,@active_end_date = 99991231
            ,@active_start_time = 0
            ,@active_end_time = 235900
            ,@schedule_uid = @LSBackUpScheduleUID OUTPUT
            ,@schedule_id = @LSBackUpScheduleID OUTPUT

        EXEC msdb.dbo.sp_attach_schedule
             @job_id = @LSBackupJobId
            ,@schedule_id = @LSBackUpScheduleID

        EXEC msdb.dbo.sp_update_job
             @job_id = @LSBackupJobId
            ,@enabled = 1
    END

    EXEC master.dbo.sp_add_log_shipping_alert_job
    GO

    EXEC master.dbo.sp_add_log_shipping_primary_secondary
         @primary_database = N'SQLShackLogShipping'
        ,@secondary_server = N'10.2.6.10'
        ,@secondary_database = N'SQLShackLogShipping'
        ,@overwrite = 1;
    GO

On the secondary server,

  1. Pass the primary server and database details to the stored procedure sp_add_log_shipping_secondary_primary. This stored procedure returns the secondary ID, the copy job ID and the restore job ID.
  2. Set the copy and restore job schedules using sp_add_schedule and attach them with sp_attach_schedule.
  3. Add the secondary database to the instance by executing sp_add_log_shipping_secondary_database.
  4. On the primary server, execute sp_add_log_shipping_primary_secondary to add the required information about the new secondary database to the primary server (this is handled at the end of the primary-side script above).
  5. Enable the copy and restore jobs.
    :CONNECT 10.2.6.10 -U sa -P PthanVitha@2015

    DECLARE @LSSecondaryCopyJobId AS UNIQUEIDENTIFIER
    DECLARE @LSSecondaryRestoreJobId AS UNIQUEIDENTIFIER
    DECLARE @LSSecondarySecondaryId AS UNIQUEIDENTIFIER
    DECLARE @LSAddRetCode AS INT

    EXEC @LSAddRetCode = master.dbo.sp_add_log_shipping_secondary_primary
         @primary_server = N'HQ6021'
        ,@primary_database = N'SQLShackLogShipping'
        ,@backup_source_directory = N'/var/opt/SQLBackup/'
        ,@backup_destination_directory = N'/var/opt/SQLBackup/'
        ,@copy_job_name = N'LSCopySQLShackLogShipping'
        ,@restore_job_name = N'LSRestoreSQLShackLogShipping'
        ,@file_retention_period = 4320
        ,@overwrite = 1
        ,@copy_job_id = @LSSecondaryCopyJobId OUTPUT
        ,@restore_job_id = @LSSecondaryRestoreJobId OUTPUT
        ,@secondary_id = @LSSecondarySecondaryId OUTPUT

    IF (@@ERROR = 0 AND @LSAddRetCode = 0)
    BEGIN
        DECLARE @LSSecondaryCopyJobScheduleUID AS UNIQUEIDENTIFIER
        DECLARE @LSSecondaryCopyJobScheduleID AS INT

        EXEC msdb.dbo.sp_add_schedule
             @schedule_name = N'DefaultCopyJobSchedule'
            ,@enabled = 1
            ,@freq_type = 4
            ,@freq_interval = 1
            ,@freq_subday_type = 4
            ,@freq_subday_interval = 15
            ,@freq_recurrence_factor = 0
            ,@active_start_date = 20171108
            ,@active_end_date = 99991231
            ,@active_start_time = 0
            ,@active_end_time = 235900
            ,@schedule_uid = @LSSecondaryCopyJobScheduleUID OUTPUT
            ,@schedule_id = @LSSecondaryCopyJobScheduleID OUTPUT

        EXEC msdb.dbo.sp_attach_schedule
             @job_id = @LSSecondaryCopyJobId
            ,@schedule_id = @LSSecondaryCopyJobScheduleID

        DECLARE @LSSecondaryRestoreJobScheduleUID AS UNIQUEIDENTIFIER
        DECLARE @LSSecondaryRestoreJobScheduleID AS INT

        EXEC msdb.dbo.sp_add_schedule
             @schedule_name = N'DefaultRestoreJobSchedule'
            ,@enabled = 1
            ,@freq_type = 4
            ,@freq_interval = 1
            ,@freq_subday_type = 4
            ,@freq_subday_interval = 15
            ,@freq_recurrence_factor = 0
            ,@active_start_date = 20171108
            ,@active_end_date = 99991231
            ,@active_start_time = 0
            ,@active_end_time = 235900
            ,@schedule_uid = @LSSecondaryRestoreJobScheduleUID OUTPUT
            ,@schedule_id = @LSSecondaryRestoreJobScheduleID OUTPUT

        EXEC msdb.dbo.sp_attach_schedule
             @job_id = @LSSecondaryRestoreJobId
            ,@schedule_id = @LSSecondaryRestoreJobScheduleID
    END

    DECLARE @LSAddRetCode2 AS INT

    IF (@@ERROR = 0 AND @LSAddRetCode = 0)
    BEGIN
        EXEC @LSAddRetCode2 = master.dbo.sp_add_log_shipping_secondary_database
             @secondary_database = N'SQLShackLogShipping'
            ,@primary_server = N'HQ6021'
            ,@primary_database = N'SQLShackLogShipping'
            ,@restore_delay = 0
            ,@restore_mode = 0
            ,@disconnect_users = 0
            ,@restore_threshold = 45
            ,@threshold_alert_enabled = 1
            ,@history_retention_period = 5760
            ,@overwrite = 1
    END

    IF (@@ERROR = 0 AND @LSAddRetCode = 0)
    BEGIN
        EXEC msdb.dbo.sp_update_job
             @job_id = @LSSecondaryCopyJobId
            ,@enabled = 1

        EXEC msdb.dbo.sp_update_job
             @job_id = @LSSecondaryRestoreJobId
            ,@enabled = 1
    END

SSMS Report

The standard log shipping status report can be generated by right-clicking the server name in SQL Server Management Studio and selecting Reports > Standard Reports > Transaction Log Shipping Status. This report tells us whether the setup was successful.

Wrapping Up

As we saw in this article, the high-availability feature, log shipping, can now be configured across platforms. This article provided step-by-step details for implementing log shipping from a primary SQL Server running on Windows to a secondary SQL Server running on CentOS. The steps are not very different from configuring log shipping across Windows instances, except that the interface (and therefore some of the commands) varies. You may run into a few errors along the way, but they tend to be self-explanatory; if you can troubleshoot a log shipping setup on Windows, you should be able to troubleshoot configuration issues on Linux as well. It is also evident that SQL Server administrators have to adapt quickly to the changing needs of technology.
