How to create SQL Server Log Shipping

March 10, 2015 by Daniel Calbimonte

In my last article, I showed how to create Database Mirroring for high availability. This time, I will work on Log Shipping.

Log Shipping, as the name says, ships the transaction log from the primary server to one or more secondary servers.

The Log Shipping process consists of creating a transaction log backup on the primary server, copying the log backup to the secondary server, and restoring it there.
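Under the hood, the same cycle can be performed manually with T-SQL. A minimal sketch, assuming the db1 database and the folders created in the requirements below (the backup file names are illustrative):

```sql
-- On the primary server: back up the transaction log
BACKUP LOG db1
TO DISK = N'c:\logshipping\db1_20150310.trn';

-- The backup file is then copied to the secondary server
-- (the log shipping copy job automates this step).

-- On the secondary server: restore the log backup, leaving the
-- database readable between restores (standby mode)
RESTORE LOG db1
FROM DISK = N'c:\destination\db1_20150310.trn'
WITH STANDBY = N'c:\destination\db1.tuf';
```

The wizard described in this article simply creates SQL Server Agent jobs that repeat this backup, copy, and restore cycle on a schedule.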

Servers used

SQL Server Log Shipping requires a primary server, which holds the main copy of the database. If it fails, the other server (the secondary server) can be used to replace it. Optionally, you can use a third server to monitor the Log Shipping. Log shipping is configured per database.

Once the primary server is fixed, you can fail back and resume using it.

Note that, unlike Database Mirroring, Log Shipping does not provide automatic failover: when the primary server fails, the secondary database must be brought online manually.

Requirements

  1. You need up to three SQL Servers or at least that many SQL Server instances (multiple instances on one machine can be used for testing purposes only; this is not recommended for production environments). The third server is needed only if you configure a monitor. In this sample, we will use two servers.
  2. You can use SQL Server Enterprise, Standard, Business Intelligence or Web Edition for this article.
  3. Create a folder on the primary server and grant the SQL Server Agent account permissions on it. In this example, the folder name will be logshipping on the c:\ drive.
  4. Create a folder on the secondary server and grant the SQL Server Agent account permissions on it. In this example, the folder name will be destination on the c:\ drive.

Getting Started

  1. On the primary server, right-click the database to log ship and select Tasks > Ship Transaction Logs.


    Figure 1. Ship Transaction Logs option

  2. Press the Backup Settings option.


    Figure 2. The Backup Settings option

  3. Specify the network path and, on the primary server, the local path of the folder where the transaction log backups will be stored. You can raise an alert if the backup does not occur, and delete older files after a specified time. The job name is LSBackup_DatabaseName (Log Shipping Backup). With the Schedule button, you can schedule the backup at the frequency of your preference; by default, it runs every 15 minutes. You can also disable the job or compress the backup from this window.


    Figure 3. Setting up the folder with backups
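The same backup settings can be scripted with T-SQL on the primary server. A hedged sketch using the sp_add_log_shipping_primary_database procedure; the db1 database and folder are from this example, while the PrimaryServer share name and the retention/threshold values are illustrative assumptions:

```sql
-- Run on the primary server; creates the LSBackup job
DECLARE @backup_job_id uniqueidentifier;
DECLARE @primary_id uniqueidentifier;

EXEC master.dbo.sp_add_log_shipping_primary_database
    @database = N'db1',
    @backup_directory = N'c:\logshipping',           -- local path
    @backup_share = N'\\PrimaryServer\logshipping',  -- network path (assumed share)
    @backup_job_name = N'LSBackup_db1',
    @backup_retention_period = 4320,  -- delete old backup files after 72 hours
    @backup_threshold = 60,           -- alert if no backup occurs for 60 minutes
    @threshold_alert_enabled = 1,
    @backup_job_id = @backup_job_id OUTPUT,
    @primary_id = @primary_id OUTPUT,
    @overwrite = 1;
```

This is what the wizard generates behind the scenes; the Script Configuration button in the wizard produces a complete version of this script.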

  4. In order to add a secondary server, press the Add button. You can add multiple secondary servers in Log Shipping. Additionally, you can use a third instance to monitor the Log Shipping process by checking Use a monitor server instance and specifying its settings. In this example, we are not going to include a monitor.


    Figure 4. Setup the backup settings
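Registering a secondary server on the primary can also be scripted. A sketch with sp_add_log_shipping_primary_secondary, assuming SecondaryServer as the secondary instance name (the database name db1 is from this example):

```sql
-- Run on the primary server, after the secondary has been configured
EXEC master.dbo.sp_add_log_shipping_primary_secondary
    @primary_database = N'db1',
    @secondary_server = N'SecondaryServer',  -- assumed instance name
    @secondary_database = N'db1',
    @overwrite = 1;
```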

  5. In the Secondary Database Settings window, press Connect and specify your secondary server credentials. Once connected, the options to initialize the secondary database will be enabled.


    Figure 5. Connect to the secondary server

  6. In the Initialize Secondary Database tab, if you do not yet have a backup of the primary database on the secondary machine, select the option Yes, generate a full backup of the primary database. If you already have a backup, select the option Yes, restore an existing backup. With the Restore Options button, you can select the restore configurations. If the database is already restored on the secondary server, select No, the secondary database is initialized. In this example, the first option will be used.


    Figure 6. Initialize Secondary Database Settings.
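The first option corresponds to taking a full backup of the primary database and restoring it on the secondary without recovering it, so that log backups can still be applied. A manual T-SQL equivalent (the file paths and the SecondaryServer share are illustrative assumptions):

```sql
-- On the primary server: generate a full backup of db1
BACKUP DATABASE db1
TO DISK = N'\\SecondaryServer\destination\db1_full.bak';

-- On the secondary server: restore it WITH NORECOVERY so the
-- database stays in a restoring state and can accept log backups
RESTORE DATABASE db1
FROM DISK = N'c:\destination\db1_full.bak'
WITH NORECOVERY;
```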

  7. In the Copy Files tab, select the destination folder, which will receive the transaction log backups copied from the primary server; this folder was created in the requirements of this document. You can delete copied files after a specified period and schedule the copies according to your preferences. The job created will be LSCopy_MachineName_DatabaseName. By default, the copy is scheduled every 15 minutes.


    Figure 7. Copy Files Tab
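The copy and restore jobs can likewise be created with T-SQL on the secondary server. A sketch using sp_add_log_shipping_secondary_primary with this example's folders; the PrimaryServer name and share are assumptions:

```sql
-- Run on the secondary server; creates the LSCopy and LSRestore jobs
DECLARE @copy_job_id uniqueidentifier;
DECLARE @restore_job_id uniqueidentifier;
DECLARE @secondary_id uniqueidentifier;

EXEC master.dbo.sp_add_log_shipping_secondary_primary
    @primary_server = N'PrimaryServer',  -- assumed instance name
    @primary_database = N'db1',
    @backup_source_directory = N'\\PrimaryServer\logshipping',
    @backup_destination_directory = N'c:\destination',
    @copy_job_name = N'LSCopy_PrimaryServer_db1',
    @restore_job_name = N'LSRestore_PrimaryServer_db1',
    @file_retention_period = 4320,  -- delete copied files after 72 hours
    @overwrite = 1,
    @copy_job_id = @copy_job_id OUTPUT,
    @restore_job_id = @restore_job_id OUTPUT,
    @secondary_id = @secondary_id OUTPUT;
```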

  8. The third tab is used to restore the transaction logs. There are two restore options:
    • No recovery mode. This option restores the transaction logs in a non-recovered state. It is faster because it does not need to analyze uncommitted transactions, but it does not allow querying the database on the secondary server.
    • Standby mode. This option leaves a read-only database on the secondary server. It has a higher overhead than no recovery mode, but it has the benefit that you can access the database. You can also disconnect users when restoring the backups to improve performance. In this example, we will use this option.

    You can also delay restoring backups. This is useful if an error occurs on the primary server and you do not want that problem restored to the secondary. You can also configure an alert to fire if the restoration fails.

    The job name is LSRestore_MachineName_DatabaseName, and you can schedule the restore with the Schedule button according to your needs.

    By default, the restoration is every 15 minutes.


    Figure 8. Restore Transaction Log settings
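These restore settings map onto sp_add_log_shipping_secondary_database on the secondary server. A sketch where @restore_mode 1 is standby and 0 is no recovery; the PrimaryServer name and the delay/threshold values are illustrative assumptions:

```sql
-- Run on the secondary server
EXEC master.dbo.sp_add_log_shipping_secondary_database
    @secondary_database = N'db1',
    @primary_server = N'PrimaryServer',  -- assumed instance name
    @primary_database = N'db1',
    @restore_delay = 0,       -- minutes to wait before restoring a backup
    @restore_mode = 1,        -- 1 = standby (read only), 0 = no recovery
    @disconnect_users = 1,    -- disconnect readers while restoring
    @restore_threshold = 45,  -- alert if no restore occurs for 45 minutes
    @threshold_alert_enabled = 1,
    @overwrite = 1;
```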

  9. If everything is OK, you will receive Success status messages. The typical error messages are related to folder permissions. If you have problems, make sure the SQL Server Agent service account has permissions on the configured folders.


    Figure 9. The success message.

  10. If everything is OK, you will be able to see a standby/read-only database on the secondary server.


    Figure 10. The read only database on the Secondary Server

Now you are ready: you have a read-only database that synchronizes with the primary database.

If you update information in tables of the db1 database on the primary server, the changes will be synchronized on the secondary server after some minutes (depending on the schedules you configured for the backup, copy, and restore jobs).
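Instead of waiting, you can check how far behind the secondary is with T-SQL. A sketch using the sp_help_log_shipping_monitor procedure and the msdb monitor table, which record the last backup, copy, and restore activity:

```sql
-- Run on the primary, secondary, or monitor server: overall status
EXEC master.dbo.sp_help_log_shipping_monitor;

-- Or, on the secondary server, query the monitor table directly
SELECT secondary_database,
       last_copied_file,
       last_restored_file,
       last_restored_date
FROM msdb.dbo.log_shipping_monitor_secondary;
```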

If you go to the SQL Server Agent on the primary server, you will see the LSAlert job, which raises an alert if the backup fails, and the LSBackup job, which creates the transaction log backups of the primary database.

The secondary server has three associated jobs: LSAlert, which raises an alert if the restoration fails; LSCopy, which copies the backups from the primary server to the secondary server; and LSRestore, which restores the database.
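These jobs can be listed with a query against msdb on either server. A small sketch; the LIKE pattern follows the log shipping job naming convention:

```sql
-- List the log shipping jobs and whether they are enabled
SELECT name, enabled
FROM msdb.dbo.sysjobs
WHERE name LIKE N'LS%'
ORDER BY name;
```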

See more

To learn more about log shipping, see How to create a simple database recovery solution using SQL Server log shipping.

For SQL Server recovery, consider ApexSQL Recover, a tool that recovers deleted and truncated data, recovers objects and data lost due to drop operations, and reads online BLOBs as files.

About Daniel Calbimonte

Daniel Calbimonte is a Microsoft Most Valuable Professional, Microsoft Certified Trainer and Microsoft Certified IT Professional for SQL Server. He is an accomplished SSIS author, teacher at IT Academies and has over 13 years of experience working with different databases.

He has worked for the government, oil companies, web sites, magazines and universities around the world. Daniel also regularly speaks at SQL Servers conferences and blogs. He is also a writer for SQL Server training material for certification exams.
