Features

How to set up a DDL and DML SQL Server database transactional replication solution

August 22, 2018 by Prashanth Jayaram

In the article How to setup SQL Server database replication for reporting Server, we discussed setting up a simple SQL Server transactional replication environment. In this article, we’ll simulate a more robust system that replicates faster and includes both DDL and DML statements. We’ll walk through the steps to set up dynamic transactional replication.
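
Transactional replication can carry schema changes (DDL) alongside the data changes; as a rough, hedged sketch of the knob involved, checking and enabling it on an existing publication might look like this (the database and publication names here are placeholders, not the article's actual setup):

    USE [SalesDB];   -- hypothetical publication database
    GO
    -- Inspect the publication; the replicate_ddl column shows whether DDL is replicated (1 = yes)
    EXEC sp_helppublication @publication = N'SalesDB_Pub';
    GO
    -- Enable schema-change (DDL) replication for the publication
    EXEC sp_changepublication
        @publication = N'SalesDB_Pub',
        @property    = N'replicate_ddl',
        @value       = 1;
    GO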

Read more »
Shrinking your database using DBCC SHRINKFILE

August 16, 2018 by Gerald Britton

Introduction

SQL Server is pretty good at managing disk space. As long as we do our part to set up appropriate storage types, place files and filegroups properly, and set reasonable AUTOGROW settings, it’s almost a set-it-and-forget-it operation. Mind you, I said “almost!” Sometimes things do go BUMP! in the night and we need to act. Here’s what happened to me not too long ago:
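
If you do end up needing to reclaim space, the basic pattern looks like this; a minimal sketch with a placeholder database, logical file name and target size, not a recommendation to shrink routinely:

    USE [SalesDB];   -- hypothetical database
    GO
    -- Check logical file names and current sizes (size is reported in 8 KB pages)
    SELECT name, type_desc, size * 8 / 1024 AS size_mb
    FROM sys.database_files;
    GO
    -- Shrink the data file to a 4096 MB target, leaving room for normal growth
    DBCC SHRINKFILE (N'SalesDB_Data', 4096);
    GO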

Read more »
SQL Server database migrations with zero data loss and zero downtime

August 15, 2018 by Prashanth Jayaram

The growing importance and complexity of data migration, in an era of exploding data volumes and ever-changing business requirements, means that old approaches will no longer get the job done. We are in a world where everything needs to run instantly. Every database administrator or developer will have heard about database migrations with zero downtime and zero data loss.

Read more »
The BCP (Bulk Copy Program) command in action

August 13, 2018 by Prashanth Jayaram

There are various methods available for bulk data operations:

  1. BCP utility
  2. BULK INSERT
  3. Using OPENROWSET
  4. Import/Export wizard

The BCP (Bulk Copy Program) utility is a command-line program that bulk-copies data between a SQL Server instance and a data file using a special format file. The BCP utility can be used to import large numbers of rows into SQL Server or to export SQL Server data into files. BCP data files don’t include any schema details or format information, so it is recommended to create a format file to record the data format; in case of any failures, you can refer to the format file to better understand the data format and determine what may have gone wrong.

We’ve been using the BCP tool for a long time because it has very low overhead, and it remains one of the most efficient ways to handle bulk import and export of data.
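
As a rough sketch of that workflow (the table, file paths and server name are placeholders; xp_cmdshell is used only so the commands can be run from T-SQL and must be enabled via sp_configure, but the same bcp commands work from a command prompt):

    -- Generate a character-mode format file describing the table, then export the data with it.
    -- -c = character format, -T = trusted connection, -S = server name.
    EXEC xp_cmdshell 'bcp AdventureWorks.Person.Person format nul -c -f C:\bcp\Person.fmt -T -S localhost';
    EXEC xp_cmdshell 'bcp AdventureWorks.Person.Person out C:\bcp\Person.dat -f C:\bcp\Person.fmt -T -S localhost';
    -- To load the file into another table or server, replace "out" with "in".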

Read more »
Using transactional data replication to replay and test production loads on a staging server

August 9, 2018 by Prashanth Jayaram

Outline

In this article, you’ll see how to simulate production loads on a test server in a “record and replay” fashion, using the transaction log, batch scripting, PowerShell and a SQL Server Agent job.

We’ll be walking through the scenario in the following steps:

  1. Record the production load and write the transactions to disk by generating a timestamped replay script
  2. Create a batch file to automate the task at a one-minute interval
  3. Create a SQL Server Agent job to schedule the batch file (a minimal sketch of this step follows the list)
  4. Replay the production workload on the target/test database by running a PowerShell script that opens and executes the scripts at the same interval as they were created, every minute
  5. Validate the data between the source and the target databases to make sure our job works
  6. Monitor the load with a monitoring tool or solution of your choice
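
Step 3, for instance, could be scripted roughly like this; a hedged sketch using msdb's job procedures, with the job name, batch file path and schedule as placeholders rather than the article's actual setup:

    USE msdb;
    GO
    -- Create a job that runs the replay batch file once a minute
    EXEC dbo.sp_add_job         @job_name = N'ReplayProductionLoad';
    EXEC dbo.sp_add_jobstep     @job_name = N'ReplayProductionLoad',
                                @step_name = N'Run replay batch file',
                                @subsystem = N'CmdExec',
                                @command   = N'C:\Replay\CaptureAndReplay.bat';
    EXEC dbo.sp_add_jobschedule @job_name = N'ReplayProductionLoad',
                                @name = N'Every minute',
                                @freq_type = 4,            -- daily
                                @freq_interval = 1,
                                @freq_subday_type = 4,     -- units of minutes
                                @freq_subday_interval = 1; -- every 1 minute
    EXEC dbo.sp_add_jobserver   @job_name = N'ReplayProductionLoad';
    GO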
Read more »
Use cases for Query Store in SQL Server

July 18, 2018 by Aamir Syed

Query Store was introduced in SQL Server 2016. It is often referred to as a “flight data recorder” for SQL Server. Its main function is to capture the history of executed queries along with certain statistics and execution plans. Furthermore, the data is persistent, unlike the plan cache, whose contents are cleared upon a server restart or reboot. You can customize how much data Query Store holds and for how long.
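
Enabling it and setting those retention limits is a one-statement change; a minimal sketch with a placeholder database name and arbitrary limits:

    -- Turn on Query Store and cap how much data it keeps and for how long
    ALTER DATABASE [SalesDB]
    SET QUERY_STORE = ON
        (OPERATION_MODE = READ_WRITE,
         MAX_STORAGE_SIZE_MB = 512,
         CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30));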

Read more »
Reading the transaction log in SQL Server – from hacks to solutions

July 3, 2018 by Nesha Maric

The SQL Server transaction log is akin to a ‘black box’ in an airliner. It contains all of the records of transactions made against a database. This information is a proverbial goldmine for database audits, recoveries, etc., but it was never meant to be exposed to end users, let alone visualized in an easy-to-read manner or used for DBA tasks. As such, utilizing this information can be a challenge, to say the least.
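
One of the hacks the title alludes to is the undocumented fn_dblog function, which exposes the active portion of the log directly; a minimal peek, useful for exploration but not something to build production auditing on:

    -- Read active log records for the current database (undocumented; columns can change between versions)
    SELECT TOP (100)
           [Current LSN], Operation, Context, [Transaction ID], [Transaction Name]
    FROM sys.fn_dblog(NULL, NULL)   -- NULL, NULL = from the start to the end of the active log
    ORDER BY [Current LSN];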

Read more »
SQL Server 2017: Adaptive Join Internals

April 30, 2018 by Dmitry Piliugin

SQL Server 2017 brings new query processing methods that are designed to mitigate cardinality estimation errors in query plans and to adapt plan execution based on execution results. This innovation is called Adaptive Query Processing and consists of three features (a short example of enabling them follows the list):

  • Adaptive Memory Grant Feedback;
  • Interleaved Execution;
  • Adaptive Joins.
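
In SQL Server 2017 these features light up when the database runs under compatibility level 140; a minimal sketch with a placeholder database name:

    -- Adaptive Query Processing is enabled for queries against databases at compatibility level 140
    ALTER DATABASE [SalesDB] SET COMPATIBILITY_LEVEL = 140;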
Read more »
SQL Server 2017: Interleaved Execution for mTVF

April 27, 2018 by Dmitry Piliugin

In this post, we are going to look at a new feature in SQL Server 2017 – interleaved execution. You need to install SQL Server 2017 CTP 1.3 to try it; if you are ready, let’s start.

Now that CTP 2.0 of SQL Server 2017 is out, you no longer need to turn on the undocumented trace flag described below, and the plans are also different, so the examples in this post (which use CTP 1.3) are probably not current (I was asked to hold this post until the public CTP 2.0 was out and interleaved execution was officially announced). However, the post demonstrates the details of interleaved execution and might still be interesting.
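
For reference, interleaved execution targets multi-statement table-valued functions (mTVFs), replacing their fixed cardinality guess with an actual row count at compile time. A minimal example of such a function, with illustrative table and column names of my own rather than the article's:

    -- A multi-statement TVF: without interleaved execution the optimizer guesses 100 rows
    -- (1 row before SQL Server 2014), no matter how many rows the function really returns.
    CREATE OR ALTER FUNCTION dbo.RecentOrders (@days int)
    RETURNS @result TABLE (OrderID int, OrderDate datetime)
    AS
    BEGIN
        INSERT INTO @result (OrderID, OrderDate)
        SELECT OrderID, OrderDate
        FROM dbo.Orders
        WHERE OrderDate >= DATEADD(DAY, -@days, GETDATE());
        RETURN;
    END;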

Read more »
SQL Server 2017: Columnstore in-place updates

April 26, 2018 by Dmitry Piliugin

In this post, I continue the exploration of SQL Server 2017, and we will look at nonclustered columnstore index updates.

A columnstore index has some internal structures to support updates. In SQL Server 2014 these were a delta store, which accepts newly inserted rows (once the delta store holds enough rows, the server compresses it into columnstore row groups), and a deleted bitmap, which handles deleted rows. SQL Server 2016 added more internal structures: a mapping index, used by a clustered columnstore index to maintain secondary nonclustered indexes, and a deleted buffer to speed up deletes from a nonclustered columnstore index.

Updates were always split into an insert plus a delete. That has now changed: if a row is located in the delta store, in-place updates are possible. Another change is that a per-row (narrow) plan is now possible instead of a per-index (wide) plan.

Let’s make some experiments.
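
As a starting point, a setup along these lines exercises the update path in question (table and index names are placeholders, not the article's actual test schema):

    -- A rowstore table with a nonclustered columnstore index; updating a row that still sits
    -- in the delta store is the case where an in-place update is now possible.
    CREATE TABLE dbo.Sales (SaleID int PRIMARY KEY, Amount money, SaleDate datetime);
    CREATE NONCLUSTERED COLUMNSTORE INDEX ncci_Sales ON dbo.Sales (SaleID, Amount, SaleDate);

    INSERT INTO dbo.Sales VALUES (1, 100.00, GETDATE());
    UPDATE dbo.Sales SET Amount = 150.00 WHERE SaleID = 1;   -- candidate for an in-place update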

Read more »
Top SQL Server Books

April 25, 2018 by SQLShack

This collection of content will provide an overview of top SQL Server books, with Amazon links for each book. The list is a combination of top Amazon SQL Server books (by the highest number of customer reviews) and the books included on other available ‘Top SQL Server books’ lists (see the reference links at the bottom).

If you know of a book that deserves to make this list, please let us know in the comments below.

Read more »
SQL Server 2017: Columnstore Indexes and Trivial Plan

April 25, 2018 by Dmitry Piliugin

Some time ago, SQL Server 2017 was released as a CTP. The most exciting part of that CTP was that SQL Server now supports Linux! This is awesome, and I consider it great news for many people.

I am personally interested in the new query processing features, and I finally had some time to install SQL Server 2017 and dig into it a little. Currently CTP 1.2 is available, and I will use this version for my experiments.

While exploring the new extended events, I found an interesting event, compilation_stage_statistics; one of its columns, trivial_plan_scanning_cs_index_discarded, carries the description “Number of trivial plans discarded or could have been discarded which scan Columnstore index”. That pushed me to investigate the topic.
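
To watch that event yourself, an extended events session along these lines should do it; the sqlserver package prefix and the session name here are my assumptions, so check the event library in SSMS for your build:

    -- Capture compilation-stage statistics, including the trivial-plan/columnstore counter mentioned above
    CREATE EVENT SESSION [compilation_stats] ON SERVER
    ADD EVENT sqlserver.compilation_stage_statistics
    ADD TARGET package0.ring_buffer;
    GO
    ALTER EVENT SESSION [compilation_stats] ON SERVER STATE = START;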

Read more »
Troubleshooting Database Mail Failures

April 25, 2018 by Ed Pollack

Description

Database Mail is a convenient and easy way to send alerts, reports, or data from SQL Server. Failures are not obvious to us, though, and developing a process to monitor these failures alongside other failures will save immense headaches if anything ever goes wrong.

Database Mail: a (very) brief overview

Database Mail is a component of SQL Server that is available in every edition, except for Express. This feature is designed to be as simple as possible to enable, configure, and use.

Database Mail relies on SMTP to send emails via a specified email server to any number of recipients. When configuring it, you provide a mail server and credentials (if needed), and then the service is ready to use. We’ll be focusing here on failure reporting, not configuration. If you need help setting up or configuring this feature, check out some of the references at the end of this article.
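
On the failure side, msdb keeps both the failed items and the error text logged for each send attempt; a simple starting point for a monitoring query:

    -- Failed Database Mail items joined to the errors recorded for them
    SELECT f.mailitem_id,
           f.recipients,
           f.subject,
           f.send_request_date,
           l.description AS error_description
    FROM msdb.dbo.sysmail_faileditems AS f
    LEFT JOIN msdb.dbo.sysmail_event_log AS l
           ON l.mailitem_id = f.mailitem_id
    ORDER BY f.send_request_date DESC;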

Read more »
The history of SQL Server – the evolution of SQL Server features

February 2, 2018 by Daniel Calbimonte

Introduction

This article will explain the main features in SQL Server 2017, 2016, 2014, 2012, 2008, 2005, 2000, 7.0, 6.5, 6.0, 4.2, 1.1 and 1.0.

In the past, the first SQL Server versions supported OS/2 (an operating system created by Microsoft and IBM) and Windows.

Now, the new versions of SQL Server (vNext and SQL Server 2017) can be installed on Linux. Fifteen years ago that was impossible to imagine; Linux and Microsoft were like oil and water, and now Microsoft loves Linux.

Also, we now enjoy full integration with Azure, Tabular Databases, SSIS, SSAS and more. In this article, we will talk about all these changes and improvements.

Read more »
SQL Server Query Store – Overview

December 29, 2017 by Marko Zivkovic

The SQL Server Query Store is a relatively new feature introduced in SQL Server 2016. It is basically a SQL Server “flight recorder” or “black box”, capturing a history of executed queries, query runtime execution statistics, execution plans, etc., against a specific database. This information helps identify performance problems caused by query plan changes and makes troubleshooting easier by quickly surfacing performance differences, even after a SQL Server restart or upgrade. All data that the SQL Server Query Store captures is stored on disk.
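
Once enabled, that history is exposed through catalog views; a small sketch that lists the longest-running queries recorded for a database:

    -- Top queries by average duration recorded in Query Store (durations are in microseconds)
    SELECT TOP (10)
           q.query_id,
           qt.query_sql_text,
           AVG(rs.avg_duration) AS avg_duration_us
    FROM sys.query_store_query AS q
    JOIN sys.query_store_query_text AS qt ON qt.query_text_id = q.query_text_id
    JOIN sys.query_store_plan AS p ON p.query_id = q.query_id
    JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
    GROUP BY q.query_id, qt.query_sql_text
    ORDER BY avg_duration_us DESC;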

Read more »
6 methods to write PowerShell output to a SQL Server table

December 25, 2017 by Prashanth Jayaram

PowerShell has become the tool of choice for many database administrators because of the simple, quick and efficient way it handles automation. It’s built on the .NET Framework and uses object models such as COM, ADSI, ADO, and WMI. PowerShell has replaced the legacy scripting practices traditionally used to monitor SQL Server instances.

I’ve been asked on several occasions how to store the output of PowerShell WMI data in a SQL Server table. The question comes up so frequently that I decided to write this article.

When sending data within a system (such as a PowerShell object to a cmdlet), the process is straightforward. However, with non-native data interchange (for instance, WMI to SQL), the process can potentially get complicated. Due to this, many purists suggest sticking to simple interchange formats, such as CSV, JSON or in some cases, XML.

Read more »
How to use parallel insert in SQL Server 2016 to improve query performance

December 8, 2017 by Esat Erkec

Introduction

In the first part of this article, we will discuss parallelism in the SQL Server engine. Parallel processing is, simply put, dividing a big task among multiple processors. This model is meant to reduce processing time.

  • SQL Server can execute queries in parallel
  • SQL Server creates a path for every query; this path is the execution plan
  • The SQL Server query optimizer creates execution plans
  • The query optimizer decides the most efficient way to create the execution plan

Execution plans are the equivalent of highways and traffic signs for T-SQL queries. They tell us how a query is executed.
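
The parallel insert itself applies to INSERT…SELECT statements that target a heap or clustered columnstore table with a TABLOCK hint, under compatibility level 130; a minimal sketch with placeholder database and table names:

    -- Parallel insert in SQL Server 2016 needs compatibility level 130 and TABLOCK on the target
    ALTER DATABASE [SalesDB] SET COMPATIBILITY_LEVEL = 130;
    GO
    INSERT INTO dbo.SalesArchive WITH (TABLOCK) (SaleID, Amount, SaleDate)
    SELECT SaleID, Amount, SaleDate
    FROM dbo.Sales
    WHERE SaleDate < '20170101';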

Read more »
How to configure Database Mail in SQL Server

November 22, 2017 by Bojan Petrovic

Database Mail, as you would expect from its name, is a solution for sending e-mail messages from the SQL Server Database Engine to users. Using Database Mail, database applications can send e-mail messages that can, for example, contain query results or simply alert a user about an event that occurred in the database.

The process of Database Mail configuration has three main steps. In order to complete this successfully, we need to:

  • create a Database Mail account,
  • create a Database Mail profile,
  • and configure those two to work together (a minimal T-SQL sketch of these steps follows this list)
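
Those three steps map to three procedures in msdb; a hedged sketch with placeholder account, profile and SMTP server names (Database Mail XPs must also be enabled via sp_configure):

    -- 1. Create a Database Mail account pointing at an SMTP server
    EXEC msdb.dbo.sysmail_add_account_sp
         @account_name    = N'SQLAlerts',
         @email_address   = N'sqlalerts@example.com',
         @mailserver_name = N'smtp.example.com';

    -- 2. Create a profile
    EXEC msdb.dbo.sysmail_add_profile_sp
         @profile_name = N'Default profile';

    -- 3. Configure the two to work together
    EXEC msdb.dbo.sysmail_add_profileaccount_sp
         @profile_name    = N'Default profile',
         @account_name    = N'SQLAlerts',
         @sequence_number = 1;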
Read more »