We hear a lot about hybrid SQL Server environments, but how do you integrate your datacenter with Azure? Check this article for a simple and effective explanation of the connectivity options! Read more »
In a past chat back in January 2015, we started looking at the fantastic suite of data mining tools that Microsoft has to offer. At that time, we discussed the concept of a data mining model, creating the model, testing the data and running an ad-hoc DMX query. For those folks who may have missed that article, the link may be found immediately below. Read more »
A few days ago I received an interesting challenge from one of our clients. The client, a lady named Linda, was attempting to estimate her potential monthly revenue recognition for the fiscal year beginning January 1, 2015, and ending December 31, 2015. Linda sells goods and services (each class yielding a different sales margin).
In the first portion of this two-part discussion, we shall be looking at the revenue projections for goods. Read more »
The first time you create a VM in Azure manually through the UI, it is a very pleasant experience. However, when you have hundreds, and sometimes thousands, of machines, it is extremely exhausting and tedious to create and configure them by hand.
With PowerShell, it is possible to automate many administrative tasks and write scripts that create VMs, enable ports, download and create remote desktop files, administer services, and more.
In this new chapter, we will show how to create a Virtual Machine in Azure with SQL Server installed, using PowerShell. Read more »
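As a hedged sketch of what such a script might look like (using the classic Azure Service Management cmdlets of that era; the subscription, image filter, VM name, credentials, size and service name are all placeholders, not taken from the article):

```powershell
# Placeholders throughout; assumes the classic (pre-ARM) Azure PowerShell module.
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "MySubscription"

# Pick a gallery image that already has SQL Server installed.
$image = Get-AzureVMImage |
    Where-Object { $_.Label -like "*SQL Server 2014*" } |
    Select-Object -First 1

# Build the VM configuration, open port 1433, and create the VM.
New-AzureVMConfig -Name "SQLVM01" -InstanceSize "Medium" -ImageName $image.ImageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername "sqladmin" -Password "P@ssw0rd123!" |
    Add-AzureEndpoint -Name "SQL" -Protocol tcp -LocalPort 1433 -PublicPort 1433 |
    New-AzureVM -ServiceName "sqlshack-demo" -Location "East US"
```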
In-Memory OLTP is a revolutionary feature introduced in SQL Server 2014. In SQL Server 2016 it will be even better, with a broader surface area of supported features. Check this article to learn what is new. Read more »
A few days ago I received an email from a gentleman in the healthcare industry. He had read an article that I had published a few years back on a well-known SQL Server website. Based on the original article, he requested a few ideas on how to expand the sample code to produce a rolling three-month revenue summary, in addition to a rolling three-month revenue average.
In today’s get-together, we shall be looking at doing just this! Read more »
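As a hedged illustration of the kind of query involved (the table and column names below are hypothetical, not taken from the original article), SQL Server 2012 and later can produce both figures with window functions:

```sql
-- Hypothetical table dbo.MonthlyRevenue with one row per month.
SELECT
    RevenueMonth,
    Revenue,
    SUM(Revenue) OVER (ORDER BY RevenueMonth
                       ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS RollingThreeMonthTotal,
    AVG(Revenue) OVER (ORDER BY RevenueMonth
                       ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS RollingThreeMonthAverage
FROM dbo.MonthlyRevenue
ORDER BY RevenueMonth;
```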
Sometimes we need to move our local files, SQL scripts or backups from our local machine to Azure, or vice versa. This can be done manually by accessing Azure through a browser, but there are other methods to do it automatically. This article describes the Microsoft Azure Storage Tool, a command-line tool used to upload data to Azure from a local machine or to download data from Azure to our local machine. The article will describe, step by step, how to work with this tool. Read more »
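Assuming the tool in question is AzCopy (the command-line Azure storage copy utility of that era; the local path, storage account, container and key below are placeholders), a typical upload of backup files might look like this:

```
AzCopy /Source:C:\SQLBackups ^
       /Dest:https://mystorageaccount.blob.core.windows.net/backups ^
       /DestKey:<storage-account-key> ^
       /Pattern:"*.bak"
```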
We have already configured our Availability Group; now we need to make it flexible and accessible. It’s time to look at how to create a listener in order to provide a single access point for your AG!
In continuation of our previous article, we are going to move on to another phase of this setup, as we already have our database in sync and safe, or highly available, depending on the chosen mode/architecture. Read more »
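As a hedged sketch (the Availability Group name, listener DNS name, IP address, subnet mask and port below are placeholders), a listener can be added to an existing Availability Group from the primary replica with T-SQL such as:

```sql
-- Run on the primary replica; all names and addresses are placeholders.
ALTER AVAILABILITY GROUP [SQLShackAG]
ADD LISTENER N'SQLShackAGList'
(
    WITH IP ((N'10.0.0.50', N'255.255.255.0')),
    PORT = 1433
);
```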
Every organization has at least one of those guys. You are the guy if you have been with the company long enough that you are very familiar with all of the servers in multiple environments. You can easily translate the SharePoint server into its server name, CL-DB-001-B\SP_2010 and you know all of the DNS aliases pointing to that server as well. Your super power also makes you a magnet for everyone in IT to come and ask you where this database is and what is on that server. Read more »
PowerShell is a shell designed especially to automate administrative tasks.
It is an incredible tool for programmatically automating SQL Server, Exchange, Windows and other tasks, and it is very useful for integrating different Microsoft, and sometimes non-Microsoft, programs.
In this new tutorial, we will show how to install PowerShell for Azure and then how to use it. We will create some databases, edit database properties and retrieve database information using PowerShell. Read more »
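As a hedged sketch using the classic Azure SQL Database cmdlets of that era (the subscription, server name, database name, edition and size are placeholders, not the article's own examples), creating, altering and listing databases might look like this:

```powershell
# Placeholders throughout; assumes the classic Azure PowerShell module with SQL Database cmdlets.
Add-AzureAccount
Select-AzureSubscription -SubscriptionName "MySubscription"

# Create a new database on an existing Azure SQL Database server.
New-AzureSqlDatabase -ServerName "myazureserver" -DatabaseName "SQLShackDemo" -Edition "Basic" -MaxSizeGB 2

# Change its edition, then retrieve information about all databases on the server.
Set-AzureSqlDatabase -ServerName "myazureserver" -DatabaseName "SQLShackDemo" -Edition "Standard"
Get-AzureSqlDatabase -ServerName "myazureserver"
```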
It has been a while since the last transaction log article was published, so I hope you remember where this series is heading. In the previous posts, we examined the Log Structure and Write-Ahead Algorithm (part 1) and the Top Reasons for Log Performance Problems (part 2). Building on that knowledge, we will review some best practices for transaction log configuration in order to decrease the chance of experiencing log bottlenecks. Read more »
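As a hedged preview of two of the usual recommendations (the database name, logical log file name and sizes below are placeholders), pre-sizing the log file and giving it a fixed, sensible autogrowth increment looks like this:

```sql
-- Placeholders: adjust the database name, logical file name and sizes to your environment.
ALTER DATABASE SQLShackDemo
MODIFY FILE (NAME = SQLShackDemo_log, SIZE = 8GB, FILEGROWTH = 512MB);

-- Check how many virtual log files (VLFs) the transaction log currently contains.
DBCC LOGINFO ('SQLShackDemo');
```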
In this article we are going to explore how to configure an Availability Group between a clustered instance and a standalone instance, showing, step by step, how to set up a possible Disaster Recovery environment.
Introduced in SQL Server 2012, Availability Groups arrived with the expectation of being an improved version of database mirroring, which will be discontinued soon. AlwaysOn Availability Groups were improved in SQL Server 2014, gaining the ability to have more replicas, better troubleshooting possibilities and improved availability. Comparing Availability Groups with database mirroring at a very high level, we gained the possibility of having a listener to dynamically redirect connections to the currently active instance, as well as the capability of distributing the read workload between readable replicas. However, only the primary replica is able to write. Read more »
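As a hedged example of the readable-replica capability mentioned above (the Availability Group and replica server names are placeholders), a secondary can be opened up for read-only connections like this:

```sql
-- Run on the primary replica; the AG and replica names are placeholders.
ALTER AVAILABILITY GROUP [SQLShackAG]
MODIFY REPLICA ON N'SQLNODE2'
WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
```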
The sqlcmd utility is a very powerful tool for running SQL Server scripts and T-SQL commands. It is also very useful in disaster recovery situations, such as restoring the master database.
sqlcmd can also be used in the cloud, specifically with SQL Server in Azure. In this new article, we will describe how to connect from a local machine to an Azure Virtual Machine (VM) with SQL Server installed, using sqlcmd. Read more »
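As a hedged sketch (the DNS name, public port and credentials are placeholders, and the corresponding endpoint must already be open on the VM), a connection from a local command prompt might look like this:

```
sqlcmd -S tcp:myazurevm.cloudapp.net,57500 -U sqladmin -P MyStr0ngP@ss -d master -Q "SELECT @@SERVERNAME, @@VERSION;"
```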
This article is part of a three-article series exploring SQL concatenation techniques.
Having to represent sets of data as strings is a very common requirement in information management, even in modern times when a variety of more or less elaborate standards for storing and moving data are at our disposal. For instance, XML, JSON and similar techniques allow data to be extracted from one data source, using a well-known standard, and to be stored temporarily until it is loaded into a destination data store, or consumed in some other way. In fact, both XML and JSON might even be used as a standard way of storing data permanently, especially if the consumers expect the data in one format or the other. Read more »
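As a hedged example of one common technique (the table and column names are hypothetical), a set of rows can be flattened into a single delimited string with FOR XML PATH, or with STRING_AGG on SQL Server 2017 and later:

```sql
-- Hypothetical table dbo.Clients(ClientName). Classic FOR XML PATH technique:
SELECT STUFF(
    (SELECT ', ' + ClientName
     FROM dbo.Clients
     ORDER BY ClientName
     FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'),
    1, 2, '') AS ClientList;

-- SQL Server 2017 and later:
SELECT STRING_AGG(ClientName, ', ') WITHIN GROUP (ORDER BY ClientName) AS ClientList
FROM dbo.Clients;
```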
A few days back I was looking at ways to access raw data from within Microsoft Dynamics CRM, in an effort to extract that data and place it in our data warehouse. I started exploring the use of OData and SSIS to pull the necessary data from the cloud into our local warehouse.
Whilst there are known authentication issues between Dynamics CRM and the Microsoft OData SSIS data source (and thus we could not utilize this access method), I found the OData Source so powerful that I began looking for other constructive ways to utilize it. Read more »
In my last chapter, How to migrate your database to an Azure Virtual Machine, I showed you the steps to connect to SQL Server in an Azure VM using SQL Server Management Studio (SSMS). In this new chapter, we will show you how to work with SQL Server Integration Services (SSIS) to export a local table and its data to an Azure Virtual Machine (VM) with SQL Server installed. Read more »
In one of my articles about Microsoft Azure, I showed how to create credentials and how to connect to an Azure database using SQL Server Management Studio (SSMS). In this new article, we will talk about Azure Virtual Machines and learn how to connect to them using our local SSMS. You will need to unblock ports, add endpoints and perform other tasks that I will explain later. We will also learn how to export a local database to an Azure Virtual Machine with SQL Server installed. Read more »
A few days back I encountered an interesting challenge. The client wanted to have copies of the nightly backups of the transactional databases restored on a warehouse server, to be utilized to update the warehouse.
The overall process
Prior to pushing the daily backup to the warehouse server, the previous day’s restore is deleted. The important point is that the “SQLShackFinancial” database is no longer present on the warehouse server. Once it has been deleted, the download of the backup file begins, followed by the restore of the current backup version. Normal warehouse processing then ensues, and so the cycle continues. Read more »
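As a hedged sketch of that drop-and-restore step (the file paths and logical file names are placeholders; only the database name comes from the text above):

```sql
-- Remove yesterday's copy, then restore today's backup; paths are placeholders.
IF DB_ID(N'SQLShackFinancial') IS NOT NULL
BEGIN
    ALTER DATABASE SQLShackFinancial SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE SQLShackFinancial;
END

RESTORE DATABASE SQLShackFinancial
FROM DISK = N'D:\Backups\SQLShackFinancial.bak'
WITH MOVE N'SQLShackFinancial'     TO N'D:\Data\SQLShackFinancial.mdf',
     MOVE N'SQLShackFinancial_log' TO N'D:\Logs\SQLShackFinancial_log.ldf',
     REPLACE, STATS = 10;
```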
In my last article, entitled “Which fields do my reports actually use”, we had a quick look at a practical implementation of an XPath query to obtain a list of database table fields being utilized by our reports. Reporting Services reports are really no more than XML documents, and the fields utilized by our reports reside in Reporting Services datasets. Read more »
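A hedged sketch of the kind of query involved (it assumes direct access to the ReportServer catalog database and that the stored report definitions convert cleanly to XML; it is not the exact query from the article):

```sql
-- List the dataset fields referenced in each deployed report (Type = 2 means report).
SELECT c.Name AS ReportName,
       f.value('(*[local-name()="DataField"]/text())[1]', 'NVARCHAR(256)') AS DataField
FROM ReportServer.dbo.[Catalog] AS c
CROSS APPLY (SELECT CONVERT(XML, CONVERT(VARBINARY(MAX), c.Content))) AS x(Rdl)
CROSS APPLY x.Rdl.nodes('//*[local-name()="Field"]') AS t(f)
WHERE c.Type = 2;
```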
Modeling for the xVelocity/VertiPaq engine is a completely different beast than modeling for your trusty multidimensional SSAS cubes.
In-memory = blazingly fast. At least, that’s what you would think.
As Tabular models gain popularity with business users and developers alike, we’re starting to see that this isn’t always the case.
We’re going to take a look at some of the common errors and mistakes and how to avoid them.
And since the PowerPivot engine is the same, you will learn how to tune your PowerPivot-based Excel workbooks as well.
Have you ever felt like pulling your hair out, trying to ascertain exactly which fields in your existing Reporting Services datasets are being utilized by your reports? This happened to me recently during a corporate conversion and cleanup exercise for a database migration to the cloud.
The “aha moment” came after I had presented a paper at the PASS SQL Server Nordic Rally (March 2015), when one attendee came up to me and asked if I knew of a method to do this. As they say, ‘necessity is the mother of invention’, and with my interest piqued, I played around until I came up with the solution that we are going to chat about today. The end solution may be seen below. Read more »
Oftentimes we are forced into situations where we must clearly think outside of the box. In today’s “get together”, we are going to discuss a challenge that I encountered during the last week of February of this year. The client had been charting weekly business calls placed by his sales reps. Our client had been tracking these results within an Excel spreadsheet (see the screen dump below) and he would be using this spreadsheet to report the sales reps’ progress going forward. My task was to source this data for the corporate reports in Reporting Services, from this spreadsheet, and to do so on a weekly basis. The client, being resistant to change, was not willing to change the format of the spreadsheet to something more conducive to being utilized by the chart that he wished to produce (see immediately below). Read more »