Steve Simon

Steve Simon is a SQL Server MVP and a senior BI Development Engineer with Atrion Networking. He has been involved with database design and analysis for over 29 years.

Steve has presented papers at 8 PASS Summits, as well as at PASS Europe in 2009 and 2010. He recently delivered a Master Data Services presentation at the PASS Amsterdam Rally.

Steve has presented 5 papers at the Information Builders' Summits. He is a PASS regional mentor.

Latest posts by Steve Simon

Big Bonus Anyone?

December 19, 2016 by Steve Simon

Introduction

Late in October, I received an unusual request from the head of sales at one of my client sites. The sales team sells three articles: bread, perfume and Jaguar motor cars. The reader will note that one of these items is a staple and the other two are aimed at folks with considerable disposable income. Management within the firm had increased the bonuses for those salespeople who managed to sell perfume and/or Jaguars along with the standard loaves of bread. The summary report may be seen below, showing the final bonus rate for each sales order booked during the month.
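Just to give the gist of the rule in T-SQL terms, a bonus rate of this sort boils down to a simple CASE expression. The table, flag columns and rates below are purely illustrative assumptions and not the client’s actual figures:

-- Illustrative only: hypothetical table and bonus rates, not the client's actual figures.
SELECT  so.SalesOrderID,
        so.Salesperson,
        CASE
            WHEN so.SoldPerfume = 1 AND so.SoldJaguar = 1 THEN 0.10  -- both premium articles
            WHEN so.SoldPerfume = 1 OR  so.SoldJaguar = 1 THEN 0.05  -- one premium article
            ELSE 0.02                                                -- bread only
        END AS BonusRate
FROM    dbo.SalesOrderSummary AS so;  -- hypothetical summary table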

Read more »

A story of Whigs and Tories

December 9, 2016 by Steve Simon

Introduction

As many of you know by now, I am a fan of utilizing expressions within Reporting Services reports to add flexibility. Recently I received a client request to create a stacked bar report which, in turn, would provide access to the underlying data that made up the bars on the chart. My client sells two articles: “Whigs” and “Tories”, and many people like to “buy” them. The idea is to display the aggregated data as may be seen below (grey, black and turquoise) and, depending upon which bar and color is selected (clicked upon), to drill down and display the underlying detailed data (see below).

Read more »

With a little help from my friends (DQS)

November 9, 2016 by Steve Simon

Introduction

An interesting challenge arose at a client site during early October, one that provided a phenomenal opportunity to do a Data Quality Services implementation. My client (a grocer) had been asked to produce summary reports detailing the amount of funds spent during 2016 (YTD) with the myriad of manufacturers from whom the chain purchases its inventory. All “accounts payable” entries are done manually and, as such, are prone to errors.

Read more »

A festive gift: Working with images

November 2, 2016 by Steve Simon

Introduction

With Christmas just around the corner, in today’s “get-together” I thought that we would have some fun by cataloguing a collection of ‘your favorite items’. Whether it be a coin collection, a china plate collection or a stamp collection, the process is the same and certainly something that you will enjoy creating and maintaining. For today’s example, we are going to construct a “Postage Stamp” cataloguing system. We are going to see how we are able to get from this…
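As a taste of one common way to underpin such a catalogue (a sketch under assumed names, and not necessarily the exact route taken in the article), each scanned stamp can be stored in a varbinary(max) column and loaded with OPENROWSET … BULK:

-- Hypothetical catalogue table and file path, for illustration only.
CREATE TABLE dbo.StampCatalogue
(
    StampID     int IDENTITY(1, 1) PRIMARY KEY,
    StampName   nvarchar(100)  NOT NULL,
    StampImage  varbinary(max) NOT NULL
);

INSERT INTO dbo.StampCatalogue (StampName, StampImage)
SELECT  N'Penny Black',
        img.BulkColumn
FROM    OPENROWSET(BULK N'C:\Stamps\PennyBlack.jpg', SINGLE_BLOB) AS img;

A Reporting Services image report item can then render the stored column directly by setting its image source to “Database”.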

Read more »

Reporting from OLAP cubes made easy

October 19, 2016 by Steve Simon

Introduction

Last month I ran two Business Intelligence pre-conferences in South Africa. An interesting request arose during the course of the pre-conference in Cape Town. The individual wanted an approach to extracting data from an OLAP cube that would avoid intensive utilization of MDX and rely more upon T-SQL. His main concern was with filtering the data at run time, via the report front end.
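In outline, one way to achieve this (a sketch with an assumed linked server name and assumed measure and hierarchy names; the article’s own code may differ) is to register a linked server against the Analysis Services instance and let OPENQUERY pass a small, fixed MDX statement to the cube, leaving all of the run-time filtering to plain T-SQL:

-- 'WWI_OLAP' is an assumed linked server pointing at the Analysis Services instance;
-- the measure and hierarchy names are likewise assumptions for illustration.
SELECT  *
FROM    OPENQUERY
        ( WWI_OLAP,
          'SELECT [Measures].[Quantity]  ON COLUMNS,
                  [City].[City].MEMBERS  ON ROWS
           FROM   [WideWorldImporters]'
        );
-- The call returns an ordinary rowset, so report parameters can be applied with a
-- plain WHERE clause or by wrapping the statement in a stored procedure.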

In this “fireside chat” we shall do just that, utilizing the cube that comes with the new Microsoft database “WideWorldImporters”, and we shall learn how we can get from this…

Read more »

Bars and stripes: Visual displays that you can count on!

October 3, 2016 by Steve Simon

Introduction

As we are nearing the end of the North American summer, I thought that we would take a lighter look at nifty ways of reporting information. In today’s “fireside chat” we have a look at a unique way of displaying our information, combining bar charts and line graphs in one single chart (see below). We shall take things one step further and work with the colour fill of the vertical bars to reflect the values that they represent.

Read more »

Smart charts and tablices

August 4, 2016 by Steve Simon

Introduction

In our last “fireside chat” we discussed a few of the challenges that the HR manager of a major hardware chain was experiencing. Mary Smith, the HR manager, has since approached us to modify her existing reports so that they function more efficiently and effectively, utilizing her existing data yet reducing the total number of reports.

Read more »

Reporting services and linked servers wow!

July 27, 2016 by Steve Simon

Introduction

In today’s challenging economic times, it has become more and more important to monitor and react to changing sales patterns and trends.

In today’s “get-together” we shall be expanding our outlook by creating efficient and effective reports utilizing SQL Server Reporting Services 2016 and T-SQL, together with the DAX code that we created in our last “fireside chat”.
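To give a flavour of what is to come (again a sketch with assumed names, not necessarily the exact code from the article), the DAX that we created previously can be surfaced to a T-SQL based Reporting Services dataset through a linked server pointing at the tabular instance:

-- 'WWI_TABULAR' is an assumed linked server against the tabular instance; the table
-- and measure names are assumptions for illustration only.
SELECT  *
FROM    OPENQUERY
        ( WWI_TABULAR,
          'EVALUATE
               SUMMARIZECOLUMNS
               (
                   ''Customer''[Customer Category],
                   "Total Sales", [Total Sales]
               )'
        );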

Read more »

Reporting services and the tabular model on steroids

July 20, 2016 by Steve Simon

Introduction

A few weeks back I had been working on an interesting proof of concept for a client within the food / grocery industry. The objectives were to be able to provide the client with information on sales patterns, seasonal trends and location profitability. In our previous “get-together” we discussed how to create a tabular model project and how to create efficient and effective reports utilizing Excel.

Read more »

Beer and the tabular model DO go together

July 12, 2016 by Steve Simon

Introduction

A few weeks back I had been working on an interesting proof of concept for a client within the food / grocery industry. The objectives were to be able to provide the client with information on sales patterns, seasonal trends and location profitability. The client was an accountant and was therefore comfortable utilizing spreadsheets. This said, I felt that this was a super opportunity to build our proof of concept utilizing a SQL Server tabular solution, exploiting the capabilities of Excel and Power Reporting for the front end.

Read more »

Grab your pick and shovel and let’s mine!! Creating productive and informative mining reports!

June 10, 2015 by Steve Simon

Introduction

In a past chat back in January 2015, we started looking at the fantastic suite of data mining tools that Microsoft has to offer. At that time, we discussed the concept of a data mining model, creating the model, testing the data and running an ad-hoc DMX query. For those folks who may have missed this article, the link may be found immediately below:

Read more »

BI at its very best!!! Revenue projection to keep us on course

June 5, 2015 by Steve Simon

A few days ago I received an interesting challenge from one of our clients. Linda was attempting to estimate her potential monthly revenue recognition for the fiscal year running from January 1, 2015 through December 31, 2015. She sells goods and services (each class yielding differing sales margins).

In the first portion of this two-part discussion, we shall be looking at the revenue projections for goods.

Read more »

BI at its very best!!! Working with rolling averages

May 27, 2015 by Steve Simon

Introduction

A few days ago I received an email from a gentleman in the health care industry. He had read an article that I had published a few years back on a well-known SQL Server website. Based upon the original article, he requested a few ideas on how to expand the sample code to do a rolling three-month revenue summary, in addition to a rolling three-month revenue average.

In today’s get-together, we shall be looking at doing just that!
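As a minimal sketch of the technique (the table and column names are assumptions), a window frame spanning the current month and the two preceding months yields both the rolling total and the rolling average:

-- Hypothetical monthly summary table; the frame covers the current and two prior months.
SELECT  FiscalYear,
        FiscalMonth,
        Revenue,
        SUM(Revenue) OVER (ORDER BY FiscalYear, FiscalMonth
                           ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS RollingThreeMonthTotal,
        AVG(Revenue) OVER (ORDER BY FiscalYear, FiscalMonth
                           ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS RollingThreeMonthAverage
FROM    dbo.MonthlyRevenue
ORDER BY FiscalYear, FiscalMonth;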

Read more »

OData and SSIS a marriage made in the clouds

April 6, 2015 by Steve Simon

Introduction

A few days back I was looking at ways to access raw data from within Microsoft Dynamics CRM, in an effort to extract that data and place it in our data warehouse. I started to explore utilizing OData and SSIS to pull the necessary data from the cloud to our local warehouse.

Whilst there are known authentication issues between Dynamics CRM and the Microsoft OData SSIS data source (and thus we could not utilize this access method), I found the OData Source so powerful that I began looking for other constructive ways to utilize it.

Read more »

Automating your database restores

March 30, 2015 by Steve Simon

Introduction

A few days back I encountered an interesting challenge. The client wanted to have copies of the nightly backups of the transactional databases restored on a warehouse server, to be utilized to update the warehouse.

The overall process

Prior to pushing the daily backup to the warehouse server, the previous day’s restore is deleted. The important point is that the “SQLShackFinancial” database is no longer present on the warehouse server. Once it has been deleted, the current backup file is downloaded and its restore begins. Normal warehouse processing then ensues, and so the cycle continues.
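In T-SQL terms, the nightly cycle boils down to something along the following lines (the file paths and logical file names are assumptions; only the database name comes from the process described above):

USE master;
GO

-- Drop yesterday's copy if it is still present on the warehouse server.
IF DB_ID(N'SQLShackFinancial') IS NOT NULL
BEGIN
    ALTER DATABASE SQLShackFinancial SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE SQLShackFinancial;
END;
GO

-- Restore the freshly downloaded backup (paths and logical names are illustrative).
RESTORE DATABASE SQLShackFinancial
FROM DISK = N'D:\Staging\SQLShackFinancial.bak'
WITH MOVE N'SQLShackFinancial'     TO N'D:\Data\SQLShackFinancial.mdf',
     MOVE N'SQLShackFinancial_log' TO N'D:\Logs\SQLShackFinancial_log.ldf',
     RECOVERY;
GO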

Read more »

Which fields do my reports ACTUALLY use!!

March 18, 2015 by Steve Simon

Introduction

Have you ever felt like pulling your hair out, trying to ascertain exactly which fields in your existing Reporting Services datasets are being utilized by your reports? This happened to me recently during a corporate conversion and cleanup exercise for a database migration to the cloud.

The “aha” moment came after I had presented a paper at the PASS SQL Server Nordic Rally (March 2015), when one attendee came up to me and asked if I knew of a method to do this. As they say, ‘necessity is the mother of invention’; the question piqued my interest, and I played around until I came up with the solution that we are going to chat about today. The end solution may be seen below.
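As a taste of the approach (a sketch rather than the final query), the report definitions stored in the ReportServer catalog can be cast to XML and shredded to list the dataset fields that each report defines; matching these against the fields that the report layout actually references is the refinement that we shall chat about:

-- Assumes the default catalog database name 'ReportServer'.
;WITH ReportDefinitions AS
(
    SELECT  c.Name,
            CONVERT(xml, CONVERT(varbinary(max), c.Content)) AS Rdl  -- RDL stored as an image column
    FROM    ReportServer.dbo.[Catalog] AS c
    WHERE   c.[Type] = 2                                             -- 2 = report
)
SELECT  rd.Name                                  AS ReportName,
        fld.node.value('@Name', 'nvarchar(256)') AS DatasetFieldName
FROM    ReportDefinitions AS rd
        -- local-name() sidesteps the version-specific RDL namespace
        CROSS APPLY rd.Rdl.nodes('//*[local-name()="Field"]') AS fld(node)
ORDER BY ReportName, DatasetFieldName;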

Read more »
Page 1 of 3