Searching SQL Server made easy – Building the perfect search script

March 9, 2016 by Ed Pollack

The need to search through database schema for specific words or phrases is commonplace for any DBA. The ability to do so quickly, accurately, and completely is critical when developing new features, modifying existing code, or cleaning up the vestiges from an application’s ancient history.

After introducing how to effectively search catalog views in the previous article, here we will bring everything together, merging the many scripts that we previously wrote into a single search tool. While the resulting TSQL is lengthy, it is ultimately the sum of the many building blocks we have already written and tested successfully. The final script can be copied and used on any server where you’re looking for specific metadata.

Review of Purpose

Before diving further into this search script, it’s worthwhile to take a moment and review why we are doing this, and some of the common use cases for it. Searching SQL Server is very common, and doesn’t just accompany the needs of a curious DBA. The most frequent scenarios where a script like this has been valuable to me include:

  • A software release is altering database schema and I want to verify that no related objects also need to be changed.
  • Code reviewing release scripts that involve the altering or removal of existing schema.
  • Database or object migrations, in which all referencing schema must be updated with the new location.
  • Verifying that an object is unused and can be dropped.
  • A major application upgrade is revamping a significant portion of code, requiring extensive database research.
  • …and of course, the needs of a curious DBA 🙂

There are many other reasons why we would want to search SQL Server for schema or objects, so feel free to add yours to the list!

Bringing it All Together

We’ve explored a wide variety of system views and metadata sources in order to build methods of searching a majority of objects within SQL Server. Running each of those queries one-at-a-time is inconvenient, and our stated goal was to build a single script that would do everything for us. Once complete, we can save this super-script as a SQL file or as a snippet within Management Studio for quick retrieval in the future.

How do we combine so many different data sets into one, and how do we structure our search script to do everything we want it to? It’s helpful when building something big to sketch out our design plan so that we have some direction when we begin to pour forth pages of TSQL. Here is my vision for our search solution:

  1. Define our search string and save it to a parameter for reuse in the future. Define parameters that will control how and what we search for.
  2. Create a temporary table to store all of our collected data as we start to return a variety of result sets.
  3. Create a list of databases that we want to search. By default, I’ll search all databases (system and user).
  4. Collect all server-level objects and insert them into the temp table.
  5. Iterate through each of the databases listed above, collecting result sets for all database-level objects into our temp table.
  6. Return the final data set, including metadata from both server and database objects.

Since our result sets vary greatly depending on the type of metadata being collected, our temp table will have some extra columns that may or may not be used for each object type. I find this preferable to returning twenty different data sets, as a single unified table of results is easier to copy, export, or otherwise manipulate once it’s returned.

Let’s get started! Our first steps are to define any variables we are going to use throughout the search, create a configuration section, and create our temp table:
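What follows is a minimal sketch of that opening section. Aside from @search_string, @search_SSIS_disk, and @pkg_directory, which are named later in this article, the variable names, defaults, and column definitions here are placeholders of my own rather than the exact script, so adjust them to suit your environment:

```sql
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- Configuration section: the text to search for, plus bits controlling optional searches.
DECLARE @search_string NVARCHAR(250) = 'Department';
DECLARE @search_SSRS BIT = 0;          -- Search SSRS report definitions (placeholder name).
DECLARE @search_SSIS_msdb BIT = 0;     -- Search SSIS packages stored in MSDB (placeholder name).
DECLARE @search_SSIS_disk BIT = 0;     -- Search SSIS packages stored on disk.
DECLARE @pkg_directory NVARCHAR(250) = 'C:\SSIS_Packages'; -- Only used when @search_SSIS_disk = 1.

-- Wildcards are added so that any object containing the text is matched.
SET @search_string = '%' + @search_string + '%';

-- Temp table that will hold every search result, server-level and database-level.
IF OBJECT_ID('tempdb..#object_data') IS NOT NULL
    DROP TABLE #object_data;

CREATE TABLE #object_data
(
    database_name SYSNAME NULL,
    schema_name SYSNAME NULL,
    parent_object_name SYSNAME NULL,
    object_name SYSNAME NULL,
    object_type VARCHAR(50) NOT NULL,
    object_definition NVARCHAR(MAX) NULL
);

-- List of databases to iterate through when collecting database-level metadata.
DECLARE @database TABLE (database_name SYSNAME NOT NULL);
```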

We set the transaction isolation level to READ UNCOMMITTED for this proc. While it is unlikely to cause any unwanted contention, there is a chance it could interfere with other procs that happen to be accessing any of these system views. In the configuration section, we define @search_string, which is the text we will be searching our server for, as well as bits that control whether or not we want to search SSIS or SSRS. If we are searching SSIS packages on disk (@search_SSIS_disk = 1), then be sure to also populate @pkg_directory with a valid folder on disk. Note that wildcards are added to the beginning and end of the search criteria we enter. This ensures that we search for all objects containing the text, and not just those beginning with, or equal to, that text. Since the metadata tables we are searching are relatively small, the cost of doing this is acceptable for the functionality we gain.

Our #object_data temp table has columns for everything we could possibly return. This means that for any given search, some columns will be NULL. This is by design, and reduces the confusion that may arise by trying to reuse columns for data that doesn’t quite match their names or definitions. Feel free to modify this table or how the result sets are stored if your environment is better served by a different column list/order/definition.

Lastly, we populate the @database table, which will be used later on when we iterate through each database on the server to collect database metadata.
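A sketch of that population step, assuming we simply pull every name from sys.databases; the ONLINE filter is my own addition to skip databases that cannot currently be queried:

```sql
-- Populate the list of databases to search.  By default, every database
-- (system and user) is included; add filters here to narrow the search.
INSERT INTO @database (database_name)
SELECT name
FROM sys.databases
WHERE state_desc = 'ONLINE';
```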

With the housekeeping out of the way, we can move on to collecting server-level metadata using some of the queries we defined previously. Each search task is broken into a separate insert into #object_data, populating only the columns it needs. The following TSQL will insert into our temp table all of the metadata about jobs, databases, logins, linked servers, server triggers, SSRS, and SSIS. As per the control bits we created at the start of this proc, SSRS and SSIS will only be searched if their bits are set to 1.
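To give a flavor of what those inserts look like, here is a hedged sketch of two of them, one for SQL Agent jobs and one for linked servers. The column mapping follows the placeholder #object_data definition sketched earlier rather than the original script:

```sql
-- SQL Server Agent jobs whose name or step command contains the search string.
INSERT INTO #object_data (object_name, object_type, object_definition)
SELECT DISTINCT
    jobs.name,
    'SQL Agent Job',
    steps.command
FROM msdb.dbo.sysjobs AS jobs
INNER JOIN msdb.dbo.sysjobsteps AS steps
    ON steps.job_id = jobs.job_id
WHERE jobs.name LIKE @search_string
   OR steps.command LIKE @search_string;

-- Linked servers whose name or data source contains the search string.
INSERT INTO #object_data (object_name, object_type, object_definition)
SELECT
    servers.name,
    'Linked Server',
    servers.data_source
FROM sys.servers AS servers
WHERE servers.is_linked = 1
  AND (servers.name LIKE @search_string
       OR servers.data_source LIKE @search_string);
```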

The queries used throughout this TSQL are very similar, if not identical, to the ones we created previously. Our only significant changes are to insert all results into our temp table and to search using @search_string, rather than any hard-coded criteria. When this TSQL is complete, we’ll have all server metadata in our temp table, ready to be returned later on when we have collected the rest of our search results.

Our next step is to iterate through each database on our server and check for database metadata matching our search criteria. This is a bit trickier, as we cannot use the USE statement without resorting to dynamic SQL. In order to query each database with all of the various searches we defined earlier, we will need to build a dynamic SQL statement for each database, execute it, and make sure the results of each iteration of the loop end up in our temporary table.

As is the case when converting standard TSQL to dynamic SQL, we will need to replace all single apostrophes with a pair of apostrophes. In an effort to keep this as simple as possible, I have left all TSQL searches from above exactly as they were. The only changes are the aforementioned additional apostrophes, as well as the use of INSERT statements to collect our data for easy retrieval later on. The following is the result of this work:
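The full block is too long to annotate line by line, but its shape is sketched below with the database-level searches abbreviated to a single sys.objects query; the full script repeats this pattern for each catalog view searched earlier. Note that this sketch passes @search_string to sp_executesql as a parameter, which sidesteps most of the apostrophe doubling; the original embeds the search queries directly as strings, hence the doubled apostrophes described above:

```sql
DECLARE @current_database SYSNAME;
DECLARE @sql_command NVARCHAR(MAX);

DECLARE database_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT database_name FROM @database;

OPEN database_cursor;
FETCH NEXT FROM database_cursor INTO @current_database;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build the dynamic SQL for this database.  USE only affects the scope
    -- of this batch, so each iteration searches a different database.
    SET @sql_command = '
    USE [' + @current_database + '];

    INSERT INTO #object_data (database_name, schema_name, object_name, object_type)
    SELECT
        DB_NAME(),
        s.name,
        o.name,
        o.type_desc
    FROM sys.objects AS o
    INNER JOIN sys.schemas AS s
        ON s.schema_id = o.schema_id
    WHERE o.name LIKE @search_string;';

    EXEC sp_executesql @sql_command, N'@search_string NVARCHAR(250)', @search_string;

    FETCH NEXT FROM database_cursor INTO @current_database;
END

CLOSE database_cursor;
DEALLOCATE database_cursor;
```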

This is a big chunk of TSQL, but despite looking complex, it is only the sum of all of the database-specific TSQL we have already written, tested, and reviewed. When this section of our script is complete, #object_data will contain all server metadata as well as all database metadata for every database on our server. This loop may take a little while to run on a server with a large number of databases or some very sizeable ones, but it will complete and return the results we are looking for, even if it takes a few minutes.

The only task that is left is to return our data from the temp table for our viewing pleasure and then drop the temp table. This is the easy part:
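Something along these lines does the trick; the ORDER BY is purely a matter of preference:

```sql
-- Return everything we have collected; server-level rows (NULL database_name) sort first.
SELECT *
FROM #object_data
ORDER BY database_name,
         object_type,
         object_name;

DROP TABLE #object_data;
```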

This script was divided into four parts in order to explain each section, how it works, and why it is written the way it is. Combine all of them and we can run it in its entirety and review the results. To test it, I will search for the word “Department”, and instruct the script to search for SSRS reports, as well as SSIS packages located in MSDB (as shown in the first section of this script, many pages ago). Here are the results that are returned on my local server:

A total of 68 objects were returned, of which a segment can be seen here. I’ve also left off the latter segment of columns, in order to preserve space. This search returned a wide variety of objects: an agent job, tables, columns, a synonym, indexes, index columns, foreign keys, foreign key columns, default constraints, check constraints, and views. It took a total of seven seconds to run on my local server, which is quite good given that my storage is a set of slow SATA spinning disk drives on a USB NAS.

Results can be filtered down by searching using more specific keywords. Some of the most common searches I have performed are for linked servers, stored procedures, functions, or tables, in an effort to determine their usage. If you are unlucky and are trying to figure out if a stored procedure called “proc” is in use, or a table called “t”, then you’ll have more results to scan through in order to find relevant results, if a search like this is even worth it. Hopefully that conundrum won’t befall you often.

Customization

There’s a wide variety of other metadata available about each of the objects discussed in this article that could be included in a schema search. Feel free to take this code and modify it to include whatever additional metadata is required to make your search experience complete. Documentation on each of the views used in this article is available on MSDN and provides more than enough details to make the customization process easy.

I personally like getting all of my search results back in a single result set for convenience, but if you’d prefer to separate them into multiple batches, there is no harm in doing so. This could allow you to collect more metadata for specific object types, if they held particular interest for you.

In addition, there are many objects not searched here, most of which pertain to unique SQL Server features. This may include replication, resource governor, in-memory OLTP, partitioning, and more. If you use these features and want to include them in a universal search proc, adding them is quite easy! Mirror the structure created in this article and add metadata for additional objects into the temp table as additional steps.

If you’d like an exact search instead of the wildcard search, feel free to comment out the addition of the %’s to either end of the search criteria or add a flag for that option so that you can determine at runtime if it’s desired or not. Lastly, the contents of this script can be stuffed into a stored procedure for easy use later, with the configuration section becoming a set of proc parameters. Whether used as a snippet, stored procedure, or ad-hoc script is up to you and whatever is easiest in your environment.
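If you go the stored procedure route, the wrapper is little more than moving the configuration variables into the parameter list. A hypothetical signature, with names that are illustrative rather than from the original script, might look like this:

```sql
-- Hypothetical wrapper: the body is the full search script from above, with the
-- configuration section replaced by these parameters.
CREATE PROCEDURE dbo.search_all_schema
    @search_string NVARCHAR(250),
    @search_SSRS BIT = 0,
    @search_SSIS_msdb BIT = 0,
    @search_SSIS_disk BIT = 0,
    @pkg_directory NVARCHAR(250) = NULL
AS
BEGIN
    SET NOCOUNT ON;
    -- ...full search script goes here...
END
GO

-- Example call: search every database for objects referencing "Department".
EXEC dbo.search_all_schema @search_string = 'Department', @search_SSRS = 1;
```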

Conclusion

The ability to quickly and easily search SQL Server for object metadata is a common need, and one that can be frustrating to manage manually. The various scripts provided in the earlier half of this article provide simple ways to search for a variety of metadata and can be exceptionally useful when a specific need arises. Many system views are introduced that can be used for schema searches such as this, or for gathering more detailed data on a particular type of object.

In the second part of this article, we put everything together into a single script that can be run in order to return information on many different objects in SQL Server. Feel free to take this script, modify it, and use it to fulfill your own search needs. Tools such as this are time-savers and reduce the amount of busy-work that we are forced to endure on a regular basis. Enjoy!!!

Also, a special thanks goes out to Nick Salvi, a coworker with extensive SSIS/SSRS experience who provided the TSQL that was the basis for searching those parts of SQL Server!



About Ed Pollack

Ed has 20 years of experience in database and systems administration, developing a passion for performance optimization, database design, and making things go faster. He has spoken at many SQL Saturdays, 24 Hours of PASS, and PASS Summit. This led him to organize SQL Saturday Albany, which has become an annual event for New York’s Capital Region. In his free time, Ed enjoys video games, sci-fi & fantasy, traveling, and being as big of a geek as his friends will tolerate.
