Estimating Data Compression ratios for all objects in a database


By: Chad Boyd   |   Last Updated: 2008-09-22   |   Related Tips: More > Compression

One of my favorite features in SQL Server 2008 has been data and backup compression (which I discuss in more technical detail here) - not only because of the actual functionality it brings to the table, but also because of all the technical intricacies it involves and the impact it can have on many other fun topics (fragmentation, storage, internals, etc.). Of course, the functionality is pretty cool too...
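For anyone who hasn't tried the features yet, here's roughly what turning them on looks like (the table, index, database, and file path names below are just placeholders, not anything from my customer's system):

-- Rebuild a table (or heap) with PAGE-level data compression
ALTER TABLE dbo.SalesDetail REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Rebuild a single index with ROW-level data compression
ALTER INDEX IX_SalesDetail_ProductID ON dbo.SalesDetail
    REBUILD WITH (DATA_COMPRESSION = ROW);

-- Take a compressed backup (Enterprise Edition in SQL Server 2008)
BACKUP DATABASE MyDatabase
    TO DISK = N'X:\Backups\MyDatabase.bak'
    WITH COMPRESSION;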

One customer of mine was asking how to get an idea of the level of compression the different flavors of data compression would achieve on all the different structures within their database. Of course, most folks know about the system procedure sp_estimate_data_compression_savings, which exists to provide just that - but this customer wanted to see this type of information for every structure in their database (partitions, indexes, heaps, etc.) and see where they would get the biggest bang for their buck, so to speak.
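To level-set, a single call to the built-in procedure looks like the following (schema and table names are placeholders); it reports the current and estimated sizes, in KB, for the one object you point it at:

-- Estimate PAGE compression savings for every index/partition of one table
EXEC sys.sp_estimate_data_compression_savings
    @schema_name      = N'dbo',
    @object_name      = N'SalesDetail',
    @index_id         = NULL,   -- NULL = all indexes on the table (and the heap, if any)
    @partition_number = NULL,   -- NULL = all partitions
    @data_compression = N'PAGE';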

So, I went to work putting together a fairly simple procedure that runs through a database and executes sp_estimate_data_compression_savings for each partition, for each type of compression the partition isn't currently using (i.e., if the partition is not compressed, we want to see estimates for ROW and PAGE compression; if the partition is already ROW compressed, we show estimates for NONE and PAGE compression - NONE being un-compression in that case). We also wanted to be able to filter on specific objects and/or a threshold for the minimum partition size worth checking.
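The script below is just a rough sketch of that looping logic so you can see the general idea - it is not the actual sp_estimate_data_compression_savings_all code, and it leaves out the object filtering and error handling. It walks sys.partitions, skips anything smaller than a size threshold, and captures the estimator's output for each compression setting the partition isn't already using (the temp table columns assume the procedure's standard 8-column result set in SQL Server 2008):

SET NOCOUNT ON;

-- Only look at partitions of at least this size (value is illustrative)
DECLARE @min_size_KB bigint = 1024;

IF OBJECT_ID('tempdb..#estimates') IS NOT NULL DROP TABLE #estimates;
CREATE TABLE #estimates (
    object_name           sysname,
    schema_name           sysname,
    index_id              int,
    partition_number      int,
    current_size_KB       bigint,
    requested_size_KB     bigint,
    sample_current_KB     bigint,
    sample_requested_KB   bigint,
    requested_compression nvarchar(60) NULL
);

DECLARE @schema sysname, @object sysname, @index_id int,
        @partition int, @current nvarchar(60), @target nvarchar(60);

-- All user partitions above the size threshold, with their current compression setting
DECLARE part_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT s.name, o.name, p.index_id, p.partition_number, p.data_compression_desc
    FROM sys.partitions p
    JOIN sys.objects  o ON o.object_id = p.object_id
    JOIN sys.schemas  s ON s.schema_id = o.schema_id
    JOIN sys.dm_db_partition_stats ps ON ps.partition_id = p.partition_id
    WHERE o.is_ms_shipped = 0
      AND ps.used_page_count * 8 >= @min_size_KB;

OPEN part_cur;
FETCH NEXT FROM part_cur INTO @schema, @object, @index_id, @partition, @current;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Estimate each compression setting the partition is NOT already using
    DECLARE target_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT t.setting
        FROM (VALUES (N'NONE'), (N'ROW'), (N'PAGE')) AS t(setting)
        WHERE t.setting <> @current;

    OPEN target_cur;
    FETCH NEXT FROM target_cur INTO @target;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        INSERT #estimates (object_name, schema_name, index_id, partition_number,
                           current_size_KB, requested_size_KB,
                           sample_current_KB, sample_requested_KB)
        EXEC sys.sp_estimate_data_compression_savings
             @schema, @object, @index_id, @partition, @target;

        -- Tag the rows we just captured with the compression setting that was estimated
        UPDATE #estimates
           SET requested_compression = @target
         WHERE requested_compression IS NULL;

        FETCH NEXT FROM target_cur INTO @target;
    END
    CLOSE target_cur; DEALLOCATE target_cur;

    FETCH NEXT FROM part_cur INTO @schema, @object, @index_id, @partition, @current;
END
CLOSE part_cur; DEALLOCATE part_cur;

-- Biggest estimated savings first
SELECT *, current_size_KB - requested_size_KB AS estimated_savings_KB
FROM #estimates
ORDER BY estimated_savings_KB DESC;

Sorting the results by the estimated savings at the end is what surfaces the "biggest bang for the buck" partitions first.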

What came out was sp_estimate_data_compression_savings_all, and I figured we may as well be nice and share it with everyone. There's no rocket science here or anything, but it's a pretty cool procedure nonetheless. Of course, we wouldn't recommend you run this on large production systems during peak hours or anything like that, but it is perfectly well suited for scanning non-production systems to figure out where to concentrate your further investigation.

And, as a final side note, it also uses some of the simple T-SQL enhancements that only work in SQL Server 2008 (compound assignment, inline initialization, etc.), which I usually exclude from my system procedures for backward compatibility - but since this applies only to 2008 anyhow, I could use them. Makes for much cleaner code that's easier to write, that's for sure...
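For anyone who hasn't played with them yet, these are the kinds of 2008-only shortcuts in question:

-- Inline initialization at DECLARE time (new in SQL Server 2008)
DECLARE @total_KB bigint = 0;

-- Compound assignment operators (+=, -=, *=, /=, %=) - also new in 2008
SET @total_KB += 512;

SELECT @total_KB AS total_KB;   -- returns 512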

Enjoy!

About the author
Chad Boyd is an Architect, Administrator and Developer working with technologies such as SQL Server, .NET, and Windows Server.
