Monitoring Free Space for SQL Server In-Memory Checkpoint Files



Problem

In order to use In-Memory tables in SQL Server, you have to provide an In-Memory filegroup that points to one or more containers (subdirectories). Books Online provides the following guidance regarding how much free disk space to provide for In-Memory tables:

  • When configuring storage, you must provide free disk space that is four times the size of durable memory-optimized tables.
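For example, under this guidance, a database with 2 GB of durable memory-optimized tables would call for at least 8 GB of free disk space.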

Books Online further states the following about providing disk space for In-Memory tables:

  • A memory-optimized table can be durable or can be non-durable. You only need to configure storage for durable memory-optimized tables.

Based on this guidance, if I do not use durable tables, I do not need to provide any space at all for the checkpoint files. This guidance turned out to be wrong.

Solution

My first foray into using In-Memory tables was to relieve tempdb contention we were experiencing. Our production processes already used a lot of table types for passing data in via table-valued parameters. The quickest way for us to introduce In-Memory tables was to replace the existing table types with memory-optimized table types.

Books Online says that memory-optimized table types “are memory-optimized user tables, the schema of which is persisted on disk similar to other user tables.” Books Online goes on to say that a SCHEMA_ONLY value for the DURABILITY option “indicates that the table is non-durable.”

Since I am only using memory-optimized table types, which are non-durable, the guidance from Microsoft indicates that I do not need to allocate free disk space for this scenario. Nonetheless, I chose to play it safe and created a 5 GB volume for the In-Memory filegroup.
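For reference, here is a minimal sketch of the kind of setup involved: a memory-optimized filegroup with one container, plus a memory-optimized table type standing in for a classic table type. The database name, path, and type definition are hypothetical, loosely modeled on the sample output shown later in this tip.

-- Hypothetical names and paths; adjust for your environment
Alter Database IMScratch
   Add Filegroup IMScratch_InMemory Contains MEMORY_OPTIMIZED_DATA;
Alter Database IMScratch
   Add File (Name = 'IMScratch_IM01',
             FileName = 'c:\MountPoints\IMScratch\data\IMScratch_IM01')
   To Filegroup IMScratch_InMemory;
Go

-- A memory-optimized table type; like all table types, it is non-durable
Create Type dbo.OrderIDList As Table (
   OrderID int Not Null,
   Index IX_OrderID Nonclustered Hash (OrderID) With (Bucket_Count = 1024))
With (Memory_Optimized = On);
Go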

That guidance is patently incorrect. In fact, it is so incorrect that you do not even need to create any In-Memory objects for the checkpoint files to grow and grow until they run out of space.

Recently, the CSS SQL Server Engineers blog posted an article titled Why am I getting so many checkpoint files when I have In-Memory OLTP enabled? Around the same time, someone started a discussion about the exact situation they were experiencing when restoring a database from their production environment to the development environment where regular backups are not performed. In this case, no In-Memory objects had been created yet, but the development server had run out of disk space due to the number of checkpoint files. The CSS Engineers blog post includes a simple reproduction that you can run for yourself if you want to see it in action.

At the time of the discussion, I had not yet found the blog post from the CSS Engineers, and I had just experienced an issue where the In-Memory checkpoint files ran out of disk space in the middle of the night. It was no accident that it happened in the middle of the night: the issue occurred while the full backup was running. If you back up the log while a full backup is running, it is essentially a copy-only backup because the log backup cannot truncate the log until the full backup completes. Despite log backups running every 15 minutes, the log went 30 minutes without being truncated, and that was enough to run out of disk space.
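If you suspect the same interplay in your environment, a hedged diagnostic sketch against the msdb backup history can show which log backups started while a full backup of the same database was still running:

-- Find log backups that started while a full backup was in flight
Select f.database_name,
       f.backup_start_date  As FullBackupStart,
       f.backup_finish_date As FullBackupFinish,
       l.backup_start_date  As LogBackupStart
From msdb.dbo.backupset As f
Inner Join msdb.dbo.backupset As l
   On l.database_name = f.database_name
   And l.type = 'L'  -- log backups
   And l.backup_start_date Between f.backup_start_date And f.backup_finish_date
Where f.type = 'D'   -- full database backups
Order By f.backup_start_date Desc;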

We immediately doubled the size of the volume to 10 GB, and then during work hours, I started setting up a maintenance task to check the free disk space and raise an alert if it dropped below a certain threshold.

Monitor Available Disk Space for In-Memory Filegroups

My team already had monitoring in place to check the available space in database files and alert us if any of them exceeded the defined threshold. In-Memory disk usage is different because it does not use space in a defined file; it consumes disk space in an ad-hoc manner within a container. Checking free space in a database file is easy. Checking free space in an In-Memory filegroup poses different challenges; however, it can be broken down into three tasks:

  • Check disk space allocated to the volume for the In-Memory filegroup
  • Measure checkpoint file space usage
  • Send an alert if the space usage exceeds the specified threshold

Check Disk Space Allocated to the Volume for the In-Memory Filegroup

I knew I wanted this process to run as a SQL Server Agent Job, so every step needed to be executable from SQL Server. I immediately discounted using xp_cmdshell and the OLE Automation (sp_OAxxx) procedures. I'm by no means saying that there are no valid reasons to use those procedures; I simply made a design choice to use newer technologies instead of older ones. PowerShell is a tried-and-true method for getting space information about drives. Where I work, we use mount points for just about everything in SQL Server, and PowerShell handles them quite well.

The challenge here is that the SQL Server implementation of PowerShell uses an older version of PowerShell, and not everything you would expect to be supported is supported. The common workaround is to specify a CmdExec step in the SQL Server Agent Job, execute PowerShell.exe, and pass the script to it using the -Command parameter. With the -Command parameter, you can pass in either a PowerShell script as a string or a path to a PowerShell file.

The script turned out to be a little too complex to get working as a passed-in string. I'm not a PowerShell expert, but I enlisted help from people who are, and even they couldn't get it to work correctly. So I gave up and went with their alternate suggestion: save the script to a file and pass the file path and name in the -Command parameter.

The script calls Get-WmiObject (alias gwmi) with the win32_volume class to get space usage for all drives, returning the drive label (logical drive name), capacity in GB, and caption (path). The tricky part was getting that data from the script into a table in the database where the monitoring process could use it.

I created a table in our DBA management database to temporarily hold the drive information and a stored procedure for the script to call to insert the data into the table. Note that our objects are created in a custom schema, dbaMGMT. If you plan to use these scripts, be sure to either create that schema or change the references to the correct schema for your environment.

The table schema is below:

Drop Table If Exists dbaMGMT.DriveSpace;
Go
 
Create Table dbaMGMT.DriveSpace (
   Label varchar(255),
   CapacityGB decimal(9,2) null,
   Caption varchar(255));
Go
   

The following procedure is called by the PowerShell script:

Drop Procedure If Exists dbaMGMT.CheckDriveSpace;
Go
 
Create Procedure dbaMGMT.CheckDriveSpace
   @Label varchar(255),
   @CapacityGB decimal(9,2) = Null,
   @Caption varchar(255)
As
 
Insert Into dbaMGMT.DriveSpace
Select @Label As Label,
   @CapacityGB As CapacityGB,
   @Caption As Caption;
Go
   

So far, everything is very basic. The table and procedure match the PowerShell output from the script below. Please note that the call to invoke-sqlcmd does not specify the server to which it should connect; by default, it connects to the local default instance. If you are using a remote SQL Server or a non-default instance, you may need to add the -ServerInstance parameter to the call.

# Get the label, capacity (in GB), and caption (path) for every volume,
# skipping volumes that have no label
$Files = gwmi win32_volume |
    select label,
        @{LABEL='CapacityGB';EXPRESSION={"{0:F2}" -f ($_.capacity/1GB)}},
        caption |
    Where {$_.label -ne $null}

# Insert each volume's details into the DBA management database
ForEach ($File in $Files) {
    $FileLabel = $File.Label
    $FileCapacityGB = $File.CapacityGB
    $FileCaption = $File.Caption
    $Query = @"
Exec dbaMGMT.CheckDriveSpace @Label = '$FileLabel',
    @CapacityGB = '$FileCapacityGB',
    @Caption = '$FileCaption';
"@
    invoke-sqlcmd -Database DBAManagement -Query $Query
}
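One housekeeping detail worth calling out: the script inserts a fresh set of rows on every run, so the DriveSpace table should be cleared at the start of each collection cycle so stale rows do not skew the alert. Something as simple as this works:

-- Clear out the previous run's drive information before collecting anew
Truncate Table dbaMGMT.DriveSpace;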
   

In the SQL Server Agent Job, I use the following command to execute the PowerShell script:

Powershell.exe -command "& P:\PowerShellScripts\CheckFileSpace\CheckFileSpace.ps1"
   

Now that I know how much disk space is allocated to the In-Memory filegroup container, the next step is to check how much space is being used by In-Memory checkpoint files.

Measure Checkpoint File Space Usage

Fortunately, checkpoint file space usage can be measured using the Dynamic Management View (DMV) sys.dm_db_xtp_checkpoint_files. This DMV lists every checkpoint file in the In-Memory filegroup's containers (a database can have only one In-Memory filegroup, but multiple containers). The raw output is a lot of information that isn't very meaningful on its own, so let's aggregate the data to get a better picture of checkpoint file usage.

Here are some queries that will give you better insight. The first query breaks the file information down by file type.

Select CF.container_id As [File ID],
       CF.file_type_desc As [File Type],
       Cast(Round(Sum(file_size_in_bytes)/1024.0/1024/1024, 3) As Decimal(9,3)) As [File Size GB],
       Min(DF.[name]) As [Logical Name],
       Min(DF.physical_name) As [Container]
From sys.dm_db_xtp_checkpoint_files As CF
Inner join sys.database_files As DF On DF.file_id = CF.container_id
Group By DF.[file_id], CF.container_id, CF.file_type_desc;
   

The results, run against a test In-Memory table, look like this:

File ID   File Type   File Size GB   Logical Name     Container
65537     DATA        19.125         IMScratch_IM01   c:\MountPoints\IMScratch\data\IMScratch_IM01
65537     DELTA       1.195          IMScratch_IM01   c:\MountPoints\IMScratch\data\IMScratch_IM01
65537     FREE        0.633          IMScratch_IM01   c:\MountPoints\IMScratch\data\IMScratch_IM01
65537     ROOT        1.594          IMScratch_IM01   c:\MountPoints\IMScratch\data\IMScratch_IM01

Perhaps you would rather get all that information in a single row per container rather than multiple rows. If you have several containers, multiple rows per container could be unwieldy. Here is a version of the query that pivots the data.

Select FileID As [File ID],
       [Name] As [Logical Name],
       Left([FileName], Len([FileName]) - CharIndex('\', Reverse([FileName])) + 1) As [Base Container],
       [DATA] + [DELTA] + [FREE] + [ROOT] As [Total Size GB],
       [DATA] + [DELTA] + [ROOT] As [Used Space GB],
       [FREE] As [Free Space GB],
       ([DATA] + [DELTA] + [ROOT]) / ([DATA] + [DELTA] + [FREE] + [ROOT]) As [Percent Used],
       DB_ID() As [Database ID]
From (Select CF.container_id As FileID,
             CF.file_type_desc As FileType,
             Cast(Round(Sum(file_size_in_bytes)/1024.0/1024/1024, 3) As Decimal(9,3)) As FileSizeGB,
             Min(DF.[name]) As [Name],
             Min(DF.physical_name) As [FileName]
      From sys.dm_db_xtp_checkpoint_files As CF
      Inner join sys.database_files As DF On DF.file_id = CF.container_id
      Group By DF.[file_id], CF.container_id, CF.file_type_desc) As Base
Pivot (Sum(FileSizeGB)
       For FileType In ([DATA], [DELTA], [FREE], [ROOT])) As Pvt;
   

The output for this query would look like the following (note that Percent Used is expressed as a fraction, so 0.971925 means roughly 97.2% of the checkpoint file space is in use):

File ID   Logical Name     Base Container                   Total Size GB   Used Space GB   Free Space GB   Percent Used   Database ID
65537     IMScratch_IM01   c:\MountPoints\IMScratch\data\   22.547          21.914          0.633           0.971925       9

For the purposes of monitoring, I am not concerned with the different types of checkpoint files. I only need to see the space usage by all of the checkpoint files.

Select CF.container_id As [File ID],
       Cast(Round(Sum(file_size_in_bytes)/1024.0/1024/1024, 3) As Decimal(9,3)) As [File Size GB],
       Min(DF.[name]) As [Logical Name],
       Min(DF.physical_name) As [Container]
From sys.dm_db_xtp_checkpoint_files As CF
Inner join sys.database_files As DF On DF.file_id = CF.container_id
Group By DF.[file_id], CF.container_id;
   

This is the expected output for this query:

File ID   File Size GB   Logical Name     Container
65537     22.547         IMScratch_IM01   c:\MountPoints\IMScratch\data\IMScratch_IM01

I plan to call the query from the DBA management database for each database that has an In-Memory filegroup. I will need to build the query dynamically, passing the database name into the query and executing it. That looks like the script below:

Set @SQL = N'Select N''' + @DBName + N''' As DBName,
   Convert(Decimal(9,3), Round(Sum(file_size_in_bytes)/1024.0/1024/1024,3)) As SizeGB,
   Left(DF.physical_name, Len(DF.physical_name) - CharIndex(''\'', Reverse(DF.physical_name)) + 1) As FileNameAndPath
From ' + QuoteName(@DBName) + N'.sys.dm_db_xtp_checkpoint_files As CF
   Inner join ' + QuoteName(@DBName) + N'.sys.database_files As DF On DF.file_id = CF.container_id
   Group By DF.physical_name;';
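For context, here is a hedged sketch of how that dynamic query might be driven. The @FileSizes table variable and the cursor loop are my reconstruction of the surrounding plumbing (the attached deployment script contains the actual implementation); databases without an In-Memory filegroup simply return no rows.

-- Sketch: collect checkpoint file usage for every online user database
Declare @FileSizes Table (
   DBName sysname,
   SizeGB decimal(9,3),
   FileNameAndPath varchar(255));
Declare @DBName sysname,
        @SQL nvarchar(max);

Declare DBList Cursor Local Fast_Forward For
   Select [name] From sys.databases
   Where state_desc = 'ONLINE' And database_id > 4;

Open DBList;
Fetch Next From DBList Into @DBName;
While @@Fetch_Status = 0
Begin
   -- Build the per-database query shown above
   Set @SQL = N'Select N''' + @DBName + N''' As DBName,
      Convert(Decimal(9,3), Round(Sum(file_size_in_bytes)/1024.0/1024/1024,3)) As SizeGB,
      Left(DF.physical_name, Len(DF.physical_name) - CharIndex(''\'', Reverse(DF.physical_name)) + 1) As FileNameAndPath
   From ' + QuoteName(@DBName) + N'.sys.dm_db_xtp_checkpoint_files As CF
      Inner Join ' + QuoteName(@DBName) + N'.sys.database_files As DF On DF.file_id = CF.container_id
      Group By DF.physical_name;';

   Insert Into @FileSizes (DBName, SizeGB, FileNameAndPath)
   Exec sys.sp_executesql @SQL;

   Fetch Next From DBList Into @DBName;
End
Close DBList;
Deallocate DBList;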
   

I will incorporate this dynamic query into the final step, where I check the checkpoint file space used against the size of the volume hosting the container and alert if it exceeds the defined threshold.

Send an Alert if the Space Usage Exceeds the Specified Threshold

The final step of this process is to use the checkpoint file space usage data and the volume size data to determine whether the defined threshold has been exceeded. This involves building a table in HTML and then sending it out using Database Mail. The key aspects are setting the email table header, running a query to generate the table body, and setting the table footer. Then, of course, the mail is ready to be sent.

Table header

-- Message body header
SET @Body = N'<html>
<table border="1" style="font-family:arial">
    <tr>
        <th style="white-space:nowrap;text-align:right">Database</th>
        <th style="white-space:nowrap;text-align:left">File Name and Path</th>
        <th style="white-space:nowrap;text-align:left">Free Space %</th>
    </tr>';
   

Table content

-- Message body content
SELECT @Body = @Body + '
   <tr>
      <td style="white-space:nowrap;text-align:right">' + DBName + '</td>
      <td style="white-space:nowrap;text-align:left">' + FileNameAndPath + '</td>
      <td style="white-space:nowrap;text-align:left">' + Cast(SizePercent As varchar) + '</td>
   </tr>'
From (Select FS.DBName, FS.SizeGB, DV.CapacityGB, FileNameAndPath,
      -- Free space % = 100 minus the percentage of the volume used by checkpoint files
      SizePercent = 100 - Convert(Decimal(9,3), Round(SizeGB / DV.CapacityGB * 100, 3))
   From @FileSizes As FS
   Inner Join dbaMGMT.DriveSpace As DV On DV.Caption = FS.FileNameAndPath) As dt
Where SizePercent <= @Threshold;
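To sanity-check the math: on a 10 GB volume holding 8.5 GB of checkpoint files, SizePercent works out to 100 - (8.5 / 10 * 100) = 15, which falls at or below the default 20% threshold, so that row would land in the alert email.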
   

Table footer

-- Message body close
Set @Body = @Body + '
   </table></html>';
   

Send the mail

EXEC msdb.dbo.sp_send_dbmail
   @profile_name  = @profileName,
   @recipients    = @Recipients,
   @subject       = @subject,
   @body          = @Body,
   @body_format   = 'html',
   @importance    = 'Normal',
   @from_address  = '[email protected]';
   
Next Steps

The full deployment script is attached and contains all code needed to create the tracking table, procedures, and SQL Server Agent Job. The PowerShell script is also included. As with any scripts downloaded from the internet, you should inspect the code before executing it and test it out in a non-production environment before you deploy it to production.

Beyond running the script, there are a few things you will need to do to deploy this process to your environment. The order of the following steps is not important, but they will help you have a successful deployment.

  • Make sure that Database Mail is enabled and configured with a profile that you can use in the script.
  • Modify the scripts to reflect your environment including the database the script is in, FROM and TO email addresses, and the path you use for the PowerShell script.
  • Customize the threshold configuration in the job if you don’t want to use the default setting (20%).
  • Adjust the job schedule if you want it to run more or less often than what is specified in the scripted schedule (every 5 minutes).
  • The SQL Server Agent Job script will configure the job to send an alert to an Operator named DBATeam if the job fails. Either create this Operator for your team or change the name to an existing Operator; a sketch for creating it follows this list.
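If the DBATeam Operator does not yet exist, a minimal sketch for creating it would look like the following (the email address is a placeholder to replace with your team's address):

-- Create the Operator that the job's failure notification expects
EXEC msdb.dbo.sp_add_operator
   @name = N'DBATeam',
   @email_address = N'dba.team@example.com';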

Happy hunting (for drive space alerts)!






About the author
Robert Davis is a SQL Server Certified Master, MVP, and an experienced DBA, evangelist, speaker, writer, and trainer.




