Executing large scripts in SQL Server Management Studio with Insufficient Memory Failures



Have you ever tried to execute a large script file in SQL Server Management Studio (SSMS) and received this error: "Cannot execute script. Insufficient memory to continue the execution of the program. (mscorlib)"?

The problem is that SSMS cannot handle very large script files, and this limitation exists even in SQL Server 2014. In this tip we describe how to overcome the problem by using the SQLCMD utility from the command line instead of SSMS.


Let's illustrate a situation in which executing a large T-SQL script in SQL Server Management Studio (SSMS) causes the above-mentioned error.

First we will create a TestDB database and a TestTable table:

USE master
GO

--Creating the database
CREATE DATABASE TestDB
GO

USE TestDB
GO

--Creating the TestTable
CREATE TABLE TestTable
(
	ID INT,
	Value INT
)
GO
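To reproduce the scenario, we need a script file large enough to trip SSMS. A minimal Python sketch for generating one is shown below; the output file name, row count, and single-row INSERT format are illustrative assumptions, not the exact file used in the article:

```python
# Sketch: generate a large T-SQL script of single-row INSERT statements.
# The file name and row count are assumptions; ~1,000,000 rows produces a
# file of roughly 45 MB, comparable to the 55 MB file used in this tip.
rows = 1_000_000

with open("InsertData.sql", "w") as f:
    f.write("USE TestDB\n")
    f.write("GO\n")
    for i in range(1, rows + 1):
        f.write(f"INSERT INTO TestTable (ID, Value) VALUES ({i}, {i})\n")
```

Opening the resulting file in SSMS and executing it should reproduce the error on most machines.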

SSMS Fails to Execute Large Scripts

Now we will execute a T-SQL script file in Management Studio to load data into the TestTable table. In this experiment we used SQL Server 2014 SP2, and the size of the script file is 55 MB. Executing the script produces this error:

"Cannot execute script. Insufficient memory to continue the execution of the program. (mscorlib)"

The file is simply too large for SSMS to handle, so the error arises.

Using SQLCMD to run large T-SQL Scripts

As an alternative to SSMS, we will use the SQLCMD utility from the command line to execute this script. SQLCMD is a utility that allows us to execute T-SQL statements, script files, and stored procedures from the command line. Note that unlike SSMS, which uses the Microsoft .NET Framework SqlClient for execution, SQLCMD run from the command line uses the ODBC driver.

To specify an input file and run it on our instance we use the following syntax:

sqlcmd -S <server_name>\<instance_name> -i <input_file>

Suppose that our server name is DBSERVER, the instance name is TESTINSTANCE, and the file name is "InsertData.sql", located on the "D" drive. Our command will be the following:

sqlcmd -S DBSERVER\TESTINSTANCE -i "D:\InsertData.sql"
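The same invocation can also be scripted, for example from a deployment tool. A minimal Python sketch is shown below; the server, instance, and file path are the example values from above, and the actual `subprocess.run` call is commented out because it requires a reachable SQL Server instance:

```python
import subprocess


def build_sqlcmd_args(server, instance, input_file):
    """Assemble the sqlcmd argument list for running a script file
    against a named instance: -S server\\instance -i input_file."""
    return ["sqlcmd", "-S", f"{server}\\{instance}", "-i", input_file]


args = build_sqlcmd_args("DBSERVER", "TESTINSTANCE", r"D:\InsertData.sql")
# subprocess.run(args, check=True)  # requires a reachable SQL Server instance
```

Running sqlcmd this way avoids the SSMS editor entirely, so the file size is limited only by the server, not the client tool.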

Pressing "Enter" executes the file. After completion we can see that the script finished successfully and the table is filled:


We can check in SSMS that the table contains the data:

SELECT COUNT(*) AS RowsCount FROM TestTable


SQL Server Management Studio can be unusable for executing large script files. However, we sometimes need to handle such files (for example, when loading a large amount of data or executing create database scripts generated from large databases), and using the SQLCMD utility from the command prompt allows us to work around this limitation.
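If staying inside SSMS is a requirement, another workaround (suggested by the discussion in the comments below) is to split the large script into several smaller files at GO batch boundaries and run each one separately; note that merely adding GO inside a single huge file does not help. The sketch below is a hypothetical helper, with the chunk size and `.partN.sql` naming convention chosen for illustration:

```python
def split_script(path, batches_per_chunk=1000):
    """Split a large T-SQL script into smaller files at GO boundaries
    so each chunk stays within what SSMS can open and execute.
    Returns the list of chunk file names (naming convention assumed)."""
    chunks, current, n_batches = [], [], 0
    with open(path) as src:
        for line in src:
            current.append(line)
            if line.strip().upper() == "GO":  # end of a batch
                n_batches += 1
                if n_batches == batches_per_chunk:
                    chunks.append(current)
                    current, n_batches = [], 0
    if current:  # remainder after the last full chunk
        chunks.append(current)
    names = []
    for i, chunk in enumerate(chunks, 1):
        name = f"{path}.part{i}.sql"
        with open(name, "w") as out:
            out.writelines(chunk)
        names.append(name)
    return names
```

Each resulting file can then be opened and executed in SSMS individually, at the cost of running several scripts instead of one.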

Next Steps

Read more about SQLCMD:


About the author
MSSQLTips author Sergey Gigoyan Sergey Gigoyan is a database professional with more than 10 years of experience, with a focus on database design, development, performance tuning, optimization, high availability, BI and DW design.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Friday, November 22, 2019 - 12:19:09 PM - Wim Cossement Back To Top (83167)

Thanks for the tip!
I had an uncompressed SQL file of about 1.8 GB, exported from a higher version. I do however have 16 GB RAM, so I was a bit surprised by this warning.
SSMS 17.9 here. It's a decent free product but some bugs are annoying!
And the latest version on my Windows 10 has some bugs as well.

Friday, August 31, 2018 - 1:19:25 AM - Suresh Back To Top (77345)

Why is it not asking for a user name and password? Is it a loophole?

Sunday, January 31, 2016 - 1:14:25 PM - Aaron Bertrand Back To Top (40557)

Brendt, that syntax was introduced in SQL Server 2008, not SQL Server 2012.


Saturday, January 30, 2016 - 3:58:53 PM - Sergey Gigoyan Back To Top (40555)


You can do that by using "-d" command-line option:

sqlcmd -d <YourDatabaseName> ....


Saturday, January 30, 2016 - 3:51:27 PM - Sergey Gigoyan Back To Top (40554)

Thank you for reading and comments,



Agree with Jim, putting a 'go' command after every 1000th line does not solve the problem; the same error appears (tested on SQL Server 2014 SP2, file size: 112 MB).


Saturday, January 30, 2016 - 6:50:45 AM - Muthukumar Back To Top (40553)

Hi Sergey Gigoyan,

Good Article. Here is my question, How to choose the required DB name to execute my script through SQLCMD ?


Friday, January 29, 2016 - 5:12:35 PM - Aaron Bertrand Back To Top (40549)


Nice, but I would recommend SET NOCOUNT ON in your .sql script, surely all of those 1 row(s) affected in the sqlcmd output window are not very useful. :-)

Friday, January 29, 2016 - 2:08:50 PM - jeff_yao Back To Top (40547)

StackOverflow (SO) has better and more comprehensive suggestions regarding this issue:


Friday, January 29, 2016 - 12:55:24 PM - Brendt Hess Back To Top (40546)

 For the specific case given, using the enhanced SQL Server 2012+ VALUES statement will help significantly:

INSERT INTO TestTable (ID, Value) VALUES (1,1), (2,2), (3,3), .... (N, N)

You can also use UNION statements to reduce your overhead:


In both cases, it is still possible to overrun the memory limits, but both of these forms are substantially less memory intensive than the default method of multiple INSERT statements.

Nonetheless, a good, concise solution.

Friday, January 29, 2016 - 10:29:13 AM - Jim Back To Top (40544)


Anon: GO won't make a difference. 


Friday, January 29, 2016 - 8:09:05 AM - Junior Galvão - MVP Back To Top (40542)



Great post, simple and didactic.




Friday, January 29, 2016 - 1:40:38 AM - Anon Back To Top (40536)

Another way: put a `go` command after every 1000 lines.
