Non-Standard Delimiters for Columns and Rows Using SQL Server Bulk Insert

By: Tim Smith   |   Updated: 2018-02-19   |   Comments (1)   |   Related: More > Import and Export


We're new to the bulk insert command in T-SQL and we have files that specify new rows and columns without using a common structure, like commas or vertical bars. We also sometimes need to insert a bulk of data from a file without specifying any row or column delimiters. Can we use this command in these cases, or should we come up with a dynamic direct insert by wrapping the data in apostrophes?


The bulk insert command in SQL Server allows us to configure how we want to parse data from files to fit our data schema. It also provides tools we can use to skip a first or last row, in case we receive files whose first or last line or two is meaningless to us or does not fit our data structure.

In this tip, we'll look at a few examples with file formats we'll seldom see and provide a clear example of how useful this tool can be in SQL Server for obtaining data from delimited files.

Text Files for Input

We'll create three text files that we will insert into a table.

In the first file, we'll create three lines with the below information.  This will be saved as tsp1.txt.


In the second file, we'll create the file with the below one line.  This will be saved as tsp2.txt.

Finally, we'll create the third file with a load of information in the form of a sentence pasted together at least ten times (you can copy and paste more if you want to test more data). This will be saved as tsp3.txt.

The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs. The quick brown fox jumped over the lazy dogs.

SQL Tables for Data Storage

Now, we'll create two tables: the first table we'll use for the first two files, and the second table for the last file. The first table will have five tiny integer columns, matching the two files of five integers separated by different characters. These files also separate new rows differently: one file separates rows with a new line character, while the other separates rows with a comma. In the first table, each column is named Column plus its ordinal number. In the second table, we have only one column, allowing the maximum number of varchar characters.

CREATE TABLE tbIns1(
	ColumnOne TINYINT,
	ColumnTwo TINYINT,
	ColumnThree TINYINT,
	ColumnFour TINYINT,
	ColumnFive TINYINT
)

CREATE TABLE tbIns2(
	ColumnData VARCHAR(MAX) -- column name assumed; the tip only specifies one varchar(max) column
)


Using Bulk Insert

With these tables created, we will now insert the first file's data into the first table with five tiny integer columns. We specify that the column terminator is a t and the row terminator is a new line character (0x0a is the hexadecimal code for the line feed character).

BULK INSERT tbIns1
FROM 'E:\ETLFiles\tsp1.txt'
WITH (
     FIELDTERMINATOR = 't'
     ,ROWTERMINATOR = '0x0a'
)

SELECT *
FROM tbIns1
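The terminator pair above can be sketched outside SQL Server as well. The below Python snippet simulates the split; the integer values are assumptions (the tip shows the file contents as an image), but the separators match the bulk insert command: a t between columns and a line feed (0x0a) between rows.

```python
# Assumed contents of tsp1.txt: three rows of five integers, 't'-separated.
raw = "1t2t3t4t5\n6t7t8t9t10\n11t12t13t14t15"

# ROWTERMINATOR splits the file into rows; FIELDTERMINATOR splits each row
# into the five fields that map to tbIns1's five TINYINT columns.
rows = [line.split("t") for line in raw.split("\n")]
print(rows[0])  # ['1', '2', '3', '4', '5']
```

Note there is no trailing t on each line; a trailing t would produce a sixth, empty field that has no matching column.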

We see that the bulk insert code inserted the correct number of rows and columns of data: five columns of tiny integers across three rows. Without specifying a first row, T-SQL defaults to starting the insert at the first row of the file. We'll run the below code, truncating the table first and then specifying the starting row; as an example, note how it skips the first row. This will be useful when we have character-separated files that have headers in the first row.


TRUNCATE TABLE tbIns1

BULK INSERT tbIns1
FROM 'E:\ETLFiles\tsp1.txt'
WITH (
     FIELDTERMINATOR = 't'
     ,ROWTERMINATOR = '0x0a'
     ,FIRSTROW = 2
)

SELECT *
FROM tbIns1

Without specifying the first row, T-SQL will default to inserting everything from the file. This is important because we can specify both the first and last row for files that may have headers or end-of-file lines with data we want to skip (they may throw an error if they don't match our data schema).
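A rough Python analogue of the FIRSTROW and LASTROW behavior looks like the below; the header line is an assumed example (the tip's first file has no header).

```python
# Assumed file with a header line that does not match the five TINYINT columns.
raw = "ColAtColBtColCtColDtColE\n1t2t3t4t5\n6t7t8t9t10"
lines = raw.split("\n")

# BULK INSERT numbers rows starting at 1, so FIRSTROW = 2 skips the header
# and LASTROW = 3 stops after the third physical row.
first_row, last_row = 2, 3
kept = lines[first_row - 1:last_row]
print(kept)  # ['1t2t3t4t5', '6t7t8t9t10']
```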

In the next example, we'll truncate our table and insert the second file. In the second file, new rows are not specified by new lines: each row of data is terminated by a comma (thus a comma separates the rows), while a new column appears after each period. While these types of files are rare, we will sometimes run into files where a new line is not the row terminator (the end of the row).


TRUNCATE TABLE tbIns1

BULK INSERT tbIns1
FROM 'E:\ETLFiles\tsp2.txt'
WITH (
     FIELDTERMINATOR = '.'
     ,ROWTERMINATOR = ','
)

SELECT *
FROM tbIns1
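The second file's parsing can be sketched the same way. The values below are assumptions (the file contents are shown as an image in the tip), but the delimiters match FIELDTERMINATOR = '.' and ROWTERMINATOR = ',':

```python
# Assumed contents of tsp2.txt: one physical line encoding three rows,
# with commas ending rows and periods ending columns.
raw = "1.2.3.4.5,6.7.8.9.10,11.12.13.14.15,"

# The trailing comma terminates the last row, so drop the empty remainder
# before splitting each row into its five fields.
rows = [chunk.split(".") for chunk in raw.split(",") if chunk]
print(rows)
```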

In the final example, we'll insert the ten sentences into a table with a single column of maximum varchar characters. Notice that we don't specify any delimiter, as we want to insert all the data into one column. If we had a delimiter, like the above examples, we would specify it. In this case, we want all the data in one row, so we don't specify anything. All the data is inserted into the row:

BULK INSERT tbIns2
FROM 'E:\ETLFiles\tsp3.txt'

SELECT *
FROM tbIns2

When inserting data and thinking about what separates columns and rows, it's important to remember that with bulk insert the FIELDTERMINATOR option specifies what separates the columns (also known as fields), while ROWTERMINATOR specifies what separates the rows. In the third example, we didn't separate any rows; in rare cases, we may insert a large block of data that we intend to parse later, or that is designed for output through an application in its current form (think of a disclaimer or a block of HTML code).
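The no-delimiter case can be sketched as well. Strictly speaking, bulk insert falls back to its default terminators (tab for fields, a new line for rows) when none are given, but since the third file is one long line containing neither, nothing splits and the whole text lands as a single value:

```python
# The third file: one long line of repeated sentences, no tabs or line breaks.
sentence = "The quick brown fox jumped over the lazy dogs. "
raw = sentence * 10

# With nothing to split on, the whole text becomes one row with one field,
# matching tbIns2's single varchar(max) column.
rows = [raw]
print(len(rows), rows[0].count("dogs."))  # 1 10
```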


About the author
Tim Smith works as a DBA and developer and also teaches Automating ETL on Udemy.


Tuesday, February 20, 2018 - 5:04:36 PM - Peter Whyte

Nice post!

I was getting a bulk load data conversion error on column 5 when running the first imports, had to remove the 't' at the end of each 3 lines. 



