One task that most people face at some point is the need to import data into SQL Server from an Excel spreadsheet. We have talked about different approaches to doing this in previous tips using OPENROWSET, OPENQUERY, linked servers, etc. These options are great, but they may not give you as much control as you need during the import process.
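For reference, a T-SQL import of the OPENROWSET variety mentioned above might look like the following. This is only a sketch: the provider name, file path, sheet name, and the `dbo.People` table with its `firstname`/`lastname` columns are all assumptions for illustration.

```sql
-- Sketch: ad-hoc Excel import via OPENROWSET (requires "Ad Hoc Distributed
-- Queries" to be enabled and the ACE OLE DB provider to be installed).
INSERT INTO dbo.People (firstname, lastname)
SELECT firstname, lastname
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Temp\People.xlsx;HDR=YES',
                'SELECT firstname, lastname FROM [Sheet1$]');
```

This works for a one-off load, but as noted above it gives you little control over type handling during the import, which is where SSIS comes in.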
Another approach is to use SQL Server Integration Services (SSIS). With SSIS you can import different types of data as well as apply other logic during the import process. One problem I have faced when importing data from Excel into a SQL Server table is having to convert data types from Unicode to non-Unicode. SSIS treats data in an Excel file as Unicode, but my database tables are defined as non-Unicode, because I don't need to store other code sets and therefore don't want to waste additional storage space. Is there any simple way to do this in SSIS?
If you have used SSIS to import Excel data into SQL Server you may have run into the issue of having to convert data from Unicode to non-Unicode. By default Excel data is treated as Unicode, and by default when you create new tables SQL Server will make your character columns Unicode as well (nchar, nvarchar, etc.). If you don't need to store Unicode data, you probably use non-Unicode data types such as char and varchar when creating your tables, so what is the easiest way to import Excel data into non-Unicode columns?
The following shows two different examples of importing data from Excel into SQL Server. The first example uses Unicode datatypes and the second does not.
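The two destination tables might be defined along these lines. This is a sketch: the table names and the column length of 50 are assumptions; the `firstname`/`lastname` columns follow the mapping used in the examples.

```sql
-- Hypothetical destination table for the first (Unicode) example:
CREATE TABLE dbo.People_Unicode (
    firstname nvarchar(50),   -- Unicode: 2 bytes per character
    lastname  nvarchar(50)
);

-- Hypothetical destination table for the second (non-Unicode) example:
CREATE TABLE dbo.People_NonUnicode (
    firstname varchar(50),    -- non-Unicode: 1 byte per character
    lastname  varchar(50)
);
```

The Unicode (nvarchar) version imports straight from Excel without complaint; the varchar version triggers the conversion error described next.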
If we map the columns firstname to firstname and lastname to lastname we automatically get the following error in the OLE DB Destination.
Columns "firstname" and "firstname" cannot convert between unicode and non-unicode data types...
If we execute the task we get the following error dialog box which gives us additional information.
Solving the Problem
So based on the error we need to convert the data types so the source and destination types match.
If you right click on the OLE DB Destination and select "Show Advanced Editor" you have the option of changing the DataType from string [DT_STR] to Unicode string [DT_WSTR]. Once you click OK it looks like the change was saved, but if you open the editor again the change is gone and the original value is back. This makes sense, since you cannot change the data type of the actual table from here.
If you right click on the Excel Source and select "Show Advanced Editor" you have the option of changing the DataType from Unicode string [DT_WSTR] to string [DT_STR] and the change is saved.
If you click OK the change is saved, but now the Excel Source raises the error that you cannot convert between Unicode and non-Unicode. So this did not solve the problem either.
Using the Data Conversion Task
So to get around this problem we also have to use a Data Conversion task. This allows us to convert data types so the import can complete. The following picture shows the "Data Conversion" task between the Excel Source and the OLE DB Destination.
If you right click on "Data Conversion" and select Properties you will get a dialog box such as the following. Here we created an Output Alias for each column.
Our firstname column becomes firstname_nu (this could be any name you want) and we make the output a non-Unicode string. We do the same thing for the lastname column.
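Conceptually, what the Data Conversion transformation does here is the same thing as a T-SQL CAST from nvarchar to varchar. A rough equivalent (the column length of 50 and the source name are assumptions for illustration):

```sql
-- Each Output Alias is effectively a cast to a non-Unicode string type;
-- the original Unicode columns remain available alongside the new ones.
SELECT firstname,
       lastname,
       CAST(firstname AS varchar(50)) AS firstname_nu,
       CAST(lastname  AS varchar(50)) AS lastname_nu
FROM   dbo.ExcelSource;
```

Note that, just like a CAST, this conversion can fail or lose data if the Unicode source contains characters that have no match in the destination code page.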
If we save this and change the mapping as shown to use our new output columns and then execute the task we can see that the import was successful.
As you can see this is pretty simple to do once you know that you need to use the Data Conversion task to convert the data types.
Next time you are importing data into SQL Server, don't forget about the Data Conversion task if you are importing Unicode data into non-Unicode columns.
If you encounter the error "Columns "xx" and "xx" cannot convert between unicode and non-unicode data types...", remember this tip.
4. Also as a final step, I checked the source file for columns which are longer than what SSIS tells us they are. And I was so mad when I found that a couple of columns were beyond size 150 while SSIS picked them up as 50. I used LTRIM, RTRIM and SUBSTRING to fit the data into the destination tables.
Hope this solution helps you guys out as well.
Friday, December 06, 2013 - 10:32:16 AM - Alexander
I cannot alter the destination to Unicode as per the client's requirements.
New attempt: I used a Derived Column to explicitly convert the data types to match the column data types in the destination. The good news is the string data types have been taken care of, but the Unicode versions of date and int (which are also the destination data types) are having issues.
Monday, November 25, 2013 - 3:48:56 PM - Greg Robidoux
I have 2 columns from an Access database which have the memo data type. For one of the columns I converted from DT_NTEXT to Unicode string [DT_WSTR] and it works fine; when I tried to convert the other column it throws an error.
[Data Conversion ] Error: Data conversion failed while converting column "SERVICES" (1234) to column "Copy of SERVICES" (1267). The conversion returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
Is anything missing during the conversion?
Tuesday, July 16, 2013 - 10:20:30 PM - Frank Djabanor
Excellent tip!! Will definitely keep this in the back of my mind when moving data from Excel spreadsheets to SQL Server. I am relatively new to SQL, and I am amazed at the wealth of information, and the willingness to share good info. I will pay it forward!
Thanks for the tip. I was encountering a similar error in SSIS 2005 transferring data from Excel to SQL. I used a Derived Column for the conversion. Now I am getting two columns in the output, one Unicode and one non-Unicode, but it worked like a charm. Thank you.
Please help me. I have a situation where I use an Oracle source to a SQL Server destination in SSIS. Everything works fine and I get data from Oracle to the SQL destination without any problem. I have an "Invoice date" column where I use to_char(invoice_date, 'DDMMYYYY') in the Oracle command and it stores in British format in the SQL table. But I want the format to be MMDDYYYY in the SQL table. I have tried all the data conversion tasks to convert it but nothing works for me. Do I need to use a Derived Column in SSIS? If so can you please let me know with examples.
Hi, Data Conversion works great, but it slows the process. Instead, letting SSIS create a staging table (just to get the data types accurate) and then pushing the data into the final table from that staging table in the format you like would be better. I have noticed this issue while bringing databases from non-SQL Server environments like Oracle or DB2. Hope this helps.
Hi, I am not able to load data of more than 4000 characters to the SQL database using an SSIS package. I am getting the following issue. Could somebody help me here?
Error Message:[XML Source ] Error: The "component "XML Source" (3245)" failed because truncation occurred, and the truncation row disposition on "output column "Actions" (6814)" at "output "StepDetails" (3254)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
I have defined the Actions column as varchar(6000) in the database and I have changed the external column's length in the XML advanced editor to Unicode string (DT_WSTR) 4000. Please correct me for any changes.
So, I converted a table from varchar to nvarchar etc. I then modified the SSIS package which is trying to export from SQL 2008 to a non-Unicode file and I'm getting the dreaded "cannot convert between unicode and non-unicode string data types". I deleted the original green lines and removed the mappings. I added a Data Conversion object and remapped the fields. There are no errors on any of the objects. But when it runs I get the error.
Tuesday, January 01, 2013 - 9:58:01 PM - Budisanto
I am facing a problem. I am trying to export data from an Excel file to my database table. I want to transfer the data to the database table, not keep it in the Excel file format. Users are posting data in Excel. Here is an example of my Excel file format.
@Greg, I just got into work this morning and have been informed that my admin rights are not full so IT in the company I work for is working on making sure that I have full admin rights. I’ve already tried using the import wizard but was stymied there too. It must come down to admin rights because this should be a simple one time data dump as you said. Thanks for your help and patience with this matter, I really appreciate it!
Wednesday, July 25, 2012 - 5:57:34 PM - Greg Robidoux
@Greg – All I am trying to do at this time is take a csv file and populate a table in a database. I used Excel to fill in the fields and then I saved as csv format. I’m doing this so that I don’t have to manually populate this one particular table in the database. There will be 2 more tables that are populated this way before I can start testing full database loads.
As for what tasks for this, I am using a Data Flow task. In the data flow, I have a Flat File Source, the Data Conversion as per discussed on this webpage, and the OLE DB Destination which is the database.
Wednesday, July 25, 2012 - 3:32:22 PM - Greg Robidoux
Thank you very much for your quick response Greg, it's good to have people like you on the interwebs helping novices like me. Your suggestions cleared up the build errors I was receiving but now run-time has gone haywire. Instead of showing the complete output, I'll show one warning and 3 errors:
Warning: 0x80049304 at Data Flow Task, SSIS.Pipeline: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.
Error: 0xC0202009 at Data Flow Task, OLE DB Destination : SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
Error: 0xC0202025 at Data Flow Task, OLE DB Destination : Cannot create an OLE DB accessor. Verify that the column metadata is valid.
Error: 0xC004701A at Data Flow Task, SSIS.Pipeline: component "OLE DB Destination" (9) failed the pre-execute phase and returned error code 0xC0202025.
I’m not experienced enough to know how to deal with these and am lost. Thanks for your time.
Tuesday, July 24, 2012 - 11:58:34 AM - Greg Robidoux
I followed this data conversion step by step but am still receiving the same errors. The table I am currently trying to load with a csv file contains an int, 2 nvarchar(max) columns, and a date. I don't see nvarchar(max) as an option in the Data Conversion editor so I was hoping that string [DT_STR] would suffice, but it does not. Also, to match the int in the table, I am using single-byte signed integer [DT_I1], is that right? Finally, does date [DT_DATE] match the date property of the database column which just says date? I noticed there are other date types to choose from in the editor but I am unsure as to which is proper.
Saturday, July 21, 2012 - 10:22:59 AM - Bhanu Prakash
As a side note, I have had the pleasure of playing with some of the new SQL Server 2012 features and they have made some good advances with Excel / SQL integrations. If I find some time, I will put up a tip on it. Thanks Greg! Best, Mike
"Friday, September 11, 2009 - 5:45:20 AM - joeording Note: On a 64-Bit Windows server the registry keys will be found here: HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Excel"
Thanks to joeording for making this note. I spent all day trying to fix this, even going so far as to import the Jet 4.0 registry and afterwards editing the TypeGuessRows key. Turns out I needed to do this in the 64-bit reg key and now no NULLS on mixed data.
In the SSIS Excel Source connection properties you need to modify the connection string like this, adding IMEX=1 to enforce the mixed-types conversion. Then change the registry keys to tell it to look through all the records to see if there are mixed data types. The default is to look at the first 8 rows.
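A Jet connection string with IMEX=1 might look like the following (the file path is an assumption for illustration):

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Temp\People.xls;
Extended Properties="Excel 8.0;HDR=YES;IMEX=1"
```

Alongside this, setting the TypeGuessRows registry value (under the Jet Excel engine key mentioned above, or its Wow6432Node equivalent on 64-bit Windows) to 0 tells the driver to scan all rows rather than only the first 8 when guessing column types.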
"If you right click on the OLE Destination and select "Show Advanced Editor" you have the option of changing the DataType from string [DT_STR] to Unicode string [DT_WSTR]. But once you click on OK it looks like the changed was saved, but if you open the editor again the change is gone and back to the original value. This makes sense since you can not change the data type in the actual table."
Here is a catch: you can save this configuration. Go to the Destination task properties and change "ValidateExternalMetaData" to False. You are done. It worked for me.
I was fighting with this issue for 4 hours until I saw this post, and the solution worked. Even though I had to figure out some of the screenshots since it is not elaborated much, I managed to leverage the information provided. Thanks a lot and I hope your site will provide more solutions like this.
I read your tip, and it looks very useful. But I am currently fighting with code page issues. I have a dbf file that I suspect is in a different code page than my database. The source connection cannot detect the code page, so I set AlwaysUseDefaultCodePage to True and set the CodePage (I found this somewhere), added a Data Conversion and set the code page of the destination. But I am rewarded only with an error that conversion from the source code page (852) to the destination (1250, Czech btw) is not supported. Do you have a tip somewhere on how to solve this?
I am completely lost, I tried various approaches, searched tons of articles, still no luck :(