Simple SQL Import Trick

I had a conversation the other day with another developer who was annoyed by a small problem. He had just been handed a block of data that needed to be imported into a SQL database for an application he was developing. It was a one-time load, and at about 50,000 rows it wasn’t something he wanted to input manually. He had received the information from the client a day earlier as an Excel file [the universal standard for data exchange], and the client expected to see the data in the system within the next few days. So, my friend was considering whether to do something with SQL Server Integration Services, use the basic SQL Data Import Wizard, or just write a custom console utility to do it. I asked if he had considered just using “copy and paste” as a SQL import; the look I got prompted this article.

In my previous SQL Tip post I talked about how easy it is to override the “Edit Top 200 Rows” limit within SQL Management Studio. If you’re not familiar with that technique, go read about it now; we’ll be extending that concept slightly here to facilitate this import.

As in the previous tip, we’ll start with the “Edit Top 200 Rows” feature and then modify the T-SQL query to fit our import needs. Specifically, we want to get rid of the row limit and also remove the identity column. Our table is a typical “customer” table with an identity column as its primary key. We want the identity value for each new row generated automatically as records are added, so we don’t want that column visible; it won’t be part of the data being pasted in.

SQL Import Setup
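Concretely, the generated query gets trimmed down to something like the sketch below (the Customer table and its columns here are just placeholder names for this example):

    -- Query generated by "Edit Top 200 Rows" (roughly):
    -- SELECT TOP (200) CustomerID, Name, Contact, Phone, Email FROM dbo.Customer

    -- Modified version: no row limit and no identity column
    SELECT Name, Contact, Phone, Email
    FROM dbo.Customer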

Once we’ve made the edit to the T-SQL code and re-executed the query our table will look like this:

Ready for SQL Import by paste

Now, we said earlier that our data is in Excel, so we’ll need to open that up next. Here you may have a bit of extra work to do to ensure your source data matches your table. For the most part, you just want to make sure the columns in your spreadsheet are in the same order as the columns in your destination table. You’ll also want to delete or hide any columns in the spreadsheet that don’t exist in your SQL table. With all that done, you’ll just need to select the data in Excel that you’ll copy and paste into SQL. Select the specific cells, not the rows or columns, and if there’s a “header row” with the column names, be sure to leave it out. In the case of this example, I selected the first cell at A2 and then Shift+Clicked at E50001 to select all the data. Then type Ctrl+C to copy (or use the toolbar icon if that’s your thing).

Source data for SQL Import

With that done, we switch back to SQL Management Studio and select the “*” insert row in our query window. It’s important that you select the entire row, not just the first cell in the row (if you do that, nothing will appear to happen when you paste).

Set destination for SQL Import via paste

With that all in place you can now type Ctrl+V (or select Paste from the Edit menu) and the data will start pasting in. Note that depending on how many records are in your SQL import and how fast your machine is, this could take a few minutes. Try it first with a few records to see how it all works. Once the paste has completed you’ll see the records displayed in the query window; pretty cool, eh?

Now, no system is perfect of course, and you are dealing with external data coming into your SQL database; in other words, you could get errors. You may get errors due to missing data in a non-null field, or due to a type mismatch. You could also get an error because your string data is longer than your SQL data field, which would truncate the data if it were accepted. That last one is the most common I’ve seen after the data type problem, and the only real fix is to check the column widths (and data types) of your source data against the destination table to be sure they match. If you do get an error, you’ll be told which row is causing the problem; that row will be left out of the paste/import and the paste will continue with the remaining rows.

SQL Import Error
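If you want to check the destination up front, a quick query against INFORMATION_SCHEMA will show each column’s type, maximum length, and nullability to compare against your spreadsheet (the table name below is just the example one used here):

    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'Customer'
    ORDER BY ORDINAL_POSITION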

When I’m doing this sort of thing in real life with a real database, I will almost always create an “ImportTemp” table to paste this data into rather than pasting directly into a “live” table. This gives me an easy way to recover from any of the failures mentioned above [just truncate the table and try again]. Once I’m happy with the data I’ve pasted into my temp table, it’s an easy matter to write a small query that inserts the records from ImportTemp into my live table. Even with the extra step, it’s still a pretty fast process, and I haven’t had to waste time writing a custom tool or building out an integration package just to do a simple SQL import. It’s also a great way to load up the smaller static tables we often have in our applications: things like a “states” table that just needs to be loaded from a list so it can be referenced in drop-down lists and reports.
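As a rough sketch of that second step (the table and column names here are just placeholders), it’s a single INSERT...SELECT followed by a truncate once the load looks good:

    -- Move the verified rows from the staging table into the live table
    INSERT INTO dbo.Customer (Name, Contact, Phone, Email)
    SELECT Name, Contact, Phone, Email
    FROM dbo.ImportTemp

    -- Clear the staging table for the next load
    TRUNCATE TABLE dbo.ImportTemp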

I hope you find this little tip useful and that it saves you some time.


One Response to Simple SQL Import Trick

  1. Mark Howard says:

    I forgot all about that feature.

    What I have been doing lately is to add a new column to the spreadsheet and then write a function in it like:
    ="INSERT INTO CUSTOMER (Name, Contact, Phone, Email) VALUES ('" & A2 & "', '" & B2 & "', '" & C2 & "', '" & D2 & "')"

    One problem with this approach is if there is an apostrophe in the data. For that, I usually do a Find & Replace in Excel to change any single apostrophe to two apostrophes ('').

    I write it in one cell then drag it down the rest of the spreadsheet.

    Then copy the entire column and paste it into SQL Server Management Studio or wherever.

    This is useful for many different import scenarios, such as writing a SQL file that scripts the CREATE of a temporary table, inserts the data, and then links up key fields like StateID or CustomerRegionID or who knows. Then it INSERTs the temp data into the destination table and DROPs the temp table. Wrap it all in a transaction and it won’t do anything if it fails.
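    Roughly, the generated script ends up looking something like this sketch (the table and column names are just placeholders; SET XACT_ABORT ON is what makes the whole batch roll back if any statement fails):

    SET XACT_ABORT ON
    BEGIN TRANSACTION

    -- Temp table to hold the generated rows
    CREATE TABLE #ImportCustomer (
        Name varchar(100),
        Contact varchar(100),
        Phone varchar(50),
        Email varchar(100),
        StateAbbrev char(2),
        StateID int NULL
    )

    -- The INSERT statements generated in Excel get pasted in here
    INSERT INTO #ImportCustomer (Name, Contact, Phone, Email, StateAbbrev)
    VALUES ('Acme Co', 'Jane Doe', '555-0100', 'jane@example.com', 'WA')

    -- Link up key fields against the lookup table
    UPDATE i
    SET StateID = s.StateID
    FROM #ImportCustomer i
    JOIN dbo.States s ON s.Abbreviation = i.StateAbbrev

    -- Move the data into the destination table and clean up
    INSERT INTO dbo.Customer (Name, Contact, Phone, Email, StateID)
    SELECT Name, Contact, Phone, Email, StateID
    FROM #ImportCustomer

    DROP TABLE #ImportCustomer

    COMMIT TRANSACTION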

    Helpful when we try to script our changes to the production database.
