Moving large tables between environments

Hello readers!

A couple of days ago I had to move a large table (~43 million rows) from a production environment to a dev environment.

The table structure is as follows:

CREATE TABLE [mmm].[Music Band](
    [BandID] [int] IDENTITY(1,1) NOT NULL,
    [BandTypeID] [tinyint] NOT NULL,
    [BandName] [nvarchar](255) NOT NULL,
    [IsAvailable] [bit] NOT NULL,
    [IsRemoved] [tinyint] NOT NULL,
    [CreatedDataSourceTypeID] [smallint] NOT NULL,
    [CreatedDataSourceID] [int] NOT NULL,
    [UpdatedDataSourceTypeID] [smallint] NOT NULL,
    [UpdatedDataSourceID] [int] NOT NULL,
    [DateCreated] [datetime] NOT NULL DEFAULT (getdate()),
    [DateModified] [datetime] NOT NULL DEFAULT (getdate()),
    [LastUpdated] [timestamp] NOT NULL
)

My initial approach was to use an SSIS package to export the data to a text file, ship the text file via FTP to the other server, and then load it into a table on the new server with another SSIS package.

This approach failed miserably: debugging package failures caused by load issues from the text file took a very long time. First of all, [BandName] holds the name of a music band, so it can contain arbitrary text, including special characters like ], [, |, %, $ and so on. No matter which delimiter you choose for the text file, with 43 million rows in the original table the likelihood that at least one value breaks the format is very high. There are other possible reasons for failures, but I will omit them here.
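To make the delimiter problem concrete, here is a tiny T-SQL illustration; the band name and the surrounding values are made up:

-- A made-up band name that happens to contain the pipe delimiter.
DECLARE @BandName nvarchar(255) = N'Weird | Pipes';

-- A pipe-delimited export line built from such a row would look like:
--   10042|3|Weird | Pipes|1|0|...
-- The loader sees one column too many, rejects the row, and the
-- whole file load fails.
SELECT N'10042|3|' + @BandName + N'|1|0|...' AS ExportedLine;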

So here is what I did instead:

1. Create a separate staging database on the production server.
2. Move the table definition and all of its data into this new database via SSIS (a simple package with an OLE DB source and destination in a data flow task).
3. Back up the staging database (scripted below).
4. FTP the backup file to the destination server.
5. Restore the database backup there.
6. Move the data into the destination database, again via a simple SSIS package with an OLE DB source and destination.
7. Drop the intermediate database, and voila, done.
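The two SSIS packages are trivial, so the only parts worth scripting are the backup and restore. Here is a minimal T-SQL sketch; the staging database name [BandStaging] and all file paths are placeholders of my own, not the ones from the actual migration:

-- On the production server: create the staging database
-- (the table and its data are then copied in via SSIS).
CREATE DATABASE [BandStaging];
GO

-- Back up the staging database to a file that can be shipped via FTP.
-- COMPRESSION (SQL Server 2008+) shrinks the file for the transfer.
BACKUP DATABASE [BandStaging]
TO DISK = N'D:\Backups\BandStaging.bak'
WITH INIT, COMPRESSION;
GO

-- On the destination server, after the FTP transfer: restore the backup.
-- The logical file names assume the CREATE DATABASE defaults.
RESTORE DATABASE [BandStaging]
FROM DISK = N'D:\Backups\BandStaging.bak'
WITH MOVE N'BandStaging' TO N'D:\Data\BandStaging.mdf',
     MOVE N'BandStaging_log' TO N'D:\Data\BandStaging_log.ldf',
     REPLACE;
GO

-- After SSIS has moved the data into the destination database,
-- drop the intermediate database.
DROP DATABASE [BandStaging];
GO

Two details to watch in the SSIS packages: enable Keep identity on the OLE DB destination so the [BandID] values survive the copy, and leave the [LastUpdated] column unmapped, since timestamp (rowversion) values cannot be inserted explicitly.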

 

I hope you find this simple trick useful.

Cheers,

-Yuriy