Quickly move data between instances for testing using only PowerShell

Quick tip: how many times has a dev asked you to move PROD/QA data to DEV to test a feature? Of course, not the whole DB, which would be too easy via a scripted backup/restore, but only a single table, or a subset of table data?

How do you usually deal with it? With the SSMS Import Data wizard? (ugh...) Exporting to a flat file and reimporting? Linked servers? External tools? Restoring the whole database on the dev instance and extracting the single table? Ignoring the dev until they go away?

The easiest way for me is PowerShell, like so:

### Parameters
$SourceInstance = 'SOURCEINSTANCE'
$SourceDB = 'SOURCEDB'
$SourceSchema = 'dbo'
$SourceTable = 'SOURCETABLE'

$DestinationInstance = 'DESTINATIONINSTANCE'
$DestinationDB = 'DESTINATIONDATABASE'
$DestinationSchema = 'dbo'
$DestinationTable = 'DESTINATIONTABLE'
### End of parameters

# Read the whole source table into memory (sqlserver module)
$DataTable = Read-SqlTableData -TableName $SourceTable -SchemaName $SourceSchema -DatabaseName $SourceDB -ServerInstance $SourceInstance -Verbose

# Bulk insert into the destination table (dbatools module), creating it if missing and truncating it if it already exists
$DataTable | ConvertTo-DbaDataTable | Write-DbaDbTableData -Table $DestinationTable -Schema $DestinationSchema -Database $DestinationDB -SqlInstance $DestinationInstance -Truncate -KeepIdentity -AutoCreateTable -KeepNulls -BatchSize 50000

I’m using the sqlserver and dbatools PowerShell modules to accomplish such a tedious task in the fastest way possible.
The Write-DbaDbTableData cmdlet is pretty neat because it can automatically create the destination table if it doesn’t exist, truncate the table if it exists (or append, your choice), and keep identity values and NULLs if necessary; everything is done via a bulk insert with a configurable batch size.
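For example, leaving out -Truncate switches the behavior from "replace" to "append". A minimal sketch using parameter splatting (the instance, database, and table names are placeholders to adapt to your environment):

```powershell
# Same copy as above, but appending to the destination table instead of
# truncating it first. Requires the sqlserver and dbatools modules.
$writeParams = @{
    SqlInstance     = 'DESTINATIONINSTANCE'
    Database        = 'DESTINATIONDATABASE'
    Schema          = 'dbo'
    Table           = 'DESTINATIONTABLE'
    AutoCreateTable = $true      # create the table if it doesn't exist
    KeepIdentity    = $true      # preserve identity values
    KeepNulls       = $true      # insert NULLs instead of column defaults
    BatchSize       = 50000
    # No -Truncate here: existing rows are kept, new rows are appended
}

Read-SqlTableData -ServerInstance 'SOURCEINSTANCE' -DatabaseName 'SOURCEDB' -SchemaName 'dbo' -TableName 'SOURCETABLE' |
    ConvertTo-DbaDataTable |
    Write-DbaDbTableData @writeParams
```

Splatting also makes it easy to reuse the same destination settings across multiple table copies.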

I’m pretty sure the example above can be integrated into a fully automated data alignment process, give it a try!

Performance Note

I had to include ConvertTo-DbaDataTable in the Write-DbaDbTableData pipeline because otherwise the cmdlet cannot do a proper bulk insert: instead of taking all the rows from $DataTable and sending them to SQL Server in bulk, it fetches one row at a time and performs a separate bulk insert for each single row.
If you’re having performance issues with Write-DbaDbTableData, keep this piece of information in mind.
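To make the difference concrete, here are the two invocation patterns side by side (a sketch with the same placeholder names as above):

```powershell
# Slow: the pipeline unrolls $DataTable, so Write-DbaDbTableData receives
# one DataRow at a time and ends up doing a bulk insert per single row.
$DataTable |
    Write-DbaDbTableData -SqlInstance 'DESTINATIONINSTANCE' -Database 'DESTINATIONDATABASE' -Schema 'dbo' -Table 'DESTINATIONTABLE'

# Fast: ConvertTo-DbaDataTable hands over a single DataTable object, so the
# rows reach SQL Server in proper bulk batches.
$DataTable |
    ConvertTo-DbaDataTable |
    Write-DbaDbTableData -SqlInstance 'DESTINATIONINSTANCE' -Database 'DESTINATIONDATABASE' -Schema 'dbo' -Table 'DESTINATIONTABLE'
```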

