Automate Copying Data Across Azure SQL Databases on Different Servers


Well, we have all been in that tight spot where we want to develop our solutions against the most accurate data and use cases, BUT are always hesitant to run the code directly on the production server. Or there have been times when we just want to replicate the data from database D1 on server S1 to database D2 on server S2.

Data Copy Scenario.

Microsoft did solve this problem out of the box; see here.
Now, I have access to both S1 and S2, but the above-mentioned Microsoft documentation states: “Log in to the master database of the target server where the new database is to be created. Use a login that has the same name and password as the database owner of the source database on the source server. The login on the target server must also be a member of the dbmanager role, or be the server administrator login.” (ref here)
Bleh! Too much to do.
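For reference, the out-of-box copy that the documentation describes boils down to a single T-SQL statement run on the master database of the target server (the names below just follow the D1/D2 and S1/S2 scenario above):

-- Run on the master database of the TARGET server (S2),
-- with a login that satisfies the requirements quoted above.
CREATE DATABASE [D2] AS COPY OF [servername-s1].[D1];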

So, I came up with this script that uses the BCP (Bulk Copy Program) utility. Why? It makes dealing with #bigData easy!

Prerequisites:

  • Username, Password, Server for Source
  • Username, Password, Server for Target
  • Both databases having the same schema for all the tables
  • Database name (I assumed the same name on both servers; if yours differ, tweak accordingly) (Update: I have added a revised script with different DB names at the bottom of this blog)
  • Table names for which you have to copy data
    Use "SELECT * FROM {database-name}.INFORMATION_SCHEMA.TABLES; GO" to get the table names (see the sketch after this list)

Script for copying data from all the tables mentioned:

$SourceServer = "tcp:servername-s1.database.windows.net"
$SourceUserName = "username-u1@servername-s1"
$SourcePassword = "password-p1"
$TargetServer = "tcp:servername-s2.database.windows.net"
$TargetUserName = "username-u2@servername-s2"
$TargetPassword = "password-p2"
$database = "database-name" # Kept same for simplicity.
$bckDirectory = "D:\DataMigration\"
$tableNames = @(
'Table1',
'Table2',
'Table3', ...
) #Use "SELECT * FROM {database-name}.INFORMATION_SCHEMA.TABLES; GO" to get the tablenames
foreach ($table in $tableNames) {
    $invocationName = $database + ".dbo." + $table
    $datFilePath = $bckDirectory + $table + ".dat"
    # Export data from the source server
    Write-Host "Getting data for" $invocationName "output file at" $datFilePath -ForegroundColor Yellow
    bcp $invocationName out $datFilePath -c -U $SourceUserName -S $SourceServer -P $SourcePassword
    # Import data into the target server
    Write-Host "Inserting data into" $invocationName "from source file at" $datFilePath -ForegroundColor Yellow
    bcp $invocationName in $datFilePath -c -U $TargetUserName -S $TargetServer -P $TargetPassword
}
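Since bcp is a native command, it returns a non-zero exit code on failure. A small optional addition (not part of the original script, just a sketch) placed after each bcp call lets the loop stop instead of silently moving on to the next table:

    # Optional: abort the loop if the last bcp export/import failed.
    # $LASTEXITCODE holds the exit code of the most recent native command (bcp here).
    if ($LASTEXITCODE -ne 0) {
        Write-Host "bcp failed for" $table "- aborting." -ForegroundColor Red
        break
    }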

So what’s what in those bcp commands:

  • database_name.schema.table_name: The database_name specifies the database in which the specified table or view resides. If not specified, this is the default database for the user. table_name is the name of the destination table when importing data into SQL Azure server (in), and the source table when exporting data from SQL Azure server (out).
  • in: copies from a file into the database table or view.
  • out: copies from the database table or view to a file. If you specify an existing file, the file is overwritten.
  • -c: Performs the operation using a character data type.
  • -U: Specifies the login ID used to connect to SQL Azure server. You must append the SQL Azure server name to the login name in the connection string by using the <login>@<servername> notation.
  • -S: Specifies the SQL Azure server to which to connect. The fully qualified name of the SQL Azure server is servername.database.windows.net.
  • -P: Specifies the password for login ID.
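Putting those switches together, copying a single table (Table1, with the placeholder names from the script above) expands to these two commands:

# Export Table1 from the source server into a .dat file in character mode.
bcp database-name.dbo.Table1 out D:\DataMigration\Table1.dat -c -U username-u1@servername-s1 -S tcp:servername-s1.database.windows.net -P password-p1
# Import the same file into Table1 on the target server.
bcp database-name.dbo.Table1 in D:\DataMigration\Table1.dat -c -U username-u2@servername-s2 -S tcp:servername-s2.database.windows.net -P password-p2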

BONUS! Now, I happen to have an additional use case where I want to clean up all the data in all the tables and start from scratch. Azure SQL does not have the stored procedure "sp_MSforeachtable", so to cover for this I came up with this SQL query:

DECLARE @SqlCmd NVARCHAR(MAX)
SELECT @SqlCmd = COALESCE(@SqlCmd + CHAR(10), '') + N'TRUNCATE TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ';'
FROM sys.tables t
INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE t.[name] <> 'sysdiagrams'
AND t.is_ms_shipped = 0
PRINT @SqlCmd
EXEC sp_executesql @SqlCmd

Consider running this script standalone, or automate the two scripts together in case you want to restore or refresh the data on a particular schedule or after a particular event.
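One way to couple the two scripts (a sketch, assuming the TRUNCATE query above is saved as D:\DataMigration\truncate-all.sql and that sqlcmd is installed) is to clean the target database with sqlcmd right before the bcp loop runs:

# Clean out the target database first, then run the bcp copy loop from above.
# Assumes the TRUNCATE query is saved as D:\DataMigration\truncate-all.sql.
sqlcmd -S "tcp:servername-s2.database.windows.net" -d "database-name" `
    -U "username-u2@servername-s2" -P "password-p2" `
    -i "D:\DataMigration\truncate-all.sql"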


UPDATE: Revised script for when the source and target database names differ.

$SourceServer = "tcp:servername-s1.database.windows.net"
$SourceUserName = "username-u1@servername-s1"
$SourcePassword = "password-p1"
$TargetServer = "tcp:servername-s2.database.windows.net"
$TargetUserName = "username-u2@servername-s2"
$TargetPassword = "password-p2"
$sourcedatabase = "source-database-name"
$targetdatabase = "target-database-name"
$bckDirectory = "D:\DataMigration\"
$tableNames = @(
'Table1',
'Table2',
'Table3', ...
) #Use "SELECT * FROM {database-name}.INFORMATION_SCHEMA.TABLES; GO" to get the tablenames
foreach ($table in $tableNames) {
    $datFilePath = $bckDirectory + $table + ".dat"
    $sourceinvocationName = $sourcedatabase + ".dbo." + $table
    # Export data from the source server
    Write-Host "Getting data for" $sourceinvocationName "output file at" $datFilePath -ForegroundColor Yellow
    bcp $sourceinvocationName out $datFilePath -c -U $SourceUserName -S $SourceServer -P $SourcePassword
    $targetinvocationName = $targetdatabase + ".dbo." + $table
    # Import data into the target server
    Write-Host "Inserting data into" $targetinvocationName "from source file at" $datFilePath -ForegroundColor Yellow
    bcp $targetinvocationName in $datFilePath -c -U $TargetUserName -S $TargetServer -P $TargetPassword
}

Ping me back for assistance; your feedback is always welcome… 🙂

Regards,
Aditya Deshpande
