We often help customers migrate their applications to the cloud. For small applications with small databases this can be fairly straightforward; however, as the complexity and volume of the application's data grow, you need to do a bit more planning.
With Azure SQL Database the principal way of importing data is via a *.bacpac file, which contains the full database definition and data. You can upload this file to an Azure Storage account and then trigger the import via the web-based UI when creating the database.
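If you prefer scripting the import over clicking through the portal, the Azure CLI exposes the same operation. A minimal sketch, assuming the bacpac has already been uploaded to blob storage; every resource name, credential, and URI here is a placeholder to substitute with your own:

```shell
# Hypothetical names throughout -- substitute your own resource group,
# server, database, credentials, and storage details.
az sql db import \
  --resource-group my-rg \
  --server my-sql-server \
  --name MyDatabase \
  --admin-user sqladmin \
  --admin-password '<password>' \
  --storage-key-type StorageAccessKey \
  --storage-key '<storage-account-key>' \
  --storage-uri 'https://mystorage.blob.core.windows.net/bacpacs/MyDatabase.bacpac'
```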
The size and shape of the data you are importing, combined with the service tier you choose, can have a dramatic effect on import time. If you are planning a time-critical cut-over, you will want to test in advance how long the migration takes, and if necessary step up to a higher performance tier during the import and then move back down to a tier appropriate for your usual transactional workloads.
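That scale-up/scale-down step can also be scripted. A sketch using the Azure CLI, again with placeholder names, and assuming the `P2` and `S1` service objectives purely as examples of a higher and a lower tier:

```shell
# Hypothetical names -- substitute your own. Scale up before the import...
az sql db update --resource-group my-rg --server my-sql-server \
  --name MyDatabase --service-objective P2

# ...run the import, then scale back down for the normal transactional load.
az sql db update --resource-group my-rg --server my-sql-server \
  --name MyDatabase --service-objective S1
```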
As you can see from the following chart, a given database will import in significantly less time if you do it on a higher performance tier.
The exact timings you achieve will of course depend on your specific database. To help establish those timings, you can use the script I created to collect the data points behind the above chart; I've posted it on GitHub under the Readify account.
AzureSqlPerformance is a PowerShell module that takes a set of arguments and uses them to create an Azure SQL server and a database at a specified performance tier, then kicks off an import and times how long it takes to complete (to roughly 10-second accuracy).
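The timing approach is simple enough to sketch in any language: poll the import status on a fixed interval and count intervals, so the measurement accuracy is bounded by the poll period. A minimal shell sketch; the `wait_for_import` helper and the status command you pass it are hypothetical, not part of the module:

```shell
# Poll a status command every POLL_SECONDS until it prints "Succeeded",
# then report the elapsed time. Accuracy is bounded by the poll interval,
# hence the ~10 second figure when polling every 10 seconds.
POLL_SECONDS=10

wait_for_import() {
  local status_cmd="$1" elapsed=0
  while [ "$($status_cmd)" != "Succeeded" ]; do
    sleep "$POLL_SECONDS"
    elapsed=$((elapsed + POLL_SECONDS))
  done
  echo "Import finished after ~${elapsed}s"
}
```

In practice the status command would query the import operation's state from Azure; here it can be any command that eventually prints `Succeeded`.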
Feel free to check the script out, and if you can make improvements to the implementation, submit a pull request. I pretty much stopped working on it as soon as it did what I wanted 🙂
By the way, you can find some of the other pet projects that people at Readify have contributed over at labs.readify.net, where we are trying to collect these kinds of resources.