Moving your SQL database to Azure SQL: Using the import method

Planning to move your SQL data to Microsoft Azure as part of your cloud projects? You have several options for migrating your SQL data, and the right choice depends on your project requirements. In this three-part series, we look at several ways to get the job done. Several variables will dictate how we perform the migration of any given database. Here are some key points to consider when planning such a move of your SQL data:

  • Is it for a test/dev environment?
  • Do you need to keep the databases synchronized?
  • Do you want an assessment to understand whether improvements must be made before the migration?
  • Do you have a hybrid cloud where Azure has connectivity to your on-premises environment?
  • Is it a native cloud app (no ties to any on-premises environment)?
  • Do you want to upgrade schema or data in an existing database?

In this article, we are going to cover the import database method, which is used for one-time migrations: we export our current on-premises SQL database to a BACPAC file, store that file in a Storage Account, and then, from the Azure SQL server, import that file into a brand-new database.

In the articles that follow, we will focus on the Data Migration Assistant (DMA), which is a fantastic tool that performs an assessment and then the migration of the SQL schema and data from on-premises to Azure SQL.

Creating a Storage Account

The first step is to create a Storage Account in Azure. This Storage Account will be the location where we are going to export the SQL database, schema, and other components from our on-premises servers. Then we will use the SQL server blade to import and create a database as part of the process.

First things first: Let’s create a simple Storage Account. While logged in to the Azure portal, click on Create a resource, type Storage Account, and select it from the list. In the new blade, click on Create.

In the Basics tab of the new Create Storage Account wizard, select the resource group, enter a Storage Account name and region, and either leave the default settings or fine-tune them based on your requirements. Click on Review + create to skip the remaining tabs, review the settings, and click on Create to start provisioning the new Storage Account.

After creating the Storage Account, click on it and then go to the Containers item in the Blob service area. Add a new container by hitting the + Container button, and in the new blade, type the name (in our case, sql) and click on Create to complete the process.
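For those who prefer scripting over the portal, the same Storage Account and container can be created with the Azure CLI. This is a sketch under assumed names (rg-sqlmove, sqlmovestore, eastus); it requires the Azure CLI and an authenticated session (az login), and the az calls are skipped when the CLI is not present.

```shell
# Assumed names -- adjust to your environment
RG="rg-sqlmove"
SA="sqlmovestore"          # must be 3-24 lowercase letters/digits, globally unique
LOCATION="eastus"

# Sanity-check the Storage Account name before calling Azure
if echo "$SA" | grep -Eq '^[a-z0-9]{3,24}$'; then
  echo "storage account name ok: $SA"
else
  echo "invalid storage account name: $SA" >&2
  exit 1
fi

# These calls require the Azure CLI and an authenticated session (az login)
if command -v az >/dev/null 2>&1; then
  az group create --name "$RG" --location "$LOCATION"
  az storage account create --name "$SA" --resource-group "$RG" \
    --location "$LOCATION" --sku Standard_LRS
  az storage container create --name sql --account-name "$SA"
else
  echo "az CLI not found; skipping Azure calls"
fi
```

The name check mirrors the portal's validation: Storage Account names allow only lowercase letters and digits, 3 to 24 characters.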

If you already have the BACPAC file, which we are going to obtain in the next section of this article, you can take advantage of the Azure Storage Explorer to connect directly to this newly created Storage Account and upload the file on your own.

To download the latest version of the tool, you can always click the Open in Explorer icon located in the Overview item of any Storage Account.
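The upload can also be done from the command line. This sketch assumes the sqlmovestore account and sql container created earlier and a hypothetical AdventureWorks.bacpac file; the az call runs only when the CLI and the file are both available.

```shell
# Assumed names: the "sqlmovestore" account and "sql" container from earlier,
# plus a hypothetical AdventureWorks.bacpac file
SA="sqlmovestore"
FILE="AdventureWorks.bacpac"
BLOB_URL="https://${SA}.blob.core.windows.net/sql/${FILE}"
echo "target blob: $BLOB_URL"

# Requires the Azure CLI, an authenticated session, and the local file
if command -v az >/dev/null 2>&1 && [ -f "$FILE" ]; then
  az storage blob upload --account-name "$SA" --container-name sql \
    --name "$FILE" --file "$FILE" --auth-mode login
else
  echo "az CLI or $FILE not available; skipping upload"
fi
```

Note the blob URL format: that same URL is what the import step will point at later.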

Exporting the schema and data from your on-premises SQL server

Azure SQL allows the import of a BACPAC file. The BACPAC file encapsulates the schema and data of the database, including tables, logins, and other objects related to the database, in a single file, which makes the SQL database portable and easy to move from point A to point B. (A DACPAC, by contrast, contains only the schema, so be sure to export the former.)

Using SQL Server Management Studio, expand the Databases item, right-click on the database that we want to move to Azure SQL, then click on Tasks, and Export Data-tier Application…

On the Introduction page, just click Next.

The Export Settings page is where we see how SQL Server and Microsoft Azure are integrated. Select Save to Microsoft Azure, and click on the ellipsis beside Storage Account.

In the new window, click on Sign in and provide your credentials and MFA (if required). Once authenticated, select the subscription, Storage Account, and container we created in the previous step. Click OK and then Next.

On the Summary tab, review all the settings that we have defined so far, and click on Next to continue.

On the Results tab, we are going to have a complete list of all items that were exported as part of this process.
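The wizard steps above can also be scripted with the sqlpackage command-line tool, which is handy when the export needs to be repeated or automated. The server and database names below are placeholders, and the sqlpackage call is skipped when the tool is not installed (it can be added with dotnet tool install -g microsoft.sqlpackage).

```shell
# Hypothetical server/database names -- replace with your own
SRC_SERVER="onprem-sql01"
SRC_DB="AdventureWorks"
TARGET="AdventureWorks.bacpac"

# Show the command we intend to run
echo "sqlpackage /Action:Export /SourceServerName:$SRC_SERVER /SourceDatabaseName:$SRC_DB /TargetFile:$TARGET"

# Requires the sqlpackage tool and network access to the source SQL Server
if command -v sqlpackage >/dev/null 2>&1; then
  sqlpackage /Action:Export \
    /SourceServerName:"$SRC_SERVER" \
    /SourceDatabaseName:"$SRC_DB" \
    /TargetFile:"$TARGET"
else
  echo "sqlpackage not installed; command shown above"
fi
```

This writes the .bacpac locally, so you would then upload it to the Storage Account yourself rather than using the wizard's Save to Microsoft Azure option.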

Creating a SQL database using the dacpac file

For the final portion of the process, we are going to use the SQL server overview blade, which has an Import database option located in the top menu of that same page.

The import database wizard will be displayed. Select the Storage Account where we just saved the BACPAC file, define a name for the new database that will be created (the wizard suggests the original name), and enter the SQL credentials.

The process may take a few minutes; the time will vary based on the database size. Go for a coffee and wait for the confirmation in the notification area.
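The same import can be triggered from the Azure CLI, which is useful when scripting the whole migration end to end. All names and the angle-bracket placeholders below are assumptions to substitute with your own values; the calls require an authenticated az session and are skipped otherwise.

```shell
# CLI equivalent of the portal's Import database button -- all names and
# <...> placeholders are assumptions, substitute your own values
RG="rg-sqlmove"
SERVER="sqlmove-server"
DB="AdventureWorks"
URI="https://sqlmovestore.blob.core.windows.net/sql/AdventureWorks.bacpac"
echo "importing $URI into $SERVER/$DB"

if command -v az >/dev/null 2>&1; then
  az sql db import --resource-group "$RG" --server "$SERVER" --name "$DB" \
    --storage-key-type StorageAccessKey --storage-key "<storage-account-key>" \
    --storage-uri "$URI" \
    --admin-user "<sql-admin>" --admin-password "<sql-admin-password>"

  # Watch the import's progress from the CLI instead of the notification area
  az sql db op list --resource-group "$RG" --server "$SERVER" --database "$DB" -o table
else
  echo "az CLI not found; skipping import"
fi
```

The --storage-uri value is the blob URL of the uploaded .bacpac, so the file must already be in the container before running the import.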

The result will be the new database being listed in the SQL databases section, as depicted in the image below.

Moving SQL data to Azure: More to come

If you are planning a simple cutover migration, where you shut down your on-premises database and migrate to Azure a single time, then the SQL import procedure may be a perfect fit. The same method can be used for dev/test scenarios where you need to test your application and no constant synchronization with on-premises is required. But for more complex migrations, you may have to try a different approach. And that's where the next two parts of this series come in. Stay tuned!

Featured image: Shutterstock

Anderson Patricio

Anderson Patricio is a Canadian MVP in Cloud and Datacenter Management, and Office Server and Services. Besides the Microsoft Award, he also holds a Solutions Master (MCSM) in Exchange, CISSP, and several other certifications. Anderson contributes to the Microsoft community with articles, tutorials, blog posts, Twitter, forums, and book reviews. He is a regular contributor here at Techgenix.com, MSExchange.org, ITPROCentral.com, and AndersonPatricio.org (Portuguese).
