Thursday, September 28, 2017

Back up your Dynamics GP SQL Server databases directly to Azure Storage in minutes!


By Steve Endow

At the GP Tech Conference 2017 in lovely Fargo, ND, Windi Epperson from Advanced Integrators had a great session about Disaster Recovery. One topic she discussed was the ability to use the Dynamics GP Back Up Company feature to save SQL backups directly to Azure.


I think sending SQL backups to Azure is a great idea. There are countless tales of SQL backups not being done properly, being lost, or not being retained, so having an easy option to send an occasional SQL backup to Azure is very valuable.

But this option is a manual process from the Dynamics GP client application: it is not scheduled, and it does not use the "Copy-only backup" option, so the backups will become part of the SQL backup chain if the customer also has a scheduled SQL backup job.  (You can check whether a given backup was copy-only with the query below.)  So, as Windi explained, it may be a great option for very small customers who can reliably complete the task manually on a regular basis.
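
If you want to confirm whether a given backup disturbed the chain, SQL Server records this in the msdb backup history.  Here is a minimal sketch using the standard msdb.dbo.backupset table; the database names are just the GP examples used later in this post.

-- List recent backups and whether each one was copy-only
-- (is_copy_only = 1 means the backup did not affect the backup chain)
SELECT database_name,
       backup_start_date,
       type,            -- D = full, I = differential, L = log
       is_copy_only
FROM msdb.dbo.backupset
WHERE database_name IN ('DYNAMICS', 'TWO')
ORDER BY backup_start_date DESC;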

But how about setting up a backup job in SQL Server that will occasionally send a backup to Azure?

It turns out that the process is remarkably easy and takes just a few minutes to set up and run your first backup to Azure Storage.

NOTE: From what I can tell, SQL backups to Azure are supported in SQL 2012 SP1 CU2 or later.  The backup command syntax also differs by version: SQL 2012 and 2014 use the access key credential syntax shown in this post, while SQL 2016 introduced a newer syntax based on Shared Access Signatures (sketched below).
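
For reference, here is a rough sketch of the newer SQL 2016 approach, based on Microsoft's documentation (I have not tested it).  The credential is named after the container URL, and the secret is a Shared Access Signature (SAS) token rather than a storage account key; the account name, container name, and token below are all placeholders.

-- SQL 2016 and later: the credential name is the container URL,
-- and the secret is a SAS token (without the leading '?')
CREATE CREDENTIAL [https://mygpbackups.blob.core.windows.net/gpsqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'yoursastokenhere';
GO

-- With a SAS credential, BACKUP omits the WITH CREDENTIAL clause;
-- SQL Server matches the credential to the container in the URL
BACKUP DATABASE [DYNAMICS]
TO URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/DYNAMICS.bak'
WITH COPY_ONLY, COMPRESSION;
GO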


The hardest part is setting up your Azure account and creating the appropriate Azure Storage account.  It took me a few tries to find the correct settings.

First, you will need an Azure account, which I won't cover here, but signing up should be a pretty simple process.  Here is the sign up page to get started:  https://azure.microsoft.com/en-us/free/

Once you have your Azure account set up and have logged in to the Azure Portal (https://portal.azure.com), click on the "More Services" option at the bottom of the services list on the left.  In the search box, type "storage" and a few options should be displayed.

I chose the newer "Storage Accounts" option (not "classic").  To pin this to your services list, click the star to the right.

When the Storage Accounts page is displayed, click on the New button at the top.


The Create Storage Account page will be displayed.

To create a new storage account, give the account a unique name and choose General Purpose.  I found that with SQL Server 2014, the Blob storage account type does not work, but it may work with SQL Server 2016.

Choose your Replication type.  The more comprehensive replication types will cost more, but I don't currently know the prices for each; Microsoft has an article describing the replication types.  Geo Redundant Storage is recommended if you want to recover your files in case a single data center is destroyed or inaccessible.

Choose an existing resource group or create a new one, and then choose a location for your storage account.  Once you have specified all of the settings, click on Create, and you can check the Pin to dashboard box to make it easy to access your account from your Azure dashboard.

It will take a few seconds for the storage account to be created.


Once it is set up, it will show as Available.


Click on the storage account to view the configuration.  On the left side, click on the Containers item under "Blob Service", then click on the + Container button to create a new storage container.


Give the new container a name and choose Private access level.


Once the container is created, click on "Access keys" on the left menu for the storage account.


Copy these keys and store them in a safe place.

Next, using your storage account name and one of your keys, create a "credential" on your local SQL Server.

CREATE CREDENTIAL azuresqlbackup
WITH IDENTITY = 'mygpbackups'                   -- your storage account name
, SECRET = 'yourazurestorageaccountkeyhere==';  -- one of your storage account access keys
GO
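
To confirm the credential was created, you can query the sys.credentials catalog view as a quick sanity check (the secret itself is not visible):

-- List credentials on this SQL Server instance
SELECT name, credential_identity, create_date
FROM sys.credentials;
GO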

You can then use a simple backup script to perform the backup, referencing the SQL credential that you created.

This script uses the COPY_ONLY option so that it does not disrupt the backup chain of local database backup jobs, and it also uses the COMPRESSION option, which dramatically reduces the backup file size and significantly improves the backup performance.

BACKUP DATABASE [DYNAMICS]
TO URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/2017-09-28 DYNAMICS.bak'
/* The URL is the Blob service endpoint, followed by the container name and the backup file name */
WITH CREDENTIAL = 'azuresqlbackup'   /* the name of the credential created in the previous step */
, COPY_ONLY, COMPRESSION;
GO

BACKUP DATABASE [TWO]
TO URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/2017-09-28 TWO.bak'
/* The URL is the Blob service endpoint, followed by the container name and the backup file name */
WITH CREDENTIAL = 'azuresqlbackup'   /* the name of the credential created in the previous step */
, COPY_ONLY, COMPRESSION;
GO

Try running one of these backup scripts, and if all goes well, SQL will send the .bak file up to Azure.  You can then verify that the uploaded backup is readable, as shown below.
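
As an optional sanity check, RESTORE VERIFYONLY can read the backup back from Azure without restoring anything.  This sketch uses the same example URL and credential as the scripts above.

-- Verify the uploaded backup is readable (no database is restored)
RESTORE VERIFYONLY
FROM URL = 'https://mygpbackups.blob.core.windows.net/gpsqlbackups/2017-09-28 DYNAMICS.bak'
WITH CREDENTIAL = 'azuresqlbackup';
GO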

I tested the backup on my GP 2016 virtual machine over my 100 mbit internet connection, and the performance was remarkable.

A 250MB DYNAMICS database was backed up in 13.6 seconds, and a 1.3GB TWO database was backed up in 96 seconds.


But that was without compression!  If I enable compression, the size of the backup file is greatly reduced, and the backup time drops dramatically.  From 18 seconds to 4.5 seconds for DYNAMICS, and from 96 seconds to 14 seconds for TWO!!!


My internet upload speed was maxed out at 120 megabits.


Based on my tests, I think using Azure Storage for SQL Server backups is a great option.  It doesn't have to be your primary daily backup repository, but I would definitely recommend that customers send a few weekly backups to Azure for safe keeping.  A date-stamped version of the backup script, suitable for scheduling as a SQL Server Agent job, is sketched below.
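
To make the backup repeatable without editing the file name each time, the URL can be built with a date stamp and the script dropped into a SQL Server Agent job step.  This is a minimal sketch using the example account, container, and credential names from this post; the weekly schedule itself would be configured in SQL Server Agent.

-- Build a date-stamped URL (yyyy-mm-dd) and back up to Azure
DECLARE @url nvarchar(400) =
    N'https://mygpbackups.blob.core.windows.net/gpsqlbackups/'
    + CONVERT(nvarchar(10), GETDATE(), 23) + N' DYNAMICS.bak';

BACKUP DATABASE [DYNAMICS]
TO URL = @url
WITH CREDENTIAL = 'azuresqlbackup', COPY_ONLY, COMPRESSION;
GO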
