Thursday, May 11, 2017

Resolving the Veeam "Backup files are unavailable" message when restoring a VM

By Steve Endow

I'm a huge fan of the Veeam Backup and Replication product.  I've used it for several years now to back up my Hyper-V virtual machines to a Synology NAS, and it has been a huge improvement over the low-tech script-based VM backups I was suffering with previously.

One quirk I have noticed with Veeam is that it seems to be very sensitive to any loss of connectivity with the backup infrastructure.  With a prior version, if Veeam was running but my file server was shut down, I would get error notifications indicating that it couldn't access the file server--even though backups were not scheduled to run.  I haven't noticed those messages lately, so I'm not sure if I just turned them off, or if I haven't been paying attention to them.

Since I don't need my servers running 24x7, I have them scheduled to shut down in the evening and then automatically turn on in the morning.  But sometimes, if I wrap up my day early, I may shut down all of my servers and my desktop at, say, 8pm.  If I shut down my Synology NAS first and Veeam detects that the file server is not accessible, it may log a warning or error.

Normally, this isn't a big deal, but I found one situation where this results in a subsequent error message.  I recently tried to restore a VM, and after I selected the VM to restore and chose a restore point, I received this error message.

Veeam Error:  Backup files are unavailable

When I first saw this message I was concerned there was a problem, but it didn't make sense because Veeam was obviously able to see the backup files and it even let me choose which restore point I wanted.  So I knew that the backup files were available and were accessible.

I could access the network share on my NAS file server and browse the files without issue.  I was able to click on OK to this error message, complete the restore wizard, and successfully restore my VM.  So clearly the backup files were accessible and there wasn't really an issue.

So why was this error occurring?

I submitted a support case to Veeam and spoke with a support engineer who showed me how to resolve this error.  It seems that whenever Veeam is unable to access the file share used in the Backup Infrastructure settings, it sets a flag or error state indicating that the backup location is not available.  After this happens, you have to manually tell Veeam to rescan the backup infrastructure in order to clear the error.  Fortunately, this is very easy to do.

In Veeam, click on the Backup Infrastructure button in the bottom left, then click on the Backup Repositories page.  Right click on the Backup Repository that is giving the error, and select Rescan.

The Rescan will take several seconds to run, and when it is done, the "Backup files are unavailable" message will no longer appear when you perform a restore.  Or at least that worked for me.

Overall, I'm incredibly pleased with Veeam Backup and Replication and would highly recommend it if it's within your budget.

You can also find him on Twitter, YouTube, and Google+

Wednesday, May 10, 2017

Extract and Save Dynamics GP Document Attach Files Using .NET

By Steve Endow

In a prior post, I showed how to use BCP to extract and save Dynamics GP Document Attach files.  I decided to explore this further and use .NET to extract the GP Doc Attach files.

It is very easy to export the file attachments from SQL using .NET, but because you need data access code to read the data from the VarBinary field and then separately write that data to a file using a stream, it takes quite a few more lines than the very simple single-line BCP statement.

This MSDN forum post has the basic code sample, which is 9 lines.  So in theory you could have something that simple.  But in reality any .NET application of value is going to have a fair amount of additional code for a data access layer and other plumbing, allowing for parameters, processing multiple records, error handling, etc.
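To illustrate the general approach, here is a minimal sketch of the core extraction code.  This is my own example, not the forum post's code; the table and column names follow the Document Attach examples from my BCP post, and the connection string and attachment ID are placeholders you would supply:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class DocAttachExport
{
    static void ExportAttachment(string connectionString, string attachmentId, string outputPath)
    {
        const string sql =
            "SELECT BinaryBlob FROM coAttachmentItems WHERE Attachment_ID = @AttachmentID";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@AttachmentID", attachmentId);
            conn.Open();

            // SequentialAccess lets us stream the varbinary column to disk
            // instead of buffering the entire attachment in memory.
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                if (reader.Read())
                {
                    // GetStream requires .NET 4.5 or later.
                    using (Stream blob = reader.GetStream(0))
                    using (FileStream file = File.Create(outputPath))
                    {
                        blob.CopyTo(file);
                    }
                }
            }
        }
    }
}
```

In a real application you would wrap this with the parameter handling, record looping, and error handling mentioned above.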

Very impressive SQL script for running multi-company Dynamics GP queries

By Steve Endow

I've previously posted some queries for retrieving values from multiple company databases.  I tried a few query designs, but ended up settling on cursors to loop through the different company databases and assemble the results.  Here are two examples:

Given how infrequently I used the queries, I didn't worry about the cursor or spend much additional time trying to figure out an alternative.

While cursors are probably just fine for these ad hoc, one-time queries, many SQL folks (myself included) are not fans of cursors, and it's often a fun challenge to figure out a way to replace an iterating cursor with a proper set-based query.

Well, Ian Grieve has come up with a very clever technique for assembling a query that can run against multiple Dynamics GP company databases.  Rather than executing a query multiple times and storing the results each time, he's figured out an intriguing method for dynamically assembling the queries for multiple companies, and then running that entire dynamic query string using sp_executesql.

Here is his blog post:

While I understand the general technique, I am not familiar with some of the commands and syntax--particularly the STUFF function and the FOR XML PATH clause--so it is going to take me a while to deconstruct and fully understand the entire query.

But hats off to Ian for the creativity!  I think he's come up with a new standard template for multi-company Dynamics GP queries!
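As a rough sketch of the general technique (my own simplified example, not Ian's actual script--see his post for the real thing): build one SELECT per company database from the DYNAMICS..SY01500 company master, concatenate them with FOR XML PATH, trim the leading separator with STUFF, and run the assembled string:

```sql
DECLARE @SQL NVARCHAR(MAX);

-- FOR XML PATH('') concatenates one SELECT per company database into a
-- single string; STUFF removes the leading ' UNION ALL ' (11 characters).
SET @SQL = STUFF(
    (SELECT ' UNION ALL SELECT ''' + RTRIM(INTERID) + ''' AS Company, CUSTNMBR, CUSTNAME'
          + ' FROM ' + RTRIM(INTERID) + '..RM00101'
     FROM DYNAMICS..SY01500
     FOR XML PATH('')),
    1, 11, '');

-- Run the assembled multi-company query.
EXEC sp_executesql @SQL;
```

This example pulls the customer master (RM00101) from every company; the real script would include whatever tables and columns you need.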


Tuesday, May 9, 2017

Extract Dynamics GP Document Attach Files From SQL Server and Save To Disk

By Steve Endow

(I have another post showing how to use .NET to extract Doc Attach files.)

Today I was asked if there was a way to read an image file from the Dynamics GP Document Attach table so that the image can be added to a report.  For instance, suppose a customer wants to display item images on an invoice.

I have done very little work with the blob / varbinary field type in SQL Server, so I didn't know how difficult it would be to do this.  I quickly did some research and testing, and while I don't have a complete solution for inserting images onto a report, I did test one method for extracting files from the Dynamics GP Document Attach table and saving them to disk.

From what I could tell, there are at least three standard SQL Server tools/commands that you can use to extract an image or file from a varbinary field.

1. OLE Automation Query – This looks pretty sketchy and appears to be very poorly documented.  I tried some samples and couldn’t get it to work.

2. CLR (.NET) inside of SQL – Appears to be a viable option, but requires enabling CLR on SQL, which I personally would try to avoid on a GP SQL Server if possible, so I didn't try this one yet.

3. BCP – I was able to get this to work, and it was surprisingly easy, but I don’t know how easy it will be to integrate into a report process.

Since I was able to quickly get the BCP option to work, here are the commands I used.  If you are able to shell out to run BCP as part of a report or other process, this should work.  

For my testing, I attached a text file and a JPG image to a customer record, and was able to use these BCP commands to successfully extract both files and save them to disk.  

When I ran the BCP command, it asked me four questions and then prompted me to save those settings to a format file.  I accepted all of the default options except for the prefix length, which needs to be set to 0, and then saved the settings to the default format file name, bcp.fmt.
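For reference, the saved format file looked roughly like the sketch below (this is an approximation, not a copy of my actual file).  The first line is the bcp version and will match your SQL Server release; the third value on the column line is the prefix length, which must be 0 so that no length prefix is written into the extracted file:

```
12.0
1
1       SQLBINARY       0       0       ""      1       BinaryBlob      ""
```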

BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = '88d9e324-4d52-41fe-a3ff-6b3753aee6b4'" queryout "C:\Temp\DocAttach\TestAttach.txt" -T -f bcp.fmt

BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = '43e1099f-4e2b-46c7-9f1c-a155ece489fa'" queryout "C:\Temp\DocAttach\TestImage.jpg" -T -f bcp.fmt

I found that BCP was able to extract the files from the Document Attach table and save them, and I was able to then immediately open and view the files.  Dynamics GP does not appear to be compressing or otherwise modifying the files before saving them to the varbinary field, so no additional decoding is required.

Given how simple BCP is and how well it seems to work, I would probably opt for this approach over using .NET, either inside SQL or outside of SQL.  But if you are already developing a .NET app, then it's probably better to use .NET code to perform this extraction.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.


Friday, May 5, 2017

Updating .NET apps to support TLS 1.2 for Credit Card and ACH integrations

By Steve Endow

The payment gateway used by one of my customers is disabling support for TLS 1.0 and 1.1 in production as of September 18, 2017.  As of that date, it will only support TLS 1.2.  You can read more here:

I have a customer using a .NET integration with this gateway, so I reviewed my code and did some research on which protocols .NET supports and uses.

I reviewed my code and confirmed that I was not explicitly setting the TLS protocol version.  So I researched which versions of TLS were supported by which .NET versions, and how the version was chosen.

After reading a few posts on StackOverflow, I confirmed that .NET 4.5 does support TLS 1.2.  So that was good news.  After reading a few more posts, my understanding was that .NET auto-negotiates the protocol with the server, so if the server requires TLS 1.2, I thought that my .NET app should work fine.

So I tested against the gateway's developer sandbox, which has already been set to require TLS 1.2, and to my surprise, I received a connection error.  I researched the error and confirmed that it was due to the TLS 1.2 requirement.  But if .NET 4.5 supports TLS 1.2 and is able to negotiate the protocol version, why would my connection fail?

My only guess is that, in addition to requiring TLS 1.2, they have configured their systems to detect protocol negotiation requests and deny those connections.  My assumption is that this may be a measure to prevent surreptitious protocol downgrades, such as the POODLE vulnerability, which makes sense.

So I updated my application to include the following line of code:

System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

(You will also need a "using System.Net;" directive in your class.)

This explicitly tells System.Net to only use TLS 1.2.  After I added this line, my connections to the developer sandbox started working again.

Given this finding, I will need to prepare a new release and will have to work with the customer to deploy the new application before September 2017.

And one other small downside to this approach is that my application is now hard-coded to TLS 1.2.  But in practical terms, I am not concerned about this, as TLS 1.2 is the latest version available in .NET 4.5.  If a new version of TLS is released, the gateway will probably require that specific protocol, and I'll need to update the .NET version of my application anyway, so I'll have to prepare a new release regardless.
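If the hard-coding ever becomes a concern, one possible tweak (an untested sketch on my part) is to OR the TLS 1.2 flag into the existing protocol list rather than replacing it, so any protocols the runtime already enables by default remain available:

```csharp
// Add TLS 1.2 to the allowed protocols without removing the runtime defaults.
System.Net.ServicePointManager.SecurityProtocol |= System.Net.SecurityProtocolType.Tls12;
```

For this customer, though, locking the application to TLS 1.2 is exactly the behavior I want.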

UPDATE: And coincidentally, one hour after posting this article, I saw a discussion about TLS v1.3, which is apparently in the works:

So I learned a few things from this process, and fortunately the fix turned out to be very easy.
