Tuesday, May 9, 2017

Extract Dynamics GP Document Attach Files From SQL Server and Save To Disk

By Steve Endow

(I have another post showing how to use .NET to extract Doc Attach files:  https://dynamicsgpland.blogspot.com/2017/05/extract-and-save-dynamics-gp-document.html)

Today I was asked if there was a way to read an image file from the Dynamics GP Document Attach table so that the image can be added to a report.  For instance, suppose a customer wants to display item images on an invoice.

I have done very little work with the blob / varbinary field type in SQL Server, so I didn't know how difficult it would be to do this.  I quickly did some research and testing, and while I don't have a complete solution for inserting images onto a report, I did test one method for extracting files from the Dynamics GP Document Attach table and saving them to disk.

From what I could tell, there are at least three standard SQL Server tools/commands that you can use to extract an image or file from a varbinary field.

1. OLE Automation Query – This looks pretty sketchy and appears to be very poorly documented.  I tried some samples and couldn’t get it to work.

2. CLR (.NET) inside of SQL – Appears to be a viable option, but it requires enabling CLR on the SQL Server, which I would personally try to avoid on a GP SQL Server if possible, so I haven't tried this option yet.

3. BCP – I was able to get this to work, and it was surprisingly easy, but I don't know how easy it will be to integrate into a report process.


Since I was able to quickly get the BCP option to work, here are the commands I used.  If you are able to shell out to run BCP as part of a report or other process, this should work.  


For my testing, I attached a text file and a JPG image to a customer record, and was able to use these BCP commands to successfully extract both files and save them to disk.  

When I ran the BCP command interactively, it asked me four questions and then prompted me to save those settings to a format file.  I accepted the default answer for every prompt except the prefix length, which needs to be set to 0, and then saved the settings to the default format file name, bcp.fmt.



BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = '88d9e324-4d52-41fe-a3ff-6b3753aee6b4'" queryout "C:\Temp\DocAttach\TestAttach.txt" -T -f bcp.fmt

BCP "SELECT BinaryBlob FROM TWO..coAttachmentItems WHERE Attachment_ID = '43e1099f-4e2b-46c7-9f1c-a155ece489fa'" queryout "C:\Temp\DocAttach\TestImage.jpg" -T -f bcp.fmt


I found that BCP was able to extract the files from the Document Attach table and save them, and I was able to then immediately open and view the files.  Dynamics GP does not appear to be compressing or otherwise modifying the files before saving them to the varbinary field, so no additional decoding is required.

Given how simple BCP is and how well it seems to work, I would probably opt for this approach over using .NET, either inside SQL or outside of SQL.  But if you are already developing a .NET app, then it's probably better to use .NET code to perform this extraction.
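
If you do end up using .NET, the core of the extraction is just reading the BinaryBlob column as a byte array and writing it to disk.  Here is a minimal sketch along those lines, not production code: the connection string and output path are placeholders, and the Attachment_ID is the one from the first BCP example above.

using System.Data.SqlClient;
using System.IO;

class DocAttachExtract
{
    static void Main()
    {
        //Placeholder connection string and output path
        string connectionString = "Data Source=GPSQL;Initial Catalog=TWO;Integrated Security=True;";
        string attachmentId = "88d9e324-4d52-41fe-a3ff-6b3753aee6b4";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT BinaryBlob FROM coAttachmentItems WHERE Attachment_ID = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", attachmentId);
            conn.Open();

            //The varbinary(max) column comes back as a byte array
            byte[] fileBytes = (byte[])cmd.ExecuteScalar();

            File.WriteAllBytes(@"C:\Temp\DocAttach\TestAttach.txt", fileBytes);
        }
    }
}

The post linked at the top of this article covers the .NET approach in more detail.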


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+



Friday, May 5, 2017

Updating .NET apps to support Authorize.net TLS 1.2 for Credit Card and ACH integrations

By Steve Endow

Authorize.net is disabling support for TLS 1.0 and 1.1 in production as of September 18, 2017.  As of that date, they will only support TLS 1.2.  You can read more here:

http://app.payment.authorize.net/e/es.aspx?s=986383348&e=1411090


I have a customer using a .NET integration with Authorize.net, so I reviewed my code and confirmed that I was not explicitly setting the TLS protocol version.  I then researched which versions of TLS are supported by each .NET version, and how the protocol version is chosen.

After reading a few posts on StackOverflow, I confirmed that .NET 4.5 does support TLS 1.2, which was good news.  After reading a few more posts, my understanding was that .NET auto-negotiates the protocol with the server, so if Authorize.net requires TLS 1.2, I thought my .NET app should work fine.

So I tested against the Authorize.net developer sandbox, which has already been set to require TLS 1.2, and to my surprise, I received a connection error.  I researched the error and confirmed that it was due to the TLS 1.2 requirement.  But if .NET 4.5 supports TLS 1.2 and is able to negotiate the protocol version, why would my connection fail?

My only guess is that, in addition to requiring TLS 1.2, Authorize.net has configured its systems to detect protocol negotiation requests and deny those connections.  My assumption is that this may be a measure to prevent surreptitious protocol downgrades, such as the POODLE vulnerability, which makes sense.

So I updated my application to include the following line of code:

System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

(You will also need a "using System.Net;" directive in your class.)

This explicitly tells System.Net to only use TLS 1.2.  After I added this line, my connections to the Authorize.net developer sandbox started working again.

Given this finding, I will need to prepare a new release and will have to work with the customer to deploy the new application before September 2017.

And one other small downside to this approach is that my application is now hard coded to TLS 1.2.  But in practical terms, I am not concerned about this, as TLS 1.2 is the latest version available in .NET 4.5.  If a new version of TLS is released, Authorize.net will probably require that specific protocol, and I'll need to update the .NET version of my application anyway, so I'll have to prepare a new release regardless.
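
One related note: SecurityProtocolType is a flags enum, so if you would rather not pin the application to a single protocol version, multiple protocols can be enabled at once.  Here is a minimal sketch of that alternative (my own example, not something Authorize.net requires):

using System.Net;

static class TlsSetup
{
    public static void EnableTls()
    {
        //Enable TLS 1.0, 1.1, and 1.2 at the same time by combining the flag values.
        //Call this once, early in application startup, before any requests are created.
        ServicePointManager.SecurityProtocol =
            SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
    }
}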

UPDATE: Coincidentally, one hour after posting this article, I saw a discussion about TLS 1.3, which is apparently in the works:

https://github.com/tlswg/tls13-spec/issues/1001


So I learned a few things from this process, and fortunately the fix turned out to be very easy.











Wednesday, April 26, 2017

Automatically generate and create batch IDs in Dynamics GP using VBA

By Steve Endow

This has probably been done a few thousand times by a few hundred people over the years, but I don't recall ever doing it before.

A customer wants to automatically generate batch IDs in Dynamics GP.  They don't want to have to type the batch ID or create batches when entering transactions.  They want to be able to open the Payables Transaction Entry window, have the Batch ID automatically generated, and just start entering a transaction.

Here is a link to a Word document with the instructions and the script.

           https://1drv.ms/w/s!Au567Fd0af9Tmh3tbpX2qrAXYfuY


Once the code is installed, if you open the Payables Transaction Entry window, a new batch is automatically created and the batch ID value is populated in the Batch ID field.


The customer didn't want to use the full date, but wanted a sequential numeric batch ID that would be unique.  I used the user ID, plus the two-digit year, plus the day of the year.  This ensures a unique batch ID for each user each day, and it's a way of incorporating the date into the batch ID without using the full date.


Make sure to follow the instructions in the Word doc and add a VBA reference to the Microsoft ActiveX Data Objects 2.8 Library in the VBA Editor.


Option Explicit

Dim conn As ADODB.Connection
Dim cmd As New ADODB.Command

Private Sub Window_BeforeOpen(OpenVisible As Boolean)
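  'Open an ADO connection to the current company database using the GP user's credentials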
  Set conn = UserInfoGet.CreateADOConnection
  conn.DefaultDatabase = UserInfoGet.IntercompanyID
  cmd.ActiveConnection = conn
  cmd.CommandType = 1  'adCmdText
End Sub

Private Sub Window_AfterOpen()

    Dim interID As String
    Dim userID As String
    Dim shortYear As String
    Dim dayNumber As String
    
    Dim currentDate As Date
    Dim sqlDate As String
    Dim batch As String
    
    Dim sql As String
    
    interID = UserInfoGet.IntercompanyID  'Get the database name (INTERID)
    userID = UCase(UserInfoGet.userID)  'Get the GP user ID
    
    currentDate = DateTime.Now
    sqlDate = Format(currentDate, "yyyy-MM-dd")

    shortYear = Right(DateTime.year(currentDate), 2) '2 digit year
    dayNumber = DatePart("y", currentDate) 'Get day number (out of 365 / 366)
    batch = userID & "-" & shortYear & dayNumber  'Ex:  jdoe-17134
    
    'Create the batch if it does not already exist
    sql = "EXEC " & interID & "..taCreateUpdateBatchHeaderRcd '" & batch & "', '', 4, '" & sqlDate & "', 'PM_Trxent', 0, 1, 0, '', 0, 0, 1, 0, 0, 0, 0, '" & userID & "', 1, 0, 0, 0, '1900-01-01', '', '', '', '', '', '', '', 0, ''"
    cmd.CommandText = sql
    cmd.Execute
    
    'MsgBox (sql)
    
    'Assign the new batch ID to the batch field on the window
    batchID.Value = batch
    
End Sub











Enhancing the reliability of complex Dynamics GP integrations

By Steve Endow

I had a call today with a customer who is implementing several complex Dynamics GP integrations.  One of the integrations involves the following steps:

1. A SmartConnect map imports a large AP Invoice batch once per day
2. Once the batch is fully imported, SmartConnect inserts a database record telling Post Master Enterprise that the batch is ready to post
3. Post Master Enterprise posts the AP Invoice batch
4. Once the batch is finished posting, Post Master Enterprise automatically fires a stored procedure in which the customer will place a custom SQL script
5. The stored procedure will verify the batch posted successfully and then call a SQL command to trigger a second SmartConnect map
6. SmartConnect will then import AP Manual Payments and apply the payments to the posted AP Invoice batch
7. Once SmartConnect is done importing the Payments and Applications, it will then insert another database record telling Post Master to post the AP Manual Payment batch

In theory, this process will work just fine, as both SmartConnect and Post Master can be configured to coordinate in this manner.

But...

In any system integration with many steps or processes, there is a chance that one of the steps may not work properly, may encounter an error, may fail, may not fire, etc.

And in any system integration that is synchronous and tightly coupled like this one, any such problem in the chain of events can prevent the rest of the process from completing.  What if in Step 2, the SmartConnect map has a problem telling Post Master that the batch is ready to post?  What if in Step 3, Post Master finds that the Fiscal Period in Dynamics GP is closed, and it is unable to post the batch?  What if the batch fails to post due to a data error?  What if 682 out of 700 Invoices post successfully, but 18 fail to post due to invalid distributions? What if the custom SQL script in Step 4 encounters an error or table lock that prevents it from completing successfully?  What if SmartConnect encounters an error applying the Manual Payments to the posted Invoices?

In isolation, each of these issues is relatively small, but collectively, there are probably at least a dozen such minor problems that could potentially prevent the entire sequence from completing successfully.  Between the 7 plus different steps in this system integration and the dozen plus potential errors that can cause a failure, I would anticipate that this integration will have some reliability issues over time.

But that's okay.  That's what happens with complex systems: you often have equally complex failures and reliability challenges.

So how do you deal with this?  How do you manage such a complex integration?  And how do you increase reliability?

While speaking with the customer, a few things came to mind.

1. Error notifications.  Both SmartConnect and Post Master Enterprise can be configured to send notifications in case of an error.  As soon as a problem occurs, an email should be sent to a power user or administrator that has the tools to resolve the problem.

2. Proactive monitoring.  Sometimes problems occur, but notifications don't get sent or received, or are simply missed in the torrent of email that we all receive daily.  To supplement the error notifications, a monitoring job can be created that independently checks the status of the process.  Did the Invoices get imported by 1pm?  Did the Invoices get posted by 2pm?  Did the Manual Payments get imported and applied by 3pm?  Did the Manual Payments get posted by 4pm?  Each of these checks is just a simple SQL query against the Dynamics GP company database (see the sketch after this list), and these checks serve as a second layer of notification in case there is a problem or delay.

3. Asynchronous design.  In my experience, synchronous, tightly coupled system integrations tend to be less reliable than asynchronous, loosely coupled integrations.  So an integration could potentially be modified or redesigned to decouple one or more tasks in the chain, adopting more of a queue-based integration rather than a real-time integration.  In this particular integration, that would likely be challenging and would require a redesign.

4. Integration "Supervisor".  Rather than SmartConnect and Post Master working independently and simply handing off messages to each other, a "Supervisor" process or solution could be utilized that manages, or supervises, the entire process.  The Supervisor asks SmartConnect to run a map, monitoring that task until the import is complete.  It then hands a request to Post Master, monitoring that task until the posting is complete.  Rather than having to monitor two different applications and get notifications from each, this central supervisor service could potentially reduce the administration and monitoring of the overall process.

While these steps may help manage the process, they won't directly resolve the issues that cause failures or reliability issues in each step.  To increase reliability, I think you have to target the specific failures and eliminate the causes.  Increase data validation, add error handling, design each step to be able to proceed if some errors occur, and also utilize diagnostics to quickly track down the causes of errors.

There isn't a single answer for managing complex system integrations or improving reliability, but I think it is always a good idea to approach such complex integrations with an understanding of the potential errors and failures and an expectation that some additional measures may be required to increase reliability.

What have you done to make your complex system integrations easier to manage?  Have you used any techniques to increase overall reliability?

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+






Saturday, April 22, 2017

Importing Paid AP Invoices into Dynamics GP: Payment Numbers and Distributions

By Steve Endow

I recently developed a Dynamics GP import that created AP Invoices for employee expenses.  One thing that was different with this integration was that some of the employee expenses had already been reimbursed or paid, either with cash or by credit card.

It seemed pretty straightforward at first, but while designing the integration, I learned two interesting things.  When importing Paid AP Invoices into Dynamics GP using eConnect, you'll want to pay attention to the Payment Numbers and Distributions, as they will vary depending on whether the invoice was paid with Cash, Check, or Credit Card.

When importing an AP Invoice paid via Credit Card, the Payment Number field on the Payables Credit Card Entry window should be populated with the next AP Voucher Number, not the next AP Payment Number.  And the Credit distribution for an invoice paid with a Credit Card should have a distribution Type of Cash, with the GL account being the AP account of the Credit Card vendor (my customer uses a different AP account for their Credit Card vendor).

Here's a video sharing what I learned about the proper values for Payment Numbers and Distributions when importing the paid invoices using eConnect:

https://www.youtube.com/watch?v=_DTK_jdzT1Y



Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+

http://www.precipioservices.com



Bug in GP ribbon buttons when AA is installed

By Steve Endow

Today I was doing some testing in GP 2016 R2, where I have Analytical Accounting installed, reviewing some imported AP Invoices.  I opened the invoice, clicked on Distributions, and then moved my mouse up to the ribbon to click on the Default button.

But...the Default button wasn't in the ribbon.


I only had View, Additional, File, and Tools buttons in the ribbon.  I was puzzled for a few seconds until I noticed the Delete and Default buttons were at the bottom of the window.

Monday, April 3, 2017

Dynamics 365 Financials Training - Day 4: Sales, Bank Rec, and GL

By Steve Endow

Today I attended day 4 of the Dynamics 365 Financials training class organized by Liberty Grove.

http://libertygrove.com/services/training/dynamics-365-for-financials-training-ndpm/


Here are links to my summaries of the prior days of the class:

Day 1 - Navigation, Company Setup, and Posting Groups

Day 2 - Dimensions, more Setup, and Financial Reporting

Day 3 - Payables and Inventory


Day 4 covered a lot of content and I found several items that I think are important to note, so this will be a long one.

And on a side note, I think I'm going to organize a contest to come up with some good abbreviations for the D365 products, since constantly typing out Dynamics 365 Financials gets old.  And I'm obviously not the only one with this issue:


I may use D365Fin or D3Fin or maybe even D3F eventually, but for now I've set up a keyboard shortcut in my FastKeys app that will automatically replace "d3" with "D365 Financials" as I type it.

And with that, let's get started.