I just finished upgrading one of the most complex Dynamics GP integrations I've ever seen. And I've worked on some pretty complex integrations, so that's saying something.
On the surface it might sound simple: send Dynamics GP SOP Orders to a third party logistics (3PL) warehouse, and receive shipping updates back from the 3PL warehouse.
No big deal, right?
The integration doesn't involve a bunch of different transaction types. It doesn't process millions of transactions. From the perspective of Dynamics it was quite simple--just a basic SOP Order integration. So what made it one of the most complex integrations I've ever worked on?
The data source.
The data source is the foundation of a system integration. It is the starting point of any integration design. And based on the integrations (developed by others) that I've seen at Dynamics GP customers, it is also one of the most neglected aspects. If you see an integration where a user has to perform 20 manual steps to prepare a data file for import to GP, that's a sign the data source was neglected.
The data source is where you start your discussions when you are designing your integration, and it may be where you spend a majority of your time when creating the integration.
If an internal system is automatically generating a CSV file containing the exact fields you need with perfectly clean data and saving it to an easily accessible network share, life is good. In that scenario, your data source has been handed to you on a silver platter--you just need to get the data into GP, which is the easy part.
But system integrations are rarely that easy. For example, see my other post, where I show how a simple formatting issue in a source data file can instantly complicate an otherwise simple integration.
So what made my recent project so complex?
The first "data source" for this integration was a SOAP web service hosted externally by the 3PL company, so all new Dynamics GP orders had to be sent to that web service. And that SOAP web service had its own strange proprietary XML format (what XML format isn't strange and proprietary?). So there's one complex data source.
But it wasn't just one "data source", as this was a two-way real-time integration. So after an order had been sent to the 3PL and subsequently shipped, the 3PL would send back a shipment notification. To receive the shipment notification, the Dynamics GP customer had to have a SOAP web service of its own to receive the proprietary XML message from the 3PL warehouse.
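To make the outbound half concrete, here is a minimal sketch of what posting a raw SOAP envelope to the 3PL might look like in .NET. The endpoint URL, SOAPAction value, and envelope contents are all hypothetical--the real service's WSDL dictates the specifics.

```csharp
// Minimal sketch (hypothetical endpoint and SOAPAction) of posting a raw
// SOAP envelope to a 3PL web service with .NET's HttpWebRequest.
using System.IO;
using System.Net;

class SoapClientSketch
{
    static string PostSoap(string url, string soapAction, string envelopeXml)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentType = "text/xml; charset=utf-8";
        request.Headers.Add("SOAPAction", soapAction);

        using (var writer = new StreamWriter(request.GetRequestStream()))
        {
            writer.Write(envelopeXml);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // The 3PL's response is itself a SOAP envelope that must be parsed
            return reader.ReadToEnd();
        }
    }
}
```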
Because we have to send and receive an XML format that is completely different from what eConnect uses, the developers chose to use XSLT to transform the XML. This is a perfectly good use of XSLT, but if you've ever worked with it, you probably also get headaches whenever you have to look at an XSL file. Unless you work with it regularly, it's a mind-bendingly arcane format, and these days I dread having to use it.
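Here is a minimal sketch of that transformation step using .NET's XslCompiledTransform. The file paths and stylesheet name are hypothetical, but this is the basic pattern for converting eConnect-style XML into the 3PL's format (and back again on the inbound side).

```csharp
// Minimal sketch: transform an eConnect XML document into the 3PL's format
// with an XSLT stylesheet. File names and paths are hypothetical.
using System.Xml.Xsl;

class XsltSketch
{
    static void Main()
    {
        var transform = new XslCompiledTransform();
        transform.Load(@"C:\Integration\SopOrderTo3PL.xslt");  // compile the stylesheet once
        transform.Transform(@"C:\Integration\SopOrder.xml",    // eConnect-style source XML
                            @"C:\Integration\3PLOrder.xml");   // 3PL-format output
    }
}
```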
Because the developers were apparently into cool tech, they decided to also throw MSMQ into this already complex mess. MSMQ is a great way to add unnecessary complexity, make it very hard for the customer to see what the integration is doing, and make it even more difficult to troubleshoot issues. Brilliant choice. Bravo, developers.
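For reference, here is roughly what the MSMQ plumbing looks like with System.Messaging; the queue path is hypothetical. Note that once a message is sitting in a queue, nothing in MSMQ itself tells you why it hasn't been processed--which is exactly the troubleshooting problem described below.

```csharp
// Minimal sketch of the MSMQ plumbing (System.Messaging, .NET Framework).
// The queue path is hypothetical.
using System;
using System.Messaging;

class MsmqSketch
{
    const string QueuePath = @".\Private$\GPOutbound3PL";

    static void Send(string orderXml)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(orderXml, "SOP Order"); // label shows up in the MSMQ admin console
        }
    }

    static string Receive()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            Message msg = queue.Receive(TimeSpan.FromSeconds(5)); // throws if nothing arrives in time
            return (string)msg.Body;
        }
    }
}
```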
So to recap, here is what this integration required:
1. Custom eConnect Requester triggers
2. Customized eConnect procedures (it was fun trying to figure this one out)
3. eConnect GetEntity (see the sketch after this list)
4. Outbound XML transformation with XSLT
5. MSMQ Outbound message queue
6. Windows Service to check for new MSMQ messages and submit them to 3PL SOAP web service
7. Custom IIS SOAP web service to receive shipment notifications from 3PL
8. MSMQ Inbound message queue
9. Inbound XML transformation with another XSLT
10. eConnect SOP Order update
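To illustrate items 3 and 10, here is a minimal sketch of the two eConnect calls involved, using the eConnectMethods class from Microsoft.Dynamics.GP.eConnect. The connection string and XML file names are hypothetical; the update document would typically carry a taSopHdrIvcInsert node with UpdateIfExists set to 1 so the existing SOP Order is updated rather than recreated.

```csharp
// Minimal sketch of the two eConnect calls in the list above (items 3 and 10).
// Connection string and XML file names are hypothetical.
using System.IO;
using Microsoft.Dynamics.GP.eConnect;

class EConnectSketch
{
    const string ConnString =
        "data source=GPSQL;initial catalog=TWO;integrated security=SSPI";

    // Item 3: pull the full SOP Order XML out of GP for the outbound message
    static string GetOrderXml(string requestFile)
    {
        using (var e = new eConnectMethods())
        {
            string requestXml = File.ReadAllText(requestFile); // eConnect Requester-style request document
            return e.GetEntity(ConnString, requestXml);
        }
    }

    // Item 10: apply the 3PL's shipment data back to the SOP Order
    static void UpdateOrder(string updateXmlFile)
    {
        using (var e = new eConnectMethods())
        {
            e.CreateEntity(ConnString, File.ReadAllText(updateXmlFile));
        }
    }
}
```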
While doing all of this, the developers chose to log only to the Windows Application Event Log. So if the customer happened to notice that orders weren't being sent or shipment notifications weren't being received, they had to log in to the server and dig through piles of Windows event log entries to try to find out what was going on. Or they could look at the pending messages in MSMQ, but that wouldn't explain what the problem was, only that messages were pending.
And in their apparent rush to push this mess off to the customer, the developers never enabled the email notifications that they had half-coded. I saw remnants of code referencing email, but it was incomplete and unused.
But wait, there's more!
The crowning achievement was that for this monstrosity of an integration, there was no documentation. There was only a single page with a partial list of the items that were required to install the integration.
As a result, it took me about 65 hours to make sense of this mess, upgrade the integration to GP 2015, remove MSMQ, add text file logging, and enable the email notifications. I had to gather all of the pieces, review the many chunks of code, figure out how all of it worked, and then work out all of the ancillary pieces like the IIS configuration, the custom eConnect configuration, and the modified procedures. It was quite difficult, requiring every bit of knowledge I had, from reviewing IIS log files to port bindings to Windows authentication to eConnect debugging and reverse engineering. It was rough.
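For what it's worth, the text file logging and email notifications I added were nothing exotic. This is not the actual project code, just a sketch of the general approach, with hypothetical paths, server names, and addresses:

```csharp
// Sketch (not the actual project code) of simple text file logging and
// email alerts. Log path, SMTP server, and addresses are hypothetical.
using System;
using System.IO;
using System.Net.Mail;

class NotifySketch
{
    static void Log(string message)
    {
        string line = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss") + "  " + message;
        string file = @"C:\Integration\Logs\3PL_" + DateTime.Today.ToString("yyyyMMdd") + ".log";
        File.AppendAllText(file, line + Environment.NewLine); // one dated log file per day
    }

    static void EmailAlert(string subject, string body)
    {
        using (var client = new SmtpClient("mail.example.com"))
        using (var msg = new MailMessage("gp-integration@example.com",
                                         "it-alerts@example.com", subject, body))
        {
            client.Send(msg);
        }
    }
}
```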
Without understanding the details, someone might say, "Well, why not replace the whole thing with something better or easier, or with an integration tool?" If anyone is still thinking that at this point, then I'm afraid they've missed the point.
Remember, it's the data sources that started this whole mess: a bi-directional external SOAP web service at the 3PL. No magic integration tool is going to get around the problems caused by that data source. The 3PL doesn't provide a RESTful API or JSON--believe me, that's the first question I asked. So we're stuck with the need to send XML to a SOAP web service and the need to have a SOAP web service of our own to receive XML. Sure, you could probably find an enterprise grade integration tool that could host such services, but you'll be spending over $10,000 to purchase and configure such a tool.
So something to keep in mind when evaluating Dynamics GP integrations: What are your data sources?
Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles. He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.