I've had several inquiries lately about how to create a "real time" integration between Dynamics GP and another system, and over the last few months I've worked on a number of integrations where clients wanted data transferred as quickly as possible.
I think there are at least three critical misconceptions around "real time" integrations, so I thought I would share some thoughts on the topic. By no means do I believe that I am "right" or that this is the only philosophy, but based on my experience and a few battle scars, I think my approach is pretty well grounded.
To give the discussion context, let's use an example. Acme Sales uses Dynamics GP and a third-party point-of-sale (POS) retail system. Dynamics GP is the system of record for all inventory items and all item/site records. Whenever a user adds an inventory item to Dynamics GP, Acme would like to have that item automatically added to the POS system. Also, whenever a user adds an item/site record to Dynamics GP, Acme wants that information sent to the POS system so that the item can be sold at the appropriate stores.
During an initial meeting, the Acme sales manager tells the accounting and IT staff that as soon as either an item or item/site record is added to Dynamics GP, those records need to be transferred immediately to the POS system so that the item can be sold. The IT developer thinks "Okay, I'll create a real time integration between GP and the POS."
Naturally, the first thing the developer does is to put code on the Dynamics GP Item Maintenance window to detect when a new item is created. Simple, right? Just detect when the Save button is clicked, and he can immediately transfer the information to the POS. He then finds the Item Quantities Maintenance window and tries to put code there to detect when new Sites are assigned to Items.
He soon realizes that this approach may not work so well. Once a month, the sales department receives a list of 500-1000 new or changed items from one of their distributors. This list is imported into GP using an eConnect integration. Because eConnect writes directly to the database and never touches the GP windows, monitoring the Save event on the Item Maintenance window won't catch those items, and he's not sure how he'll handle them.
Next, while trying to put code on the Item Quantities Maintenance window, he realizes that the Site ID field is, well, not your average Windows control, and that the events needed to detect the Item + Site combination are more complex than he realized.
While testing his prototype, he realizes that the connection between the GP workstations and his development POS environment is not always 100% reliable. Sometimes the code on the GP window can't complete, causing the GP user to experience a long pause or hang, and sometimes the POS update fails with an error. Not only does this display a confusing error to the user, but it also means that the record saved to GP was "lost", and didn't make it to the POS system.
Realizing that the POS update might sometimes fail, he then embarks on an additional project that will synchronize the GP records with the POS records. And so he begins to slide down the steep and slippery slope of a "real time" integration.
Given this story as background, here is my take on three critical misconceptions around "real time" integrations.
1) There is no such thing as a "real time" integration. This isn't a technical distinction, it's a business requirements distinction. Does Acme really need an inventory item that took a user 60 seconds to enter into GP transferred to the POS system less than a millisecond after the user clicks the Save button? I seriously doubt it. If it took 1 second for the item to transfer to the POS, would that be acceptable? How about 30 seconds? How about 5 minutes? My point is that once you abandon the idealized notion of "real time", you are left with the much more practical concept of latency. You will always have some latency in your integration; the question is how much latency is acceptable from a business requirements perspective. For the vast majority of businesses, I would argue that most integrations will work fine with 2-5 minutes of latency, and often much more. I prefer the term "near real time" integration, as it acknowledges that there will always be some latency.
2) A user interface is a terrible place to put integration code. When building a near real time integration, there is a tendency to focus on the source of the data, capturing it as soon as possible so that it can be transferred. This leads developers to want to capture the data as it is being saved on a window. Give this a try in Dynamics GP and I think you will quickly see how difficult it is to reliably detect these types of events. What if the user clicks the "X" to close the window and then clicks Save on the resulting dialog instead of clicking the main Save button? What if the user clicks the "next record" VCR button in the bottom left corner of the window and gets the Save dialog? You will learn that in Dynamics GP those two save dialogs can be different. And don't forget that sometimes the same data can be modified in two different windows, like the Customer Maintenance and Customer Address Maintenance windows! And as in the Acme example, what if data is inserted without using the user interface at all? How will you detect those new records? Instead of looking to the user interface, focus on the destination: the SQL database. No matter what the user does in the user interface, the data all ends up in one or more SQL Server tables, and table activity can be easily monitored using the eConnect Requester or custom triggers.
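To make that concrete, here is a rough sketch of the destination-focused approach. Rather than a trigger, this version simply polls the standard GP Item Master table (IV00101) for new rows, using the DEX_ROW_ID identity column as a watermark. The connection string, the polling interval, and the send_to_pos function are placeholders I've made up for illustration; in production you would more likely have the eConnect Requester or a custom trigger populate a staging table, but the idea of watching the tables instead of the windows is the same.

# Rough sketch in Python: watch the GP Item Master table (IV00101) for
# new rows. The connection details and send_to_pos() are hypothetical.
import time

import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=GPSQL;DATABASE=TWO;Trusted_Connection=yes;")

def send_to_pos(item_number, description):
    # Placeholder for the real POS API or web service call.
    print("Sending %s (%s) to POS" % (item_number, description))

def poll_new_items(last_row_id):
    # DEX_ROW_ID is the identity column GP adds to its tables, which
    # makes a convenient watermark for detecting new records.
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT DEX_ROW_ID, ITEMNMBR, ITEMDESC "
            "FROM IV00101 WHERE DEX_ROW_ID > ? ORDER BY DEX_ROW_ID",
            last_row_id)
        return cursor.fetchall()

def main():
    last_row_id = 0  # in production, persist this watermark between runs
    while True:
        for row in poll_new_items(last_row_id):
            send_to_pos(row.ITEMNMBR.strip(), row.ITEMDESC.strip())
            last_row_id = row.DEX_ROW_ID
        time.sleep(120)  # a couple of minutes of latency is usually plenty

if __name__ == "__main__":
    main()

Notice that no matter how the item got into GP, through the Item Maintenance window, an eConnect import, or anything else, it lands in IV00101 and gets picked up.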
3) Synchronous integrations don't work. As soon as you develop a complex integration that assumes the entire process can complete successfully in sequence, you will learn that there are inevitably going to be interruptions to that process. What if the network is flaky? What if the remote server, database, or web service is down? What if a password is changed or directory permissions change, causing an authentication failure? What if the process takes 5 seconds to complete instead of half a second? In these situations, a synchronous integration can fail or time out, potentially causing all sorts of problems: forcing a user to wait, throwing error messages, or losing data. It is best to always include some type of asynchronous step that can absorb an interruption at any point in the integration. This typically involves some type of queue that can store the data until it can be reliably integrated. In the case of the POS integration, that might be a database table that serves as a queue for new item numbers or item/site records that have been added to GP (such as the eConnect_Out table populated by the eConnect Requester). Or maybe you write the data out to a CSV file that can then be read by an import process on the POS system. There are also formal queuing platforms, such as Microsoft Message Queuing (MSMQ), but these are overkill for most Dynamics GP integrations.
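To illustrate the queue idea, here is a rough sketch of an asynchronous consumer. It drains a hypothetical POS_ITEM_QUEUE staging table (which a custom trigger or the eConnect Requester would populate) and marks a row as processed only after the POS call succeeds. If the POS system is down, the rows simply wait for the next pass; nothing is lost, and no GP user is ever left hanging. The table name, its columns, and the pos_update function are all made up for illustration.

# Rough sketch: drain a hypothetical POS_ITEM_QUEUE staging table and
# mark each row processed only after the POS update succeeds. The table,
# its columns, and pos_update() are made-up placeholders.
import logging

import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=GPSQL;DATABASE=TWO;Trusted_Connection=yes;")

def pos_update(item_number, site_id):
    # Placeholder for the real POS call; raises an exception on failure.
    pass

def drain_queue():
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT QUEUE_ID, ITEMNMBR, LOCNCDE FROM POS_ITEM_QUEUE "
            "WHERE PROCESSED = 0 ORDER BY QUEUE_ID")
        for row in cursor.fetchall():
            try:
                pos_update(row.ITEMNMBR.strip(), row.LOCNCDE.strip())
            except Exception:
                # POS unreachable or the call failed: leave the row in the
                # queue so the next pass retries it. Nothing is lost, and
                # no GP user is ever blocked waiting on the POS system.
                logging.exception("POS update failed; will retry next pass")
                break
            cursor.execute(
                "UPDATE POS_ITEM_QUEUE SET PROCESSED = 1 WHERE QUEUE_ID = ?",
                row.QUEUE_ID)
            conn.commit()

if __name__ == "__main__":
    drain_queue()  # schedule this every few minutes with Task Scheduler

Schedule something like this to run every few minutes and you have your "near real time" integration: the GP side never waits on the POS, and the queue absorbs any outage.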
Hopefully this discussion of three common misconceptions helps you understand what to avoid in your integrations; steering clear of these pitfalls will save you a lot of time and frustration. Once you get past them, there are many different design options you can use to create fast, robust, and low maintenance integrations.