I've been working on a few different Dynamics GP integrations lately that seem to have a common theme. The Dynamics GP side of things is fine, but it's the external systems that are sending data to, or receiving data from Dynamics GP that have some glaring limitations. I understand that all software applications have limitations or are sometimes tasked to do things for which they weren't designed, but the issues I've run into recently seem to be pretty obvious limitations in the applications' interfaces--something that I actually don't run into very often.
So here are some lessons on how not to design an integration.
The first is an import from a widely used HR and payroll system to GP. I was asked to import employees from a CSV file into Dynamics GP. No problem, relatively straightforward in concept. We worked with the client to design the interface, and one of the questions we had was "How often do you want the employee information imported into Dynamics GP?" Because they wanted the ability to set up a new employee in GP on short notice, they asked that the import run at least hourly. Great, not a problem. We documented this requirement and moved on to other topics. Well, when we shared the requirements with the HR system vendor, we were told that the hourly import requirement was not possible. What? We weren't asking for near real-time, just hourly. We were told that the export from the HR system could only be run once per day. Period. End of conversation.
Well, from what I can glean, it seems that the HR system wasn't really designed to export new or changed data. The vendor had figured out a way to do it using extra tables to maintain copies of the employee table that were compared to the live table, but my understanding is that the design was based on a date stamp field. Not a date time stamp--just a date stamp. Being a GP developer, I'm not about to throw stones at any system that only uses a date stamp in their database tables (cough, cough, GP, cough--although GP 10 and 2010 are starting to sneak time stamps into some tables). But to design an export process that can only export once per day? What were they thinking? So, we're stuck with a once-per-day integration, at least until the client complains enough to see if the HR vendor will change their export routine.
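To illustrate why a date-only stamp caps the export at once per day, here's a minimal sketch. The function names and dates are hypothetical, not anything from the actual HR system, but they show the core problem: with only a date field, there is no way to ask for "changes since the last run at 10:00 AM".

```python
from datetime import date, datetime

def needs_export_date_only(row_modified: date, last_run: datetime) -> bool:
    # Date-only granularity: every row modified today looks the same,
    # so the comparison can only resolve to whole days.
    return row_modified >= last_run.date()

def needs_export_datetime(row_modified: datetime, last_run: datetime) -> bool:
    # Full timestamp: supports hourly (or finer) incremental exports.
    return row_modified > last_run

# Hypothetical scenario: the last export ran at 10:00 AM, and a row
# changed at 9:00 AM the same day (already exported by that run).
last_run = datetime(2011, 5, 2, 10, 0, 0)
changed_this_morning = datetime(2011, 5, 2, 9, 0, 0)

# The date-only check can't tell the row was already handled, so it
# keeps flagging it for re-export all day long.
print(needs_export_date_only(changed_this_morning.date(), last_run))  # True (false positive)
print(needs_export_datetime(changed_this_morning, last_run))          # False
```

With only a date to compare, any run after the first one on a given day either re-exports the same rows or has to skip the whole day, which is presumably why the vendor settled on once per day.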
The next integration involved exporting purchase orders from Dynamics GP and sending them to a third party logistics warehouse where the orders would be received. I had already developed the PO export with both e-mail and FTP capability--we were just adding a new warehouse into the mix. We prepared a sample PO export file and dropped it on their FTP site. They responded that our test file was invalid because it didn't comply with their undocumented file naming convention. Okay, no problem. Except here is the file naming convention:
AmmddssXYZ.CSV
The letter "A" represents a fixed prefix, and the XYZ is the fixed suffix--no problem. And mmdd is the month and day. But ss? We were told that ss represents a sequence number. And the file names had to be unique--a file name could not be repeated during a given day. So, we could either increment a sequence number, or we could use a two-digit hour. Which means that we would either be limited to exporting no more than once per hour, or, if we used a sequence, the client would not be able to send more than 99 files per day. This GP site generates millions of sales and purchasing transactions per year, so 99 transactions per day in any category is a very small number. Fortunately, because these are purchase orders that won't need to be received immediately by the warehouse, hourly exports were okay. But in this day and age, a limit of 99 was surprising to see, especially since the limit only exists because of a two-character field within a file name.
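A quick sketch of the naming convention makes the ceiling obvious. The "A" and "XYZ" below stand in for the actual fixed prefix and suffix (which I'm not reproducing here), and the function name is my own:

```python
from datetime import datetime

def make_export_filename(seq: int, when: datetime) -> str:
    """Build a file name in the warehouse's AmmddssXYZ.CSV pattern.
    Because ss is a two-digit sequence field, only 99 unique names
    can exist for any given day."""
    if not 1 <= seq <= 99:
        raise ValueError("ss is two digits: at most 99 files per day")
    return f"A{when:%m%d}{seq:02d}XYZ.CSV"

print(make_export_filename(7, datetime(2011, 5, 2)))  # A050207XYZ.CSV
```

One more character in that sequence field would have raised the ceiling to 999 files per day; a full timestamp would have removed it entirely.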
The next one felt like a computing time warp. After submitting the same PO files to the warehouse for testing, we were told that our file "failed and caused a hard fault". An innocent little CSV file caused a "hard fault"? Should I even ask what that means? Is there such a thing as a "soft fault"? So, what caused this hard fault? A comma. Wait, a comma in a CSV file caused a failure? Yessiree. It wasn't a comma delimiter that caused the problem, it was a comma within a field value that caused the problem. It was a vendor name like "Company, Inc." Even though my CSV file had quotation marks around the value, the vendor's system was not designed to parse quotation marks in the CSV file, and was unable to handle commas that are not delimiters. So, in this vendor's world, they apparently don't have any commas in their system--just pop that key off of the keyboard. I'm guessing that with some ingenuity, even a 1970s era mainframe could handle commas within quoted field values. But not this import routine. So I had to modify my standard CSV export routine to eliminate quotation marks and also remove commas from all field values in the CSV file. Brilliant.
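The workaround amounts to a one-line sanitizing step before the row is written. This is just a sketch of the idea (my actual export routine is in .NET, and the field values here are made up), but it shows the compromise: drop the quotation marks and strip embedded commas, because the receiving system treats every comma as a delimiter.

```python
def to_legacy_csv_row(values):
    """Serialize one record for an importer that cannot parse quoted CSV.
    Quotation marks are removed, and commas inside field values are
    stripped entirely so the only commas left are delimiters."""
    cleaned = [str(v).replace('"', '').replace(',', '') for v in values]
    return ','.join(cleaned)

row = to_legacy_csv_row(["PO12345", 'Company, Inc.', "150.00"])
print(row)  # PO12345,Company Inc.,150.00
```

The cost, of course, is that the data is silently altered: "Company, Inc." arrives at the warehouse as "Company Inc." -- which is exactly the kind of compromise a properly quoted-CSV parser would have made unnecessary.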
If you are developing an internal integration that has very limited functionality, these types of design limitations might be acceptable or go unnoticed, but both of these vendors have systems that have to interact with other external software applications. To not add that time stamp, to not add a third character to the file sequence number, and to not bother to parse quoted CSV field values just seems like poor integration design. Or negligent laziness.
On the other hand, I have a story about an interface that was reasonably designed, but still ran into issues. A Dynamics GP ISV solution was set up to export transactions to CSV files, which were then imported by an external system. The developer used a file naming convention that included year, month, day, hour, minute, and second--perfectly reasonable given the requirements and design assumptions that were provided. The integration worked fine--for a while. But after a few months, the client's volume increased to the point where so many transactions were being processed so frequently that files were being overwritten several times within a one second period. We realized that it isn't often that a typical business system needs to have a file naming convention that includes milliseconds, but for various reasons, this was one such situation. I forget the details, but I believe for some reason it wasn't practical to include milliseconds in the file name, or perhaps even the milliseconds were not good enough, so we ended up generating a random number that would be appended as a suffix on the file name. Even if 20 files were generated per second, or 20 in the same millisecond, it was unlikely that two would happen to get the same random suffix of "273056". Sometimes you just have to get creative.
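The random-suffix approach can be sketched in a few lines. The prefix and exact format below are my own invention, not the ISV's actual convention, but the structure is the same: a timestamp down to the second, plus a random six-digit suffix to break ties within the same second.

```python
import random
from datetime import datetime

def unique_export_filename(prefix: str = "TRX") -> str:
    """Timestamp to the second plus a random 6-digit suffix. Even if
    many files are generated in the same second, the odds of two of
    them drawing the same suffix are about 1 in a million per pair."""
    stamp = datetime.now().strftime("%Y%m%d%H%M%S")
    suffix = f"{random.randint(0, 999999):06d}"
    return f"{prefix}_{stamp}_{suffix}.csv"
```

It's worth noting this only makes collisions improbable, not impossible; a belt-and-suspenders version could check whether the file already exists and re-draw the suffix, but for this client the probabilities were more than good enough.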
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com