Friday, April 29, 2016

What makes a Dynamics GP integration simple or complex?

By Steve Endow

I just finished upgrading one of the most complex Dynamics GP integrations I've ever seen.  And I've worked on some pretty complex integrations, so that's saying something.

On the surface it might sound simple:  Send Dynamics GP SOP Orders to a third party logistics (3PL) warehouse, and receive shipping updates from the 3PL warehouse.

No big deal, right?

The integration doesn't involve a bunch of different transaction types.  It doesn't process millions of transactions.  From the perspective of Dynamics GP, it was quite simple--just a basic SOP Order integration.  So what made it one of the most complex integrations I've ever worked on?

The data source.

The data source is the foundation for a system integration.  It is the starting point of any system integration design.  And based on the integrations (developed by others) that I've seen at Dynamics GP customers, it is also one of the most neglected aspects.  If you see an integration where a user has to perform 20 manual steps to prepare a data file for import to GP, that's an example where the data source was neglected.

The data source is where you start your discussions when you are designing your integration, and it may be where you spend a majority of your time when creating the integration.

If an internal system is automatically generating a CSV file containing the exact fields you need with perfectly clean data and saving it to an easily accessible network share, life is good.  In that scenario, your data source has been handed to you on a silver platter--you just need to get the data into GP, which is the easy part.

But system integrations are rarely that easy.  For an example, see my other post where I show how a simple formatting issue in a source data file can instantly complicate an otherwise simple integration.

So what made my recent project so complex?

The first "data source" for this integration was a SOAP web service hosted externally by the 3PL company.  So all new Dynamics GP orders had to be sent to the 3PL web service.  And that SOAP web service had it's own strange proprietary XML format (what XML format isn't strange and proprietary?).  So there's one complex data source.

But it wasn't just one "data source", as this was a two-way real-time integration.  So after an order had been sent to the 3PL and subsequently shipped, the 3PL would send back a shipment notification.  To receive the shipment notification, the Dynamics GP customer had to have a SOAP web service of its own to receive the proprietary XML message from the 3PL warehouse.
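To give a sense of what hosting that piece involves, here is a minimal sketch of an inbound SOAP endpoint written as a WCF service contract.  The names here are hypothetical, and the original integration may well have used a classic ASMX web service instead:

using System.ServiceModel;

// Hypothetical contract for the inbound shipment notification service.
[ServiceContract]
public interface IShipmentNotificationService
{
    // The 3PL posts its proprietary shipment XML to this operation.
    [OperationContract]
    string SubmitShipmentNotification(string shipmentXml);
}

public class ShipmentNotificationService : IShipmentNotificationService
{
    public string SubmitShipmentNotification(string shipmentXml)
    {
        // The real integration would transform and queue the XML for processing;
        // this sketch just acknowledges receipt.
        return "OK";
    }
}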


Because we have to send and receive an XML format that is completely different from what eConnect uses, the developers chose to use XSLT to transform the XML.  This is a perfectly good use of XSL, but if you've ever worked with XSL, you probably also get headaches whenever you have to look at an XSL file.  Unless you work with it regularly, it's a mind-bendingly arcane format, and these days I dread having to use it.
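The transform call itself is the easy part.  Here is a minimal sketch in C#, with hypothetical file names passed in:

using System.Xml.Xsl;

class OrderXmlTransformer
{
    // Apply an XSLT stylesheet to an eConnect-style XML document to produce
    // the 3PL's format (or vice versa for inbound messages).
    public static void Transform(string sourceXmlPath, string xsltPath, string outputXmlPath)
    {
        var xslt = new XslCompiledTransform();
        xslt.Load(xsltPath);                          // compile the stylesheet
        xslt.Transform(sourceXmlPath, outputXmlPath); // write the transformed XML to the output file
    }
}

The hard part, of course, is writing and maintaining the XSL stylesheet itself.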

Because the developers were apparently into cool tech, they decided to also throw MSMQ into this already complex mess.  MSMQ is a great way to add unnecessary complexity, make it very hard for the customer to see what the integration is doing, and make it even more difficult to troubleshoot issues.  Brilliant choice.  Bravo, developers.
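For those who haven't had the pleasure, pulling a message off an MSMQ queue looks roughly like this.  This is just a sketch with a hypothetical queue path, not the actual integration code:

using System;
using System.Messaging;  // classic MSMQ API

class OrderQueueReader
{
    // Receive the next pending order XML from a private MSMQ queue.
    public static string ReceiveNextOrderXml()
    {
        using (var queue = new MessageQueue(@".\Private$\GPOutboundOrders"))
        {
            // The messages are assumed to contain a single XML string.
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

            // Receive blocks until a message arrives or the timeout expires
            // (a MessageQueueException is thrown on timeout).
            Message message = queue.Receive(TimeSpan.FromSeconds(5));
            return (string)message.Body;
        }
    }
}

Simple enough in isolation, but the customer has no visibility into any of it.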

So to recap, here is what this integration required:

1. Custom eConnect Requester triggers
2. Customized eConnect procedures (it was fun trying to figure this one out)
3. eConnect GetEntity
4. Outbound XML transformation with XSLT
5. MSMQ Outbound message queue
6. Windows Service to check for new MSMQ messages and submit them to 3PL SOAP web service
7. Custom IIS SOAP web service to receive shipment notifications from 3PL
8. MSMQ Inbound message queue
9. Inbound XML transformation with another XSLT
10. eConnect SOP Order update

While doing all of this, the developers chose to only log to the Windows Application Event Log.  So if the customer happened to notice that orders weren't being sent or shipment notifications weren't being received, they had to log in to the server and dig through piles of Windows event log entries to try to find out what was going on.  Or they could look at the pending messages in MSMQ, but that wouldn't explain what the problem was, only that messages were pending.

And in their apparent rush to push off this mess to the customer, the developers never enabled the email notifications that they half-coded.  I saw partial remnants of code referencing email, but it was incomplete and not being used.

But wait, there's more!

The crowning achievement was that for this monstrosity of an integration, there was no documentation.  There was only a single page with a partial list of the items that were required to install the integration.

As a result, it took me about 65 hours to make sense of this mess, upgrade the integration to GP 2015, remove MSMQ, add text file logging, and enable email notifications.  I had to gather all of the pieces, review the many chunks of code, figure out how all of the code worked, and then figure out all of the ancillary pieces like the IIS and custom eConnect configuration and modified procedures.  And it was quite difficult, requiring me to pull out every bit of knowledge I had, from reviewing IIS log files to port bindings to Windows authentication to eConnect debugging and reverse engineering.  It was rough.

Without understanding the details, someone might say, "Well, why not replace the whole thing with something better or easier, or with an integration tool?"  If anyone is still thinking that at this point, then I'm afraid they've missed the point.

Remember, it's the data sources that started this whole mess.  A bi-directional 3PL external SOAP web service.  No magic integration or tool is going to get around the problem caused by that data source.  The 3PL doesn't provide a RESTful API or JSON--believe me, that's the first question I asked.  So we're stuck with the need to send XML to a SOAP web service and the need to have a SOAP web service of our own to receive XML.  Sure, you could probably find an enterprise-grade integration tool that could host such services, but you'll be spending over $10,000 to purchase and configure such a tool.

So something to keep in mind when evaluating Dynamics GP integrations:  What are your data sources?

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Thursday, April 21, 2016

Dynamics GP "posting date" vs. "posted date" confusion

By Steve Endow

I received a call from an experienced GP consultant who said his customer was having a serious problem with their posted batches.  When the client posted their batches on April 21, one of the batches had a posting date of April 20!  How is this possible!

This is probably one of the most common questions related to GP, and the similar-sounding date names confused me as well when I first started working with the product.  I was a bit surprised when the GP consultant asked me to look into this "issue", as all GP consultants should know this distinction cold.

If you aren't 100% confident and 100% clear on the difference between a Document Date, Posting Date, and Posted Date in Dynamics GP, please, please, please read this brief post by Victoria Yudin.

https://victoriayudin.com/2009/09/08/whats-with-all-these-dates/


If you open GP on Thursday, April 21, 2016 and post a batch, that is your POSTED date.  The batch or transactions (depending on your configuration) can have a different POSTING date.  So a batch posted on April 21 with a posting date of April 20, like the one that alarmed the consultant's customer, is completely normal.  And to make things fun, the transactions in the batch can have completely different DOCUMENT dates.

Clear as mud?  Go read Victoria's article one more time.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





Reprint Dynamics GP Posting Reports or Posting Journals

By Steve Endow

I've been meaning to write a post about reprinting posting journals as the topic comes up occasionally when I'm working with customers who use Post Master Enterprise to automatically post their Dynamics GP batches.

As with many Dynamics GP topics, someone has already written about it, and sure enough, there were several discussions and posts on reprinting posting reports / posting journals.

This article on iDriveLLC.com was one of the first that I found, and it appears to do a good job of covering the process for the different modules, so go check out their well-written article.

http://idrivellc.com/?p=116



Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Monday, April 18, 2016

Just One Example Why Dynamics GP System Integrations Are Not Always Easy

By Steve Endow

I recently developed a custom Dynamics GP eConnect system integration for a customer.  The customer did not want users to have to run the integration manually, so it needed to be fully automated, and they also wanted to ensure that users could not view, access, or manipulate any of the source data being imported into Dynamics GP.

Not a problem--it is fairly easy to address all of these requirements.  So I developed the integration and asked the third party vendor providing the source data for some sample data files.  I then waited.  Two months later, they produced a sample data file.  It had several issues, like a missing field, blank values, and some incorrect data in the fields, so the customer asked that they fix those issues.  Great.

But then I reviewed the CSV file.  It looked like this.
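Roughly speaking, and with made-up field names, the file was structured like this:

OrderNumber,CustomerID,ItemNumber,Quantity,UnitPrice
--------,--------,--------,--------,--------
1001,CONTOSO0001,ITEM123,5,10.00
1002,FABRIKAM0001,ITEM456,2,24.50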






So, what do you notice about that sample data file?  Yes, for some reason it has a bunch of dashes in the second row.  They were nice enough to comma delimit the dashes, but they are dashes nonetheless.

When you create a data import or system integration, one of the most important tasks is to properly design your source data, whether it will be sent in files or stored in a database.  It is the lifeblood of any system integration.  Normally it's relatively straightforward, but occasionally it gets complicated or messy.  Or in this case, just odd.

I asked the third party to remove the dashes, since we obviously have no need for them.

Their response:  We can't.

They explained that the only way to remove the dashes is to remove the column headers as well.

Yes, you read that right.  In the year 2016, a fancy "cloud-based" system is unable to produce a simple CSV export file with column headers and data, but without a bunch of useless dashes.

So you might say, "No big deal, just remove the dashes in your import or use a file without headers".

And that is exactly what needs to be done.  But remember that this integration needs to be fully automated and the customer does not want users to have access to the source data files.

So by introducing this seemingly small limitation, the third party has just increased the complexity of this integration to the point where it would be beyond the knowledge and skill of many end users.

If a customer were to create this integration on their own, does the customer know how to filter a data source?  Do they understand ordinal references?  Do they know how to map field names to unlabeled columns?  If they use ordinal references, do they understand the increased maintenance that approach requires, such as when they need to add a new field?  This is a trivial example, and it isn't hard to work around this issue, particularly with experience or some assistance, but it does require some skill that may not be obvious to a non-technical user.
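For what it's worth, here is a sketch of the kind of filtering involved: read the file, skip the dash row, and reference columns by ordinal.  The field names and positions below are hypothetical:

using System;
using System.IO;
using System.Linq;

class SourceFileReader
{
    // Read the vendor's CSV, skip the header row and the row of dashes,
    // and reference the remaining columns by ordinal position.
    public static void ReadOrders(string filePath)
    {
        var dataRows = File.ReadAllLines(filePath)
            .Skip(1)                                                         // skip the column header row
            .Where(line => line.Trim().Length > 0)                           // ignore blank lines
            .Where(line => line.Replace(",", "").Trim().Any(c => c != '-')); // drop the all-dash row

        foreach (var row in dataRows)
        {
            var fields = row.Split(',');
            string orderNumber = fields[0];                // ordinal 0: order number
            string itemNumber = fields[2];                 // ordinal 2: item number
            decimal quantity = decimal.Parse(fields[3]);   // ordinal 3: quantity

            Console.WriteLine("{0} {1} {2}", orderNumber, itemNumber, quantity);
        }
    }
}

Not rocket science, but each of these little decisions is one more thing that has to be understood, documented, and maintained.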

In this case, I'm developing the integration for the customer, so I will handle this annoying issue with the third party data file, but it's just one of a hundred seemingly trivial details that collectively can make System Integrations technically challenging.  I see these types of issues every day, and some can cause production integrations to stop working.

If you need a simple Data Import, such as GL accounts, customers, or vendors, I would encourage you to try a Dynamics GP import using one of the common tools.  But just be aware that when you move into transaction imports and true System Integrations, you should expect to encounter several complications that collectively may require some skill and experience to address.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Monday, April 11, 2016

How do you comment your code?

By Steve Endow

It's one of those overused sayings, like "eat healthy" or "brush and floss regularly", but it's for developers:

"Comment your code."

It even applies to consultants who only occasionally dabble in SQL.  "Comment your code."

Okay, so we've all heard it before, but what does it mean to "comment your code"?

My philosophy around code comments is that they should try to answer the following questions:

1. When
2. By Whom
3. For Whom
4. What
5. Why

For example:

//4/11/2016: S. Endow:  Per Sally Smith, modify the freight calculation routine to include evaluation of dim weight to determine best shipping vendor to reduce shipping fee. Prior calculation didn't consider dim weight, resulting in oversize package fees for new Titan Series products when shipped through certain carriers.

When:  April 11, 2016

By Whom:  Steve Endow

For Whom:  Sally Smith

What:  Changed freight calculation routine

Why:  Old routine didn't consider new product line



So why do I try to cover these 5 points in comments?  After developing Dynamics GP integrations and customizations for the last 10+ years, I've realized that those were the questions that I needed to answer when trying to figure out why an application was behaving a certain way.

Three years after I deploy my code, I simply can't remember why specific changes were made.  After several years, I can look at the code and have no recollection of even writing it.  I used to write comments that would answer When, By Whom, and What.  Over time, I learned that For Whom was very important:  "So where did that change come from?  Who requested it?"  If I didn't comment that information, the client would think I made that change of my own accord, which is silly, but without a comment, I was unable to tell them who requested the change.

So then I started adding For Whom.  But then I would find comments that told me what I changed and for whom I made the change, yet neither of us had any recollection of why we made the change.

Why did we change the overtime calculation to exclude those pay codes?  Why does the code handle company A differently from company B?

The Why became one of the most important questions that remained unanswered in my comments.

I still have a hard time remembering to add the Why all of the time.  At the time you make the change, it seems so obvious, so it's very easy to forget to document the Why.  But I've found that "Why" is often the most crucial part of a comment when you have to decipher your code years later.

Given my criteria for code comments, after writing thousands of comment lines, I finally got sick of typing the same thing over and over again:  the date and my name.  So I installed a simple "paste" macro tool that will automatically type out some text when I press a keyboard shortcut.

So I've programmed my Win+C keyboard combination to output my basic comment start:  //4/11/2016: S. Endow:


It seems trivial, but it all adds up, and the macro makes it that much easier for me to add comments, which encourages me to actually write them.

So I then had a pretty good practice of adding meaningful, useful comments throughout my code.  But what about release notes?  In a Visual Studio solution with thousands of lines of code in numerous classes in multiple projects, where do you put your main, summary release notes?

At one point, I would put them at the top of my Form Main code.  But that didn't work for console apps.  So for console apps, I put them at the top of Program.cs.  But then what about class libraries that have neither a Main Form nor a Program.cs?  This approach starts to break down quickly.

So I did some searching to see if anyone had a best practice.  After reading a few suggestions on StackOverflow, I settled on Documentation.cs.


Every time I create a solution, I create a Documentation.cs class that just contains my release notes and related release comments.  At first this felt pretty strange.  An empty class with only comments?  Why not a text file or some other type of file?  Well, because a .cs file is easy and will always be picked up by source control.

If I have multiple projects in my solution, my Documentation.cs will be in my Library project.

After using Documentation.cs for a few months, I now love it.  It's very simple and easy and is just something I do without thinking about it.  And I know exactly where to look for release comments and functionality changes.

When I put comments in Documentation.cs, I have no reservations about being verbose.  Lay it all down with as much context and background and detail as possible, because 1, 2, or 3 years later, you'll need every clue you can get to troubleshoot or refactor.

This is an example of the comments I put in my Documentation.cs file.
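The content below is made up for illustration, but it follows the same pattern:

// Documentation.cs
//
// 4/11/2016: Version 1.2.3
//   -Per Sally Smith, added dim weight evaluation to the freight calculation (FreightCalculator.CalculateFreight)
//   -Fixed rounding of line amounts on multi-line orders (OrderImporter.ImportOrder)
//   -Added text file logging of eConnect errors
//
// 3/2/2016: Version 1.2.2
//   -Initial production release

namespace MyIntegration.Library
{
    // Intentionally empty class--this file exists only to hold release notes
    // where source control will always pick them up.
    class Documentation
    {
    }
}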


I start with the date, then list the version of that release, then list every change that I made, sometimes referencing the specific class and method that was changed.

Every time I write code, and every time I comment my code, I'm constantly thinking, "What would an experienced developer think if she saw my code?"  The code I write is just as much my deliverable as my deployment guide or user documentation.  It's the ultimate end product.  Except that very few people other than me actually look at it.

But when you get a call from a customer who suddenly gets an error with the solution that you deployed four years ago, you'll be the one who is scrutinizing that code in far more detail than any customer would.  And you'll appreciate your detailed code comments.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

http://www.precipioservices.com







Quote of the day from Mark Polino

http://mpolino.com/gp/dynamics-gp-land-econnect-error-110-updating-existing-sop-order-item-numberlocation-code-not-exist-inventory/

Friday, April 8, 2016

Convert Dynamics GP DEX_ROW_TS date value to local time using SQL

By Steve Endow

Back in 2014, the venerable Tim Wappat wrote an excellent blog post on the DEX_ROW_TS field that exists in some Dynamics GP tables.

http://www.timwappat.info/post/2014/12/04/Beware-DEX_ROW_TS-for-data-synchronisation

Occasionally, I have needed to check the DEX_ROW_TS field to see when a record was inserted or updated.  The downside is that in doing so, I am usually frustrated by the fact that the timestamp is recorded in UTC.

There are a metric ton of pitfalls related to date and time conversion, with time zones and daylight saving time being the most common, but I finally took the time to look up a basic SQL function that can be used to convert the UTC time value into the approximate local time of the SQL Server.

Please note that there are likely many potential technical issues with this approach, so you shouldn't assume the resulting local time is 100% correct for all dates and times, but I suspect in most cases, it will be sufficient for the typical situation where you are trying to research an issue in GP.  If you need perfectly accurate time stamps for reporting purposes, you'll need something more comprehensive, like a .NET based solution.

Here is the post from StackOverflow where the solution was posted, and you can read the many comments and arguments about how different approaches have various limitations.

With all of those disclaimers, here's an example of the quick and dirty function:

-- DEX_ROW_TS is stored in UTC; shift it by the server's current UTC offset
SELECT DEX_ROW_TS,
       DATEADD(mi, DATEDIFF(mi, GETUTCDATE(), GETDATE()), DEX_ROW_TS) AS LocalTime
FROM GL10000
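If you do need the more accurate .NET based approach, the conversion is straightforward with TimeZoneInfo.  Here is a minimal sketch, assuming the DEX_ROW_TS value has already been read from SQL into a DateTime:

using System;

class DexRowTsConverter
{
    // Convert a DEX_ROW_TS value (stored in UTC) to the server's local time,
    // honoring the daylight saving rules of the local time zone.
    public static DateTime ToLocalTime(DateTime dexRowTsUtc)
    {
        DateTime utc = DateTime.SpecifyKind(dexRowTsUtc, DateTimeKind.Utc);
        return TimeZoneInfo.ConvertTimeFromUtc(utc, TimeZoneInfo.Local);
    }
}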


Enjoy!

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter