Thursday, April 21, 2016

Dynamics GP "posting date" vs. "posted date" confusion

By Steve Endow

I received a call from an experienced GP consultant who said his customer was having a serious problem with their posted batches.  When the client posted their batches on April 21, one of the batches had a posting date of April 20!  How is this possible!

This is probably one of the most common questions related to GP, and the similar date names also confused me when I first started working with GP.  I was a bit surprised when the GP consultant asked me to look into this "issue", as all GP consultants should know this distinction cold.

If you aren't 100% confident and 100% clear on the difference between a Document Date, Posting Date, and Posted Date in Dynamics GP, please, please, please read this brief post by Victoria Yudin.

https://victoriayudin.com/2009/09/08/whats-with-all-these-dates/


If you open GP on Thursday, April 21, 2016 and post a batch, that is your POSTED date.  The batch or transactions (depending on your configuration) can have a different POSTING date.  And to make things fun, the transactions in the batch can have completely different DOCUMENT dates.
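To make the distinction concrete, here is a minimal sketch (in Python, with hypothetical field names, not actual GP table columns) of how the three dates can legitimately differ on a single batch:

```python
from datetime import date

# Hypothetical illustration: one batch posted in GP on April 21, 2016.
# The field names here are illustrative, not real GP column names.
batch = {
    "posted_date": date(2016, 4, 21),   # when the user actually ran the posting
    "posting_date": date(2016, 4, 20),  # the GL date assigned to the batch
    "transactions": [
        {"doc_number": "INV001", "document_date": date(2016, 4, 15)},
        {"doc_number": "INV002", "document_date": date(2016, 4, 18)},
    ],
}

# The three dates are independent: posting on the 21st with a posting
# date of the 20th is normal behavior, not a bug.
assert batch["posted_date"] != batch["posting_date"]
for txn in batch["transactions"]:
    print(txn["doc_number"], txn["document_date"],
          "hits the GL as of", batch["posting_date"])
```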

Clear as mud?  Go read Victoria's article one more time.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





Reprint Dynamics GP Posting Reports or Posting Journals

By Steve Endow

I've been meaning to write a post about reprinting posting journals as the topic comes up occasionally when I'm working with customers who use Post Master Enterprise to automatically post their Dynamics GP batches.

As with many Dynamics GP topics, someone has already written about it, and sure enough, there were several discussions and posts on reprinting posting reports / posting journals.

This article on iDriveLLC.com was one of the first articles that I found, and it appears to do a good job of covering the process for the different modules, so go check out their well written article.

http://idrivellc.com/?p=116







Monday, April 18, 2016

Just One Example Why Dynamics GP System Integrations Are Not Always Easy

By Steve Endow

I recently developed a custom Dynamics GP eConnect system integration for a customer.  The customer did not want users to have to run the integration manually, so it needed to be fully automated, and they also wanted to ensure that users could not view, access, or manipulate any of the source data being imported into Dynamics GP.

Not a problem: it is fairly easy to address all of these requirements.  So I developed the integration and asked the third-party vendor providing the source data for some sample data files. I then waited for the sample data files.  Two months later, they produced a sample data file.  It had several issues, like a missing field, blank values, and some incorrect data in the fields.  The customer asked that they fix those issues.  Great.

But then I reviewed the CSV file.  It looked like this.






So, what do you notice about that sample data file?  Yes, for some reason it has a bunch of dashes in the second row.  They were nice enough to comma-delimit the dashes, but they are dashes nonetheless.

When you create a data import or system integration, one of the most important tasks is to properly design your source data, whether it will be sent in files or stored in a database.  It is the lifeblood of any system integration.  Normally it's relatively straightforward, but occasionally it gets complicated or messy.  Or in this case, just odd.

I asked the third party to remove the dashes, since we obviously have no need for them.

Their response:  We can't.

They explained that the only way to remove the dashes is to remove the column headers as well.

Yes, you read that right.  In the year 2016, a fancy "cloud based" system is unable to produce a simple CSV export file with column headers and data, but without a bunch of useless dashes.

So you might say, "No big deal, just remove the dashes in your import or use a file without headers".

And that is exactly what needs to be done.  But remember that this integration needs to be fully automated and the customer does not want users to have access to the source data files.

So by introducing this seemingly small requirement, the third party has just increased the complexity of this integration to the point where it would be beyond the knowledge and skill of many end users.

If a customer were to create this integration on their own, does the customer know how to filter a data source?  Do they understand ordinal references?  Do they know to map field names to unlabeled columns?  If they use ordinal references, do they understand the increased maintenance that approach requires, such as when they need to add a new field?  This is a trivial example, and it isn't hard to work around this issue, particularly with experience or some assistance, but it does require some skill that may not be obvious to a non-technical user.
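As a rough sketch of the workaround (Python here, with a made-up file layout, since the actual vendor file can't be shown): filter out any row that consists entirely of dashes before handing the rows to the import.

```python
import csv
import io

# Hypothetical sample mimicking the vendor file: headers, a row of
# comma-delimited dashes, then data.  The column layout is made up.
raw = """VendorID,InvoiceNumber,Amount
------,------,------
V1001,INV-500,125.00
V1002,INV-501,310.50
"""

def is_dash_row(row):
    """True if every non-empty cell in the row contains only dashes."""
    return all(cell.strip("-") == "" for cell in row if cell.strip())

reader = csv.reader(io.StringIO(raw))
rows = [row for row in reader if not is_dash_row(row)]

header, data = rows[0], rows[1:]
print(header)  # column names survive
print(data)    # the dash row is gone
```

This keeps the column headers (so fields can be mapped by name rather than by ordinal position) while discarding the useless separator row.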

In this case, I'm developing the integration for the customer, so I will handle this annoying issue with the third party data file, but it's just one of a hundred seemingly trivial details that collectively can make System Integrations technically challenging.  I see these types of issues every day, and some can cause production integrations to stop working.

If you need a simple Data Import, such as GL accounts, customers, or vendors, I would encourage customers to try a Dynamics GP import using one of the common tools.  But just be aware that when you move into transaction imports and true System Integrations, you should expect to encounter several complications that collectively may require some skill and experience to address.




Monday, April 11, 2016

How do you comment your code?

By Steve Endow

It's one of those overused sayings, like "eat healthy" or "brush and floss regularly", but it's for developers:

"Comment your code."

It even applies to consultants who only occasionally dabble in SQL.  "Comment your code."

Okay, so we've all heard it before, but what does it mean to "comment your code"?

My philosophy around code comments is that they should try to answer the following questions:

1. When
2. By Whom
3. For Whom
4. What
5. Why

For example:

//4/11/2016: S. Endow:  Per Sally Smith, modify the freight calculation routine to include evaluation of dim weight to determine best shipping vendor to reduce shipping fee. Prior calculation didn't consider dim weight, resulting in oversize package fees for new Titan Series products when shipped through certain carriers.

When:  April 11, 2016

By Whom:  Steve Endow

For Whom:  Sally Smith

What:  Changed freight calculation routine

Why:  Old routine didn't consider new product line



So why do I try to cover these 5 points in comments?  After developing Dynamics GP integrations and customizations for the last 10+ years, I've realized that those were the questions that I needed to answer when trying to figure out why an application was behaving a certain way.

Three years after I deploy my code, I simply can't remember why specific changes were made.  After several years, I can look at the code and have no recollection of even writing it.  I used to write comments that would answer When, By Whom, and What.  Over time, I learned that For Whom was very important:  "So where did that change come from? Who requested it?"  If I didn't comment that information, the client would think I made that change of my own accord, which is silly, but without a comment, I was unable to tell them who requested the change.

So then I started adding For Whom.  But then I would find comments that told me what I changed and for whom I made the change, yet I had no recollection of why we made the change.

Why did we change the overtime calculation to exclude those pay codes?  Why is the code handling company A different from company B?

The Why became one of the most important questions that remained unanswered in my comments.

I still have a hard time remembering to add the Why all of the time.  At the time you make the change, it seems so obvious, so it's very easy to forget to document the Why.  But I've found that "Why" is often the most crucial part of a comment when you have to decipher your code years later.

Given my criteria for code comments, after writing thousands of comment lines, I finally got sick of typing the same thing over and over again:  the date and my name.  So I installed a simple "paste" macro tool that will automatically type out some text when I press a keyboard shortcut.

So I've programmed my Win+C keyboard combination to output my basic comment start:  //4/11/2016: S. Endow:


It seems trivial, but it all adds up, and by using the macro, it makes it that much easier for me to add comments, encouraging me to do so.
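Any small scripting language can produce the same prefix; here is a hedged Python equivalent of that paste macro (the initials and format are obviously assumptions you would change for yourself):

```python
from datetime import date

def comment_prefix(initials="S. Endow", today=None):
    """Build a comment prefix like: //4/11/2016: S. Endow: """
    d = today or date.today()
    # Build month/day without leading zeros manually, since the
    # strftime flags for that are platform-dependent.
    return f"//{d.month}/{d.day}/{d.year}: {initials}: "

print(comment_prefix(today=date(2016, 4, 11)))  # //4/11/2016: S. Endow:
```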

So I then had a pretty good practice of adding meaningful, useful, and valuable comments throughout my code.  But what about release notes?  In a Visual Studio solution with thousands of lines of code in numerous classes in multiple projects, where do you put your main, summary, release notes?

At one point, I would put them at the top of my Form Main code.  But that didn't work for console apps.  So for console apps, I put them at the top of Program.cs.  But then what about class libraries that have neither a Main Form nor a Program.cs?  This approach starts to break down quickly.

So I did some searching to see if anyone had a best practice.  After reading a few suggestions on StackOverflow, I settled on Documentation.cs.


Every time I create a solution, I create a Documentation.cs class that just contains my release notes and related release comments.  At first this felt pretty strange.  An empty class with only comments?  Why not a text file or some other type of file?  Well, because a .cs file is easy and will always be picked up by source control.

If I have multiple projects in my solution, my Documentation.cs will be in my Library project.

After using Documentation.cs for a few months, I now love it.  It's very simple and easy and is just something I do without thinking about it.  And I know exactly where to look for release comments and functionality changes.

When I put comments in Documentation.cs, I have no reservations about being verbose.  Lay it all down with as much context and background and detail as possible, because 1, 2, or 3 years later, you'll need every clue you can get to troubleshoot or refactor.

This is an example of the comments I put in my Documentation.cs file.


I start with the date, then list the version of that release, then list every change that I made, sometimes referencing the specific class and method that was changed.

Every time I write code, and every time I comment my code, I'm constantly thinking, "What would an experienced developer think if she saw my code?"  The code I write is just as much my deliverable as my deployment guide or user documentation.  It's the ultimate end product.  Except that very few people other than me actually look at it.

But when you get a call from a customer who suddenly gets an error with the solution that you deployed four years ago, you'll be the one who is scrutinizing that code in far more detail than any customer would.  And you'll appreciate your detailed code comments.


http://www.precipioservices.com







Quote of the day from Mark Polino

http://mpolino.com/gp/dynamics-gp-land-econnect-error-110-updating-existing-sop-order-item-numberlocation-code-not-exist-inventory/

Sunday, April 10, 2016

eConnect Error 110 when Updating Existing SOP Order: Item number/location code does not exist in inventory

By Steve Endow

This is a pretty obscure eConnect issue that had me puzzled for several minutes.

I'm working on an upgrade of a client's existing eConnect integration.  One component of the integration involves updating existing SOP Orders in Dynamics GP.

When the integration code submits the order update to eConnect on my development server, I get this error:

Error Number = 110  Stored Procedure= taSopLineIvcInsert  Error Description = Item number/location code does not exist in inventory
Node Identifier Parameters: taSopLineIvcInsert
SOPNUMBE = TEST001
SOPTYPE = 2
LNITMSEQ = 16384
Related Error Code Parameters for Node : taSopLineIvcInsert
LOCNCODE = 
NONINVEN = 0
ITEMNMBR = E100


The item is set up properly in GP and it is assigned to the correct inventory site, as it is on the existing order.

For reference, here is the relevant XML that is being submitted to eConnect.

<?xml version="1.0" encoding="utf-8"?>
<eConnect>
  <SOPTransactionType>
    <taSopLineIvcInsert_Items>
      <taSopLineIvcInsert>
        <SOPTYPE>2</SOPTYPE>
        <SOPNUMBE>TEST001</SOPNUMBE>
        <CUSTNMBR>108404</CUSTNMBR>
        <DOCDATE>2016-04-10</DOCDATE>
        <LOCNCODE />
        <ITEMNMBR>E100</ITEMNMBR>
        <AutoAssignBin>1</AutoAssignBin>
        <UNITPRCE>0</UNITPRCE>
        <XTNDPRCE>0</XTNDPRCE>
        <QUANTITY>0</QUANTITY>
        <FUFILDAT>2016-03-24</FUFILDAT>
        <ACTLSHIP>2016-03-24</ACTLSHIP>
        <QTYCANCE>0</QTYCANCE>
        <QTYFULFI>1</QTYFULFI>
        <ALLOCATE>0</ALLOCATE>
        <UpdateIfExists>1</UpdateIfExists>
        <QtyShrtOpt>2</QtyShrtOpt>
      </taSopLineIvcInsert>
    </taSopLineIvcInsert_Items>
  </SOPTransactionType>
</eConnect>


Notice that the LOCNCODE value is blank.  Since the location code (site ID) isn't being updated or changed, it isn't being sent into the order update.  I would assume that eConnect would just update the item and figure out the existing site ID for the item.

But that doesn't seem to be the case.

After checking my order, my item setup, my site setup, and everything else I could imagine, I assumed that there was some default Site ID that was missing in my dev environment.

After searching for a while, I found that in my development environment, where I'm testing with the Fabrikam company, there is a SOP default site of WAREHOUSE.


When I changed that default Site ID to match the site ID of the item on the order being updated, the error went away.

I realized that my test inventory item was only set up for one test site, and was not set up with the WAREHOUSE site ID.  I added the item at my test site and that resolved the issue as well.

So, it seems that when updating an order, if the LOCNCODE value is not sent to eConnect, the item must be assigned to the default site under SOP Setup.
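One way to sidestep the error entirely is to always populate the line's site ID on updates instead of sending a blank LOCNCODE.  The actual integration is .NET, so this is just an illustrative Python sketch of building a minimal taSopLineIvcInsert node with the site ID filled in (the helper name is made up):

```python
import xml.etree.ElementTree as ET

def build_sop_line_update(sop_number, item_number, location_code):
    """Build a minimal taSopLineIvcInsert node for a SOP order update.
    Explicitly populating LOCNCODE avoids relying on the SOP default site."""
    line = ET.Element("taSopLineIvcInsert")
    for tag, value in [
        ("SOPTYPE", "2"),
        ("SOPNUMBE", sop_number),
        ("ITEMNMBR", item_number),
        ("LOCNCODE", location_code),  # send the site ID, don't leave it blank
        ("UpdateIfExists", "1"),
    ]:
        ET.SubElement(line, tag).text = value
    return line

xml_text = ET.tostring(
    build_sop_line_update("TEST001", "E100", "WAREHOUSE"), encoding="unicode")
print(xml_text)
```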




Friday, April 8, 2016

Convert Dynamics GP DEX_ROW_TS date value to local time using SQL

By Steve Endow

Back in 2014, the venerable Tim Wappat wrote an excellent blog post on the DEX_ROW_TS field that exists in some Dynamics GP tables.

http://www.timwappat.info/post/2014/12/04/Beware-DEX_ROW_TS-for-data-synchronisation

Occasionally, I have needed to check the TS field to see when a record was inserted or updated.  The downside is that, in doing so, I am usually frustrated by the fact that the timestamp is recorded in UTC.

There are a metric ton of pitfalls related to date and time conversion, with time zones and daylight saving time being the most common, but I finally took the time to look up a basic SQL function that can be used to convert the UTC time value into the approximate local time of the SQL Server.

Please note that there are likely many potential technical issues with this approach, so you shouldn't assume the resulting local time is 100% correct for all dates and times, but I suspect in most cases, it will be sufficient for the typical situation where you are trying to research an issue in GP.  If you need perfectly accurate time stamps for reporting purposes, you'll need something more comprehensive, like a .NET based solution.

Here is the post from StackOverflow where the solution was posted, and you can read the many comments and arguments about how different approaches have various limitations.

With all of those disclaimers, here's an example of the quick and dirty function:

SELECT DEX_ROW_TS, DATEADD(mi, DATEDIFF(mi, GETUTCDATE(), GETDATE()), DEX_ROW_TS) AS LocalTime
FROM GL10000
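For comparison, a timezone-library approach (the "more comprehensive" route mentioned above) applies the correct offset for each historical date, which the single GETUTCDATE()/GETDATE() offset in the SQL cannot do across daylight saving boundaries.  A Python sketch, assuming a US Pacific server zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def utc_to_local(dex_row_ts, tz_name="America/Los_Angeles"):
    """Convert a naive UTC timestamp (like DEX_ROW_TS) to local time,
    applying the correct UTC offset for that specific date."""
    return dex_row_ts.replace(tzinfo=timezone.utc).astimezone(ZoneInfo(tz_name))

# A winter and a summer timestamp get different offsets (PST vs. PDT).
winter = utc_to_local(datetime(2016, 1, 15, 12, 0))
summer = utc_to_local(datetime(2016, 7, 15, 12, 0))
print(winter.hour, summer.hour)  # 4 5
```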


Enjoy!
