Wednesday, January 17, 2018

Dynamics GP Integrations: Eliminate the need for perfection

By Steve Endow

I had a call with a customer this morning to review an error they were seeing with their AP Invoice integration.  The custom integration is moderately complex, importing invoices from 90 retail locations into 30 different Dynamics GP companies, with Intercompany distributions.

There's an interesting twist to this integration.  The invoice data contains a retail store number, but it doesn't tell the import which GP company the invoice belongs to.  And there is nothing in the GP companies indicating which retail stores belong to which database.  Some retail stores have their own dedicated GP company database, while others share a GP company database.  The users just know which retail store belongs to which GP company.

So how does the integration figure out which company an invoice belongs to?

We could have created a mapping table, listing each retail store ID number and the corresponding GP company database.  But the problem with mapping tables is that they have to be maintained.  When a new retail store is opened, or a new GP database is created, users will invariably forget to update the custom mapping table.

So for this integration, I tried something new.  The one clue in the invoice data file is a GL account number.  The first segment of the GL account is a two-digit number that uniquely identifies the Dynamics GP company.  Like this:

01 = Company A
03 = Company B
06 = Company C
25, 31, 49 = Company D

So, the integration reads the GL expense account assigned to the invoice, and uses that to determine which company the invoice belongs to.

When the integration launches, it queries all of the GP databases to determine which Segment 1 values are used in each database.

DECLARE @INTERID varchar(10)
DECLARE @SQL varchar(MAX) = ''

-- Build one UNIONed SELECT per company database registered in DYNAMICS..SY01500
DECLARE company_cursor CURSOR FOR SELECT INTERID FROM DYNAMICS..SY01500
OPEN company_cursor
FETCH NEXT FROM company_cursor INTO @INTERID
WHILE @@FETCH_STATUS = 0
BEGIN
       IF @SQL <> '' BEGIN SET @SQL += ' UNION '; END
       SET @SQL += ' SELECT ''' + @INTERID + ''' AS INTERID, (SELECT COUNT(DISTINCT ACTNUMBR_1) FROM ' + @INTERID + '..GL00100) AS Segment1Values, (SELECT TOP 1 ACTNUMBR_1 FROM ' + @INTERID + '..GL00100) AS CompanyID';
       FETCH NEXT FROM company_cursor INTO @INTERID
END
CLOSE company_cursor
DEALLOCATE company_cursor

EXEC (@SQL)



It is then able to use this "mapping" to match invoices to databases based on the GL expense account.
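
Once the integration has the Segment 1 values for each database, matching an invoice to a company is a simple lookup.  Here is a rough sketch in Python (the names and data shapes are illustrative--the actual integration is a .NET application):

```python
def build_company_map(rows):
    """Build a {segment1: company_db} map from (company_db, segment1)
    pairs, i.e. the results of the query above."""
    company_map = {}
    for company_db, segment1 in rows:
        company_map[segment1] = company_db
    return company_map

def resolve_company(company_map, gl_account):
    """Determine the GP company database for an invoice based on the
    first segment of its GL expense account, e.g. '06-5555-00' -> '06'."""
    segment1 = gl_account.split("-")[0]
    return company_map.get(segment1)  # None if the segment is unmapped

# Example: '06' belongs to Company C's database
company_map = build_company_map([("CompanyA", "01"), ("CompanyC", "06")])
print(resolve_company(company_map, "06-5555-00"))  # CompanyC
```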

But, this scheme is based on the critical assumption that in Company A, every single GL account will always have a first segment value of 01.  And Company B will always have a segment 1 value of 03.  Or Segment 1 value of 25, 31, and 49 will only ever exist in Company D.  For every account.  No exceptions.

I'll let you guess what happens next.

A user enters a "06" account in Company A.  And another user enters a "01" account in Company B.

Despite the customer's insistence that this would never happen, and that they always make sure that only one unique Segment 1 value is used in each company, someone ends up entering a Segment 1 value in the wrong company.

Am I surprised by this?  Not at all.  Whenever the word "never" is used during integration design discussions, that's always a clue.  I substitute it with "usually" or "mostly".  There are almost always exceptions, whether intentional or unintentional.

So now what?  If the program can't ensure that the Segment 1 values are unique to each company, what can it do?

Well, the second layer is that during the import process, the integration checks the destination company database to verify that the GL account exists.  If it queries Company A for a 06 GL account and doesn't find it, it logs an error and that invoice isn't imported.  This is the error that was logged this morning.

But then what?  The customer insists, again, that they only use the 06 accounts in Company C, so the import must be wrong.  So we run the above query again and find that someone accidentally entered a 06 account in Company A, which confused the import.  And the customer is shocked that such a mistake could happen.  For the third time.

But I'm not satisfied with this process.  Because 6 months from now, it's going to happen again.  And they'll blame the integration again.  And we'll have to manually run the query again and find which account was added to the wrong company.

So let's just assume that this mistake is going to continue to happen and deal with it.  I'm thinking that I need to modify the integration to have it review the results of the query above.  If it finds that 06 is present in more than one GP database, it needs to log an error and let the user know.

"Hey, I found account 06-5555-00 in Company A. That doesn't look right. Please look into it."

This will proactively identify that an issue exists, identify the specific account, identify the company, and give the user enough information to research and resolve the problem.
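
A minimal sketch of that duplicate check (again in Python for illustration; the real integration is .NET): gather the (company, segment) pairs returned by the query above and flag any Segment 1 value that shows up in more than one database.

```python
def find_duplicate_segments(rows):
    """Given (company_db, segment1) pairs from all company databases,
    return {segment1: [company_dbs]} for any segment found in more
    than one database--each entry is a mistake someone should review."""
    seen = {}
    for company_db, segment1 in rows:
        seen.setdefault(segment1, []).append(company_db)
    return {seg: dbs for seg, dbs in seen.items() if len(dbs) > 1}

# '06' should only exist in Company C, but someone added it to Company A
rows = [("CompanyA", "01"), ("CompanyA", "06"), ("CompanyC", "06")]
for segment, dbs in find_duplicate_segments(rows).items():
    print(f"Segment {segment} found in multiple databases: {dbs}")
```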

It assumes the mistake will happen. It eliminates the need for perfection in a complex process in a moderately complex environment, where employees have 90 other things on their minds.  And it should only take a few lines of code--a one-time investment that will save time for years into the future.

So why not do this for other possible exceptions and issues?  If you can identify other potential mistakes or errors, why not just code for all of them?  Because there are endless possible exceptions, and it would cost a fortune to handle all of them, most of which will never occur.

I usually release an initial version of an integration, identify the exceptions that actually occur, and handle them quickly.  When a new exception comes up, I handle it.  It's usually a surprisingly small number--maybe 3 to 5 different data issues that cause problems.

So that's my philosophy: Eliminate the need for perfection in integrations whenever possible or practical.

Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+

Friday, December 29, 2017

Implementing an Inbox Zero workflow using Outlook on Windows and iPhone

By Steve Endow

Uncle.  I give up.  I have lost the fight. 

Email has won.  I am defeated.

What was once a great tool for communication has become an overbearing hassle that has destroyed my productivity.

I receive around 50 to 75 emails every weekday.  On a very bad day, I'll hit 100 emails.  I've determined that 100 inbound emails a day is completely unmanageable for me.  With my current processes (or lack thereof), I cannot possibly be productive with that many emails coming at me.  The number of responses and tasks from 100 emails prevents me from doing any other work.

If all I did was "manage" my email all day, and do nothing else, I could probably wrangle my Inbox, but I wouldn't get any "real work" done.  When I focus on doing real work and ignore my email for a day, my Inbox explodes.

It isn't just the emails themselves.  It's also that many of the emails have some type of commitment attached to them.

"Hey Steve, please review this thread of 30 cryptic replies below and let me know what you think."

"Here's the 15 page document I created, please proofread it."

"When can you schedule a call?"

"We are getting an error.  What is causing this?"

"Here are links to a forum post and KB article. Does this explain the error I'm getting?"

"How many hours will it take you to do X?"

"I sent you an email earlier.  Did you get my email?  Can you reply to my email?"

People seem to expect a relatively prompt reply to their emails--because they think their request is most important, naturally, and because I don't have any other work to do, right?

This week, a link to this article appeared in my Twitter feed:

One-Touch to Inbox Zero
By Tiago Forte of Forte Labs

I had heard of Inbox Zero before, but I had dismissed it as a bit of a gimmick without fully understanding it.

This time, I actually read the article by Tiago Forte and his explanation finally clicked for me.  His examples and analogies made sense, and his emphasis on email as the first step of a more comprehensive communication and productivity workflow helped me build a new interpretation of Inbox Zero.

Thursday, December 21, 2017

Accepting help from experts and offering help as an expert

By Steve Endow

I've recently had two situations where someone asked for help with Dynamics GP, and when I provided guidance, the requester indicated that my suggestions were not relevant.  Without considering my suggestions or trying them, the requester immediately ruled them out.

They were simple suggestions, such as "please try making this change and perform the process again to see if that resolves the error", or "have you traced your source data to verify that it isn't the cause of the incorrect transaction that was imported?".

"That can't be the cause." was one response.

"My custom stored procedure that imports data into GP verifies everything, so I know it worked properly." was another response.

Another common response I receive when troubleshooting issues is, "We've already checked that and it's not the cause of the problem."

I don't consider myself an "expert" at anything, but there are some topics where I've done enough work to have a certain level of knowledge, intuition, and skill that I'm generally able to narrow down the causes of problems, and I typically know some good places to start looking.  I've had enough success solving problems in certain areas that my approach generally seems to work.

When someone asks for help and then immediately dismisses my initial recommendations without even trying them, how can I help them?  Maybe they don't know who I am or what experience I have, and they're skeptical of my suggestions.  What can I do then?

Do I gently explain that I've worked with over 400 customers in this specific domain, and that my anecdotal statistics would not support the assertion that their integration is infallible or that Dynamics GP is at fault?  Is it my job to convince them that I tend to have a fairly good grasp of the subject matter and that they should reconsider my suggestion?  Is there any point in arguing with someone who has asked for help, but isn't accepting my help?

"Experts" don't know everything and can't always immediately pinpoint causes or solutions.  But if they ask questions, ask for more information, or ask you to test something, isn't it in your best interest to at least try working with them?  If you're not willing to work with an expert, what are your alternatives?

Instead of immediately ruling out suggestions, welcome them as opportunities to learn. Collect new data. Make new assessments. Understand what they are thinking.

Be inquisitive and curious and humble. Don't be defensive or righteous. This applies to the person asking for help, as well as the expert being asked.


Wednesday, December 20, 2017

Building a Dynamics GP test environment on a B-series Azure Virtual Machine: Not so fast!

By Steve Endow

With the recent release of Dynamics GP 2018, I wanted to set up a new virtual machine that I could use for testing and development.

I currently run my own Hyper-V server, which serves up 20 different virtual machines, and has been very low cost and extremely fast.  I would be happy to outsource my VMs to the "cloud", but having looked into the cost several times over the last few years, it just isn't economical for me.  I previously estimated it would cost me over $300 a month to host just a few VMs.  That cost, on top of having to severely limit the number of VMs I could run, just didn't make sense for hosting my internal development VMs.

But recently fellow MVP Beat Bucher told me about a new Azure VM that was lower cost:  the B-Series "burstable" VMs.

Beat explained that he was able to run two of the B4ms machines continuously for a cost of roughly $150 per month.  I was intrigued.

After reviewing the different sizes, I set up a new B2ms virtual machine on Azure, running Windows Server.  The provisioning process was simple and fast, and I had a VM a few minutes later.

I then downloaded and installed SQL Server and SQL Management Studio.  There were a few subtle hints that something wasn't quite right, but at the time the machine seemed great.

I then downloaded the 1.6 GB Dynamics GP 2018 DVD as a zip file.  As with the SQL Server download, I noticed that Chrome didn't show the download status for the GP 2018 zip file.  When I opened Windows File Explorer, nothing showed up in the download directory during the download, or even after the download appeared to complete.  It took quite a while for Windows File Explorer to show the downloaded file.

I noticed Windows File Explorer seemed unresponsive as well.  It just didn't feel right, but I hadn't yet pieced together the clues.

I then tried to unzip the GP 2018 file.  That's when it was clear something was wrong.

This status window appeared, showing that it would take over 30 minutes to extract the 1.6 GB zip file.  What??  1.36 MB/s?

I then did dozens of other tests, simply copying large (1GB+) files on the C: drive and between the C: and D: temporary drive.  The performance was abysmal.

After several tests, I noticed that on average, the file copies were clearly being throttled around 21-22MB/s.

What in the world was going on?

The B-Series VMs are supposed to have "Premium SSD" storage, and 21MB/s is definitely not SSD performance.

I submitted an Azure support case and after several days, received a response.  The support rep admitted that because the B-Series VMs were relatively new, he didn't have much experience with them and would need me to do some tests to narrow down the cause.  No problem.

He first had me "redeploy" the Azure VM, which apparently pushes the VM to a new "node" or physical host machine.  I completed that process and tested again, but got the same results: file copies were still painfully slow.

He then had me install the Performance Insights plugin on the VM, which apparently runs some automated performance tests and automatically submits the results to the support case (a very cool feature).  I completed that process and a few days later, he emailed me with an explanation for the slow disk performance I was seeing.

This is the critical information that I overlooked when selecting the B-Series VM:

Notice that the B2ms size has a maximum disk speed of 22.5 MB/s.  That is the maximum.

The B4ms offers 35MB/s and the B8ms tops out at 50MB/s.  50 sounds a lot better than 22.5, but even 50MB/s is horrifically slow compared to any competent modern storage.

Even if you add an additional high performance Premium SSD, such as a 1023GB drive with 5,000 IOPS and 200MB/s throughput (which is VERY expensive), if it is attached to a B2ms VM, you will still be limited to 22.5 MB/s.
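
Some back-of-the-envelope arithmetic (an illustrative calculation, not an Azure benchmark) shows what those caps mean for a file the size of the 1.6 GB GP 2018 zip:

```python
def copy_seconds(size_mb, throughput_mb_s):
    """Best-case seconds to move a file at a sustained throughput cap."""
    return size_mb / throughput_mb_s

size_mb = 1.6 * 1024  # the GP 2018 zip file, in MB
for cap in (22.5, 35.0, 50.0, 200.0):
    print(f"{cap:5.1f} MB/s -> {copy_seconds(size_mb, cap):5.0f} seconds")
```

At the B2ms cap, a single copy of that file takes over a minute in the best case, and real workloads like unzipping (read plus write) or SQL Server I/O fare far worse.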

For comparison, my local Hyper-V server can copy files at 100MB/s from my NAS, and the limiting factor is the gigabit network connection between the NAS and the server, not my NAS or the SSDs in my server.

Local file copies on the SSDs on my Hyper-V server can be as high as 1GB/s!! It's so fast that I had a very hard time getting a screen shot while copying the 1.6GB Dynamics GP 2018 zip file.

If you are used to even half-decent disk performance on a server, can you live with 22.5 or 35 MB/s on an Azure B-Series VM?

And am I willing to spend an extra hour or two setting up an Azure B-Series VM, due to its brutally slow disk IO, for a Dynamics GP 2018 test environment?  Am I confident that once I set it up and don't have to do many large file copies, that the disk performance will be sufficient for my needs?

Can SQL Server actually run well enough on a disk throttled at 22.5MB/s?  Now that I see the disk specs, I am pretty sure that the B-Series was never intended to ever run SQL Server.

And I'm not willing to waste my time to find out.  Those disk speeds are so slow that I am not confident that the B-Series VM will meet my needs even for a test + development server.  Even if I used the B4ms, that's roughly $75 a month for a potentially painfully slow VM.

So, I have ruled out the B-Series Azure VMs for now, and would have to look at the "standard" VMs, which would likely still cost $150-$300 per month for 1-2 non-production VMs.

Since I have a very fast Hyper-V server in my office that can easily host 20 VMs with a marginal cost of $0 per month per VM, it seems that I will be sticking with an on premises server for at least a few more years.


Friday, November 10, 2017

Free Precipio SFTP file transfer and data export tool - New Version 1.41 released

By Steve Endow

I have released a new version of my free SFTP file transfer and data export tool for Dynamics GP.

The new version 1.41 can be downloaded from my web site:


Version 1.41 includes the following enhancements:

  • Add support for optional SQLTimeout setting in config file to increase SQL command timeout
  • Set default SQLTimeout to 60 seconds if setting is not present in config file
  • Increase SFTP Connection Timeout from 5 seconds to 30 seconds, and Idle Timeout from 10 seconds to 30 seconds

The SQL Timeout setting allows for longer running queries, or queries that result in larger export files. 

The SFTP Connection Timeout was increased to accommodate some SFTP servers that might not complete the connection process in 5 seconds.

If you use the SFTP application, please let me know! I'd love to hear how you are using it and if it is working well for you.


Wednesday, November 1, 2017

Beware of UTC time zone on dates when importing data into Dynamics GP!

By Steve Endow

Prior to this year, I rarely had to deal with time zones when developing integrations for Dynamics GP.

The customer was typically using GP in a US time zone, the SQL Server was on premise in that time zone, and all of their data usually related to that same time zone.  Nice and simple.

Dynamics GP then introduced the DEX_ROW_TS field to several tables, and I would regularly forget that field used a UTC timestamp.  That was relatively minor and easy to work around.

But with the increasing popularity of Software As A Service (SaaS) platforms, I'm seeing more and more data that includes UTC timestamps.  I didn't think too much about this until today, when I found an issue with how a SaaS platform provided transaction dates in their export files.

Here is sample data from a file that contains AP Invoices:


This is a typical date time value, provided in what I generically call "Zulu time" format.  Apparently this format is defined in ISO 8601.

The format includes date and time, separated by the letter T, with a Z at the end, indicating that the time is based on the UTC time zone.

So why do we care?

Until today, I didn't think much of it, as my C# .NET code converts the full date time string to a DateTime value based on the local time zone, something like this:

string docDate = header["invoice-date"].ToString().Trim();
DateTime invoiceDate;
success = DateTime.TryParse(docDate, out invoiceDate);
if (!success)
        Log.Write("Failed to parse date for invoice " + docNumber + ": " + docDate, true);

This seemed to work fine.

But after a few weeks of using this integration, the customer noticed that a few invoices appeared to have the incorrect date.  So an 8/1/2017 invoice would be dated 7/31/2017.  Weird.

Looking at the data this morning, I noticed this in the SaaS data file for the Invoice Date field:


Do you see the problem?

The SaaS vendor is taking the invoice date that the user in Colorado enters, and is simply appending "T06:00:00Z" to the end of all of the invoice dates.

Why is that a problem?

Well, when a user in Colorado enters an invoice dated 8/25/2017, they want the invoice date to be 8/25/2017 (UTC-7 time zone).  When the SaaS vendor adds an arbitrary time stamp of 6am UTC time, my GP integration will dutifully convert that date into 8/24/2017 11pm Colorado time.

For invoices dated 8/25, that may not matter too much, but if the invoice is dated 9/1/2017, the date will get converted to 8/31/2017 and post to the wrong fiscal period.
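
The shift is easy to reproduce.  In this Python sketch (illustrative--the integration itself is C# .NET, and a fixed UTC-7 offset stands in for the local time zone), the 9/1 invoice date comes out as 8/31:

```python
from datetime import datetime, timezone, timedelta

# Fixed UTC-7 offset as a stand-in for the local time zone (illustrative;
# real code would use the server's actual time zone rules, including DST).
LOCAL = timezone(timedelta(hours=-7))

def local_invoice_date(raw):
    """Parse an ISO 8601 'Zulu' timestamp and return the local calendar date."""
    utc = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    return utc.astimezone(LOCAL).date()

print(local_invoice_date("2017-09-01T06:00:00Z"))  # 2017-08-31: the prior day
```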

To make things even more fun, I found that the SaaS vendor is also storing other dates in local time.


So I have to be careful about which dates I convert from UTC to local time, which ones I truncate to just the date, and which ones are already local time.  In theory, the .NET date parsing should handle the conversion properly, assuming the time zone is correct, but I now know that I have to keep an eye on the vendor data.
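
For fields like this invoice date, where the appended time is meaningless, one defensive option (a sketch of the idea, not the integration's actual code) is to ignore the timestamp entirely and parse only the date portion:

```python
from datetime import datetime

def date_from_saas(raw):
    """Truncate a value like '2017-09-01T06:00:00Z' at the 'T' and
    parse the remainder as a plain calendar date--no time zone math."""
    return datetime.strptime(raw.split("T")[0], "%Y-%m-%d").date()

print(date_from_saas("2017-09-01T06:00:00Z"))  # 2017-09-01, as entered
```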

I will be contacting the vendor to have them fix the issue with the invoice dates--there is no good reason why they should be appending "T06:00:00Z" to dates.

Expect to see a lot more of this date format and related date issues as more customers adopt cloud-based solutions and services.


Tuesday, October 24, 2017

Why I don't accept foreign checks (aka North American banking is a mess)

By Steve Endow

Several years ago, I received a paper check, in the mail, from a Dynamics partner in Canada.  The partner was paying my US dollar invoice, and thought they were doing me a favor by drafting the check from their US dollar bank account at their Canadian bank.

Send a check in US dollars to pay a USD invoice--makes sense, right?


I attempted to deposit the check at my local Bank of America branch using the ATM.  The ATM would not accept the check.  So I went inside the bank, stood in line, and then told the teller I wanted to deposit the check.  The teller looked at the check, and confusion ensued.

Eventually a manager came over and explained to me, with full confidence, and in no uncertain terms, that they were unable to accept the check.  He explained that the problem was not that the check was from a Canadian bank.  He said that the problem was that the Canadian check was issued in US Dollars.  He claimed that because the country of origin did not match the check currency, the branch could not accept the check.  That's the policy. (No, it isn't.)  So how can I deposit the check?

The manager handed me a special envelope and a triplicate carbon copy form.  He said I needed to fill out the form and mail it with the check to a super special obscure department at Bank of America called "Foreign Clean Collections"--whatever that means.  Once that department received the check, it would review it and coordinate with the foreign bank to get the funds transferred.  This process would take 6-8 WEEKS.

You're kidding me, right?  Nope.

So, being curious about this banking train wreck, I gave it a try.  I filled out the form and mailed the USD $1,000 check off to the super special department.

A few weeks later, a deposit shows up in my account for $800.  Yup, $200 less than the check.  In addition to having to wait several weeks for the deposit, I was charged $200 in bank fees!

After that nightmare, I stopped accepting any foreign checks.  I put a big red note on my invoice that says that I only accept credit cards and wire transfers from non-US customers. And guess what: That process has been working just fine for years.

This week, a Canadian partner didn't read my invoice, and didn't read my email with the invoice, and they mailed me a paper check.  The check is from their Canadian bank, issued in US Dollars.  Great.

So I contacted a colleague who regularly receives Canadian checks, and she said that she routinely deposits Canadian checks issued in USD at her local BofA branch without any issues.  Huh.

But having paid my $200 entrance fee to the Bank of America Foreign Clean Collections Club, I wasn't about to just deposit this new check, wait several weeks, and see how much I get charged.

So I did the obvious thing:  I called my local Bank of America branch.

First customer service rep:  "Sorry, I don't deal with those things. Let me transfer you to our back office."  Apparently the back office doesn't have voicemail and is out to lunch at 9am, as the phone rang for 3 minutes with no answer.  I tried calling the branch back, but this time nobody answered and I got a voice response system.  So the local bank branches are useless when inquiring about these things.

So I then called the main BofA customer service 800 number.  I spoke with someone who tried very hard to help, but she was unable to find any information and her computer and phone were unable to contact the department who might be able to help.  So she gave me the phone number to the Bank of America Foreign Exchange Call Center.

I then directly called the illustrious Foreign Exchange Call Center and spoke with someone who, for the first time, sounded like he understood the mysterious process of depositing foreign checks with Bank of America.

"Can I deposit this Canadian check drafted in US Dollars at my local California branch?", I asked

"Every check is reviewed on a case by case basis.", he replied

What?  What does that even mean?

"Every check is reviewed on a case by case basis.", he replied

So you have no consistent policy about depositing foreign checks?

"Yes, we have a very consistent policy that I just explained to you.  Every check is reviewed on a case by case basis.", he replied

After speaking with him for several minutes and apparently annoying him, here is my understanding of the official Bank of America policy / procedure for foreign checks.

1. Acceptance of a foreign check is completely up to the discretion of the BofA branch, and the inconsistent and incorrect training that a teller or branch manager may have received.  The branch can simply say they don't accept foreign checks. Or they can conjure up an excuse as to why they can't accept the check, like "the country of origin does not match the check currency".

2. If the branch is willing to try to accept the check, they can scan the check in their "system".  This "system" then determines if Bank of America is willing to accept the check at that branch.  Apparently this involves super secret algorithms about my "relationship" with the bank, the physical check, the bank that issued the check, the country of origin, the currency, the amount, etc. 

3. If the "system" determines that the branch can accept the specific check, apparently the check will be deposited in a fairly normal manner.

4. If the "system" determines that the branch cannot accept the check, then the magical process with the Foreign Clean Collections department kicks in, and you get the multi-part form, special envelope, a 6-8 WEEK processing time, and hundreds of dollars in fees that you will not be able to determine in advance.

5. The representative claimed that Bank of America only charges a flat $40 for the Foreign Clean Collections process, but that the issuing bank can charge their own fees for having to process the foreign check.  In my case, I was charged around USD $150 by the issuing Canadian bank just for the honor of cashing their USD check.  There is realistically no way for you to know how much the foreign bank will charge in advance.

6. I asked the representative how I was supposed to accept payments given the uncertainty and fees involved in this process.  He told me that they recommend wire transfers for foreign payments, and basically told me not to accept foreign checks.

What a shocking conclusion.

Naturally, I have received several responses from people saying that they accept foreign checks all the time at their bank and never have an issue.  Good for you, I say, enjoy the 1900s!  The Pony Express loves you!

I rarely receive such checks, don't want to have to drive to the bank to deposit them, and don't want to deal with clueless bank employees and the nightmare game-of-chance process outlined above.

Checks are a vestigial organ of banking and are a testament to the absurdly anachronistic North American banking system.  Talk to someone from any country with a modern banking system and ask them how many checks they issue.  "Checks?  What?" will be the response.  People from Singapore and Australia literally laugh in disbelief when I mention that the US still uses paper checks.

Wire transfers have been well established since the late 1800s and now provide same day international funds transfers, usually for a reasonable fixed fee.  Credit cards are a de facto payment method for a massive volume of transactions in many countries, and have benefits like fraud protection and points, and the merchant pays the fees for those transactions--which I am happy to do.

And services like the excellent TransferWise provide very low cost EFT funds transfers to dozens of countries with an excellent exchange rate.

The only explanation I have for why North American consumers and businesses seem to cling to checks is that our backwards banking system does not (yet) charge fees to shuffle around millions of pieces of paper with ink on them, pay the postage to mail them, scan those papers into digital images, and then perform an electronic funds transfer behind the scenes.  But banks do charge a fee if customers initiate a payment electronically through EFT / ACH or a wire transfer, where no paper is involved.  It's crazy.

So, after wasting a few more hours researching this topic, I now have a clear decree, straight from the heart of Bank of America, and will continue to accept only credit card and wire transfer payments from non-US customers.  If it's good enough for the rest of the world, it's good enough for me.
