Thursday, August 6, 2009

Sending XML Files to eConnect

I have been working on an eConnect integration that needs to import up to 40,000 sales transactions a day into Dynamics GP 9.

The challenge is that the data comes from an off-site eCommerce web site, so it has to be exported from that site and then transferred over for import into GP.

After considering the pros and cons of CSV and XML, we chose XML because we thought that it would be easier to use XML to handle the header, lines, taxes, and payments in one file rather than having to deal with multiple CSV files. In this respect, we were right--it is easier to have one XML file to process vs. four CSV files. But in other respects, most of which I didn't anticipate, sending XML files to eConnect has some challenges.

When the client offered to prepare and export the XML files for me in the eConnect format, I thought it was great, since it would save me time--I would only need to read the XML file and just pass it over to eConnect. Simple, right? Famous last words.

It turns out that, for this project, importing XML files was moderately complex and posed some challenges.

My first challenge was trying to validate the structure of the XML files that the client was creating. Normally the XML would be created by the eConnect .NET Serialization assemblies, in which case you never have to worry about the physical format of the XML file. But when the XML is being created by another system, I thought it would make sense to try and validate the format before import.

I thought this would be pretty straightforward, since eConnect provides XSD files that you can use to verify your XML files. After hours of trying to get the XSD validation to work, and a few posts to the great gurus at Experts Exchange, I realized that the XSD validation process was surprisingly impractical and useless for my project. The eConnect XSD files utilize a few options that make the validation so rigid that I was unable to use it. And more surprising, because of the way XSD validation is performed, the process only returns one error at a time. In XML files with 2,000 to 4,000 transactions each, validating and fixing the XML file format one error at a time was impossible. I ended up just sending the XML file to eConnect and using its more informative errors to get the XML file format correct.
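
For what it's worth, the validation attempt itself is only a few lines of .NET code. Here is a rough sketch of the approach I was trying; the schema file name and paths are just placeholders from my project, so substitute your own.

// Rough sketch: validate an eConnect XML file against one of the eConnect XSDs.
// The schema file name and file paths below are placeholders - use your own.
using System;
using System.Xml;
using System.Xml.Schema;

class XsdValidationSketch
{
    static void Main()
    {
        XmlReaderSettings settings = new XmlReaderSettings();
        settings.ValidationType = ValidationType.Schema;
        settings.Schemas.Add(null, @"C:\eConnect\XSD\SOPTransactionType.xsd");

        // Report each validation message as the reader hits it, rather than throwing.
        settings.ValidationEventHandler += (sender, e) =>
            Console.WriteLine("{0}: {1}", e.Severity, e.Message);

        using (XmlReader reader = XmlReader.Create(@"C:\Imports\SopInvoices.xml", settings))
        {
            while (reader.Read()) { }  // read to the end so every node gets validated
        }
    }
}

In theory the handler reports each problem as the reader reaches it, but as I described above, the eConnect schemas were too rigid for this to be practical on my files.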

Next, I had to be absolutely sure that eConnect would accept and process XML files with multiple transactions. Processing 40,000 transactions a day using individual transaction files was completely impractical, so we really needed to be able to send at least several hundred transactions in each file. I thought it would be possible, but with eConnect, until I see it working, I don't make any assumptions. And I also didn't know how sending large files to eConnect would affect the performance of the integration. Fortunately, once we worked out all of the kinks in the XML format, I was able to send a single XML file with 2,000 transactions to eConnect and all of the invoices were created successfully. I have since been able to send a single SOP invoice file with 4,700 transactions and have it import successfully. (see caveat below)
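
For anyone curious, the hand-off to eConnect is not much code. Below is a rough sketch of the call; the namespace and the exact eConnect_EntryPoint overload differ between eConnect 9 and 10, and the server and file names are placeholders, so treat this as an outline for your own install rather than finished code.

// Sketch of handing a pre-built XML file to eConnect. The namespace and the
// exact eConnect_EntryPoint overload vary by eConnect version - adjust to your install.
using System;
using System.IO;
using Microsoft.GreatPlains.eConnect;   // Microsoft.Dynamics.GP.eConnect on eConnect 10

class SendToEConnect
{
    static void Main()
    {
        // GPSQL and TWO are placeholders for the SQL server and company database.
        string connectionString =
            "data source=GPSQL;initial catalog=TWO;integrated security=SSPI";

        // One file can hold hundreds or thousands of <SOPTransactionType> nodes.
        string xml = File.ReadAllText(@"C:\Imports\SopInvoices.xml");

        eConnectMethods eConnect = new eConnectMethods();
        try
        {
            eConnect.eConnect_EntryPoint(connectionString,
                EnumTypes.ConnectionStringType.SqlClient,
                xml,
                EnumTypes.SchemaValidationType.None);
        }
        catch (Exception ex)
        {
            // eConnect's error text is far more specific than the XSD validation messages.
            Console.WriteLine(ex.Message);
        }
    }
}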

And of course, there is data validation. When you send a single file with thousands of transactions to eConnect, a single data error in that file will cause the entire file import to fail. So it is critical to validate the file thoroughly before attempting to import it, checking most of the node values to confirm that they will not cause a failure. I am validating customer IDs, item numbers, item types, SOP types, transaction dates, document subtotals vs. line totals, and tax IDs. I also have to make sure that the items have valid price lists, item site records exist, and that the ship method is valid. Fortunately this is easy to do with XML and the validation is fairly fast.
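
As an example, here is a simplified sketch of the customer ID check. The node name, tables, and connection string are from my integration, so treat them as placeholders for your own scenario.

// Simplified sketch: confirm every CUSTNMBR in the file exists in the GP customer
// master (RM00101) before the file is handed to eConnect. GPSQL and TWO are placeholders.
using System;
using System.Data.SqlClient;
using System.Xml;

class PreImportValidation
{
    static void Main()
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(@"C:\Imports\SopInvoices.xml");

        using (SqlConnection conn = new SqlConnection(
            "data source=GPSQL;initial catalog=TWO;integrated security=SSPI"))
        {
            conn.Open();
            foreach (XmlNode node in doc.SelectNodes("//taSopHdrIvcInsert/CUSTNMBR"))
            {
                using (SqlCommand cmd = new SqlCommand(
                    "SELECT COUNT(*) FROM RM00101 WHERE CUSTNMBR = @cust", conn))
                {
                    cmd.Parameters.AddWithValue("@cust", node.InnerText);
                    if ((int)cmd.ExecuteScalar() == 0)
                        Console.WriteLine("Invalid customer ID: " + node.InnerText);
                }
            }
        }
    }
}

The same pattern covers item numbers, SOP types, tax IDs, price lists, and item site records; it is just a different query for each check.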

And then came the dreaded eConnect "System Error" messages. I haven't run into this error yet with eConnect 10, but it is definitely an issue in GP 9. A System Error is returned by eConnect for various reasons, but unfortunately eConnect doesn't provide any additional information, so you have to determine the cause through guessing and trial and error. The most common reason for the eConnect System Error is that an XML node is missing. In some cases, nodes are only required under certain circumstances, so although the eConnect help file says that a node is optional, it may actually be required for your particular transaction scenario. So after hours of testing and troubleshooting, we were finally able to identify the causes of the System Errors. In some cases, a node was missing, and in other cases, it was because some of the nodes were not in the exact order that eConnect expected (XSD theoretically checks for this, but as I described, it just isn't practical).

So all is well and I'm able to process the large XML files on my development server. But when we deployed to the client's server, we received System Errors yet again. We are able to process small files (with 250 transactions) on the client server, but the file with 4,700 transactions fails. I guessed it was because of some slight configuration difference between the client's GP environment and mine, and that one or more transactions were causing the error.

So I wrote a new import routine that would take the single XML file and import each transaction individually. This way we could find the individual transactions that were causing the failure. Once we installed the new integration and tested it, to our surprise, all 4,700 individual transactions imported fine. Our only conclusion is that there must be something on the client's server that caused the large files to fail--a problem that I don't have on my development server. We don't yet know how many transactions we can include in a file, but we will be trying different file sizes to see what will work consistently.
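
The splitting logic itself is straightforward, since each transaction lives in its own node under the eConnect root. Here is a rough sketch of the approach, assuming SOP invoices (swap in the appropriate transaction type node for other document types, and adjust the placeholder paths):

// Rough sketch: break one large eConnect file into single-transaction documents so
// each one can be sent (and fail) on its own. Assumes SOP invoices, where each
// transaction is a <SOPTransactionType> node under the <eConnect> root.
using System.IO;
using System.Xml;

class SplitEConnectFile
{
    static void Main()
    {
        XmlDocument source = new XmlDocument();
        source.Load(@"C:\Imports\SopInvoices.xml");
        Directory.CreateDirectory(@"C:\Imports\Split");

        int i = 0;
        foreach (XmlNode transaction in source.SelectNodes("/eConnect/SOPTransactionType"))
        {
            // Wrap each single transaction in its own <eConnect> root.
            XmlDocument single = new XmlDocument();
            XmlNode root = single.AppendChild(single.CreateElement("eConnect"));
            root.AppendChild(single.ImportNode(transaction, true));

            // Each small document is then passed to eConnect by itself, so a bad
            // transaction only fails on its own instead of killing the whole file.
            single.Save(@"C:\Imports\Split\transaction_" + (++i) + ".xml");
        }
    }
}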

Finally, while doing live imports, I discovered a new quirk of eConnect 9 that surprised me. After importing SOP invoices, we were importing SOP Payments (taCreateSopPaymentInsertRecord) from a separate file. My validation routine detected that there was one error in a file, but somehow the entire file imported successfully. A payment for an invoice that did not exist in GP was not rejected by eConnect. I then created a test payment XML file with a fake customer and a fake invoice number. Sure enough, eConnect processed it without returning any errors--but of course the payment record did not appear in GP. Clearly eConnect 9 has a hole here.

So that's a summary of my adventures. Now that I've been through it, if I had a choice, I would generally not recommend trying to create XML files outside of GP and send them to eConnect. Reverse engineering the eConnect XML format is tedious, time-consuming, and error-prone. It is so much easier to use the .NET eConnect Serialization assemblies than to debug and validate XML files. But as this project demonstrated, sometimes the choices are limited, and you just have to go off-roading and figure out new solutions.

Saturday, August 1, 2009

Getting to Know and Love the SQL Server Browser Service

A while ago I wrote about some problems I had logging into GP that were caused by the SQL Server Browser service not running.

I'd like to (belatedly) follow up with an explanation of the history behind the SQL Server Browser service, and why you should make sure it is running on a Dynamics GP SQL 2005 server.

Back in the days of SQL Server 7.0, you could only have one "instance" of SQL Server on a machine. That single SQL Server listened for connections on port 1433. Life was simple, and all was peaceful. (except for the single instance limitation)

With the release of SQL Server 2000, you could install multiple SQL instances on the same machine, effectively allowing you to host multiple SQL Servers on one physical box. This was great, but there was one problem: two or more SQL instances couldn't all listen on port 1433. So a new protocol and dynamic port design was developed that would listen on UDP port 1434 and serve as a directory of sorts, telling clients the names and port numbers of the SQL instances installed on the machine.

If you've ever had to troubleshoot obscure connectivity issues with SQL 2000, you probably know that it can be frustrating to troubleshoot down to the port level, since you could never assume what port a given SQL instance might be using. And there were also some challenges related to configuring the client communication protocols, causing headaches when trying to determine whether named pipes or TCP/IP was the protocol that wasn't working properly.

Well, with SQL Server 2005, a new solution was introduced to help make life easier with regard to SQL Server ports and client connectivity. One element of that solution is the SQL Browser service.

The SQL Browser service helps clients identify and connect to SQL instances, and SQL Server Configuration Manager allows you to specify client connection and protocol settings on the SQL Server, rather than having to configure each client PC individually. When used with the SQL Native Client Driver on workstations, this is a nice improvement, reducing installation and administration effort.
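
To make that concrete, the difference shows up in the connection string: connecting by instance name relies on the Browser service to resolve the instance to a port, while connecting by an explicit port bypasses that lookup. A quick sketch (the server, instance, and database names are just placeholders):

// Placeholders: GPSQL is the server, GP is a named instance, DYNAMICS is the GP system database.
using System.Data.SqlClient;

class BrowserDemo
{
    static void Main()
    {
        // By instance name: the client asks the SQL Browser (UDP 1434) which port GPSQL\GP is using.
        var byName = new SqlConnection(@"Data Source=GPSQL\GP;Initial Catalog=DYNAMICS;Integrated Security=True");

        // By explicit port: no Browser lookup needed, but you have to know (and fix) the port yourself.
        var byPort = new SqlConnection("Data Source=GPSQL,1433;Initial Catalog=DYNAMICS;Integrated Security=True");

        byName.Open();  // a named instance fails with a "server not found" style error if the Browser is stopped
        byName.Close();
        byPort.Open();
        byPort.Close();
    }
}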

Microsoft MSDN has a nice page that briefly covers the background I just described, and provides some additional details on the mechanics and operation of the SQL Browser service.

But, alas, the Achilles Heel of all of this is that the SQL Browser service may not be configured to start automatically when SQL Server is installed. So if you run into obscure problems and spend an hour or two scratching your head wondering why GP won't connect, make sure to add the SQL Browser service to your troubleshooting checklist.
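
If you want to make that check part of a script or utility, something like this small sketch works; "SQLBrowser" is the usual service name, but confirm it on your own server first.

// Small sketch: verify the SQL Server Browser service is running and start it if not.
// "SQLBrowser" is the typical service name - confirm it in the Services console first.
using System;
using System.ServiceProcess;   // add a reference to System.ServiceProcess.dll

class CheckSqlBrowser
{
    static void Main()
    {
        ServiceController browser = new ServiceController("SQLBrowser");
        Console.WriteLine("SQL Browser status: " + browser.Status);

        if (browser.Status == ServiceControllerStatus.Stopped)
        {
            browser.Start();   // requires admin rights on the server
            browser.WaitForStatus(ServiceControllerStatus.Running, TimeSpan.FromSeconds(30));
            Console.WriteLine("SQL Browser started.");
        }
    }
}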

Dynamics GP Window Is Not Visible or Is Hidden

Quick post to share a quirk that I ran into recently. Nothing complex, but it was annoying.

I had been using GP 10 on a remote Terminal Server, and had minimized the main GP 10 window before I disconnected my session. When I logged back into the server, GP would not maximize or resize--it just stayed in the task bar but would not display.

I thought it might be temporary, so I tried closing it, and then relaunching it, with no luck. Even logging out, and logging back in didn't work.

I knew there was a config file storing the window location, but it had been several years since I last ran into this issue.

The setting is in the Dex.ini file, which for GP 10 is now located in the Data subdirectory for GP.

If you search for the WindowPosX and WindowPosY values, you will want to set those to reasonable positive values, like 100:

WindowPosX=100
WindowPosY=100

In my case, the values had both been changed to -32000 for some reason, which explains why I couldn't see GP!

Once I set them both to 100 and saved the Dex.ini, GP launched fine.
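
If this happens often (say, on a Terminal Server with several users), the fix is easy to script as well. Here is a quick sketch that assumes the default GP 10 install path; adjust the path for your environment.

// Quick sketch: reset WindowPosX/WindowPosY in Dex.ini if they have gone negative.
// The path below assumes a default GP 10 install - adjust it for your environment.
using System.IO;

class FixDexIni
{
    static void Main()
    {
        string path = @"C:\Program Files\Microsoft Dynamics\GP\Data\Dex.ini";
        string[] lines = File.ReadAllLines(path);

        for (int i = 0; i < lines.Length; i++)
        {
            if (lines[i].StartsWith("WindowPosX=") || lines[i].StartsWith("WindowPosY="))
            {
                string[] parts = lines[i].Split('=');
                int value;
                if (int.TryParse(parts[1], out value) && value < 0)
                    lines[i] = parts[0] + "=100";   // push the window back onto the screen
            }
        }

        File.WriteAllLines(path, lines);
    }
}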

Monday, July 27, 2009

Kits and BOMs and Components, Oh My!

It all seems simple enough when we look at the basic definition of a Kit and a BOM (Bill of Materials) in Dynamics GP.
  • A Kit is a group of items grouped at the time of sale
  • A BOM is a group of items assembled into a finished good prior to sale

Simple enough, right? I normally tell people that a Kit is a SALES concept, while a BOM is an INVENTORY concept. A Kit is pulled at the time you sell it; a BOM is something that happens before the Item exists to be sold (before it is in Inventory).

Simple as it may seem, this is a frequent subject of debate in discovery sessions, and even among my coworkers and me, as both of these functions can be useful in different situations, leading us to use them in non-traditional ways. This week, I was part of one of those discussions regarding which feature was most appropriate for the situation. And then later this week I had another series of emails with another client debating the same thing. Practically speaking, they both sell Kits. However, they currently use BOM functionality and take advantage of some of the benefits of BOMs that support their business requirements (e.g., control over components, serial number linking, etc.). However, they are also subject to the negative aspects of BOMs (e.g., less visibility to components for reporting, need for prior assembly, etc.). So, what to do?

Well, it seems to come down to which one is the BEST fit given the requirements. So I like to break it down into the pros and cons of each.

Bill of Materials (BOM)
Pros:
  • Control over what is included in the BOM, avoiding substitutions that could affect currency amount pricing
  • Ability to serialize the top level of the BOM, and link back the component serial numbers
  • Ability to track kits that are sometimes pre-assembled (grouped together for quicker picking/packing/shipping) during slow periods
  • Quantities tracked at finished good level (this could be a Pro or a Con, depending on your needs)
Cons:
  • Have to assemble prior to sale (unless you look at the Blue Moon add-on for SOP to BOM linking)
  • No flexibility on sales transaction to manipulate configuration/components
  • Less visibility to the components that make up the item being sold, specifically when dealing with serial numbered components (yes, there is visibility, just not as much as with a kit...my subjective opinion)
  • Packing slips, picking tickets, and other documents show finished good only (not components)
  • Difficulty if the return of a component is allowed, since the component is not listed on the invoice

Kits

Pros:

  • Flexibility during sales transaction entry to substitute items or change the configuration without setting up additional finished items/BOMs
  • Not technically "assembled", so no need to enter a separate transaction to assemble the Kit
  • Packing Slips, Picking Tickets, and other documents can print the Kit item or the Kit item including components
  • All quantities tracked at component level
  • High visibility to components sold (in my opinion, since the components are available, including serial numbers, on the document itself)
  • Ability to base Cost of Goods Sold account on the Kit or the Kit components

Cons:

  • Only has the ability to serialize the components, not the top-level Kit
  • No ability to track pre-assembled quantities
  • Lack of control over changes made during data entry, including substitutions which should impact currency pricing

In my mind, there is no "right" answer, and one method is not necessarily better than the other. But, one method may very well meet the specific client requirements better. And, of course, there are plenty of companies that use both (in a clearly delineated way). So the rule of thumb that I try to use is to make sure that the requirements and process are discussed thoroughly, and that the client understands the pros and cons of each approach.

Please share your own thoughts about the pros and cons of Kits and BOMs; I would love to update this post with any additional comments :)

Friday, July 24, 2009

Simple solutions: Keeping the ego in check

I tend to think that I am the sort of curious person who comes up with creative solutions to complicated problems. And in that way, I am somewhat proud of my troubleshooting abilities. Give me some time, peace, and quiet and I can figure out an issue through my powerful logic and reasoning capabilities--if you didn't catch the sarcasm in that statement, it was dripping with it :)

Today, I came across one of those issues that had me stumped. I assumed it had to be a problem report; after all, I couldn't figure it out (again, sarcasm here). So here was the issue...
  1. Client has used project accounting since 1/1/2009, there has been no change to their fiscal period setup
  2. I needed to export project periodic balances for reconciliation
  3. I ran the project utilities (Utilities>>Project>>Reconcile Periodic, Recreate Periodic)
  4. I checked the results in the PA01304 and there are records, but no balances at all
  5. I checked the actual project data in GP, and there are detail transactions and project balances but no periodic balances
  6. I searched the knowledge base on PartnerSource
  7. No luck
  8. Tried PA check links
  9. Tried more PA reconciles
  10. Tried in other companies, same results
  11. Tried on my system, different results
  12. Hmmm

So, I started a case, knowing that there have been past issues with the reconciles and thinking that maybe there was a known issue I was not finding in my search. Maybe it was a service pack issue, but I really wanted to avoid having to apply service packs just to get the data we needed. And, maybe, if I am honest with myself, I was being a bit lazy about it all.

So Brady at Microsoft has me check the period setup table for project accounting entries: select * from SY40100 where SERIES = 7. And, lo and behold, it returns nothing for project for the year 2009. So he has me go into Fiscal Period Setup (Tools>>Setup>>Company>>Fiscal Periods) and click Calculate and then Save to create the records.

I run the reconciles again, and I have data! I had faith the periodic data would return because the detail was still there, but for it to be such a simple fix was both good and bad. Good because it was quick to resolve, bad because I didn't get there myself :)

So I thought I would share this little nugget with you, just in case you run across this specific issue. And even if you don't, maybe it will remind you of the fact that most issues have simple causes and resolutions...it's just a matter of finding them.

Have a great weekend!

Monday, July 20, 2009

Non-traditional uses of existing products

A longtime client of ours wanted an online expense report system. Of course, I trotted out the normal list of third party products as well as eExpense from Concur. They had already been through a demo of eExpense with their payroll provider and had experienced, how shall I say it, sticker shock. The conversation went something like this...

Me: eExpense is a really great product, lots of functionality.
Client: Yeah, but we really just need a basic product with basic functionality. And the price tag just is not realistic for our needs; we can't justify it.
Me: Let me see what else I can find that might be a compromise.

I did some searching and found that most of the other third-party products fall into about the same general price range. Some were a bit cheaper, but still had monthly subscription costs. So what to do? Then, I started to think creatively. The client already owned Project Accounting, and used it in one of their companies. They did not need to capture expenses by project, but...

Why not implement PS Time and Expense with some basic project setup, and use it to meet their needs? Now, this client was not on Business Ready Licensing, but if they were, the savings would be even better! Bottom line is that in a little over two days of consulting time, and with less than $2000 in licensing costs for PS Time and Expense with user licenses, they have an expense reporting system.

We did a quick configuration of Project Accounting, and documented the setup of projects and cost categories (which is the bulk of the additional setup that they may expand on in the future). The trick was to keep it simple, and not worry about full-blown training on Project Accounting, which was not needed. And the interesting part is that the simplicity is what the client likes (which is sometimes a complaint among full Project Accounting users).

Just thought I would share this as a reminder (one that I need from time to time) that the solution to a complex issue may lie right under your nose, and that sometimes simple is best. I think we as consultants inadvertently lean towards the "coolest" or most functional solution. But we must remember that the ultimate goal is to meet the client's current needs and anticipate their future needs, but not overstate or oversell them.

Wednesday, July 1, 2009

Why it pays to read old quality reports

So, every once in a while, you stumble across an old quality report referenced in a knowledge base article. Usually I look at it, see it was resolved with a service pack to version 6, or 5.5, and I think "oh, well, that's not the problem". But today taught me that sometimes old quality reports will shed some light on a current situation.

So, here is the scoop. A client emails me asking why they see duplicate records in the RM30501 (Commission History table). They are paying commissions when the invoice is posted (not when the invoice is paid). They see something like this in the table:

STDINV2599 with commission amount $100
STDINV2599 with commission amount $100
STDINV2599 with commission amount ($100)

Each entry has its own sequence number. So I do some testing. I enter an invoice, post it, void it, and see if I can come up with the same entries in the RM30501. No luck. So I told the client I needed to take a look on their system. We poke around for a bit, chasing one wild hair after another with no luck.

And then I start searching randomly in the knowledge base and I find an old article referencing an issue on version 6 that was resolved with a service pack. In the scenario they gave, if a payment was voided that was greater than the invoice amount, it would lead to negative commissions appearing on the commission report even if commissions were supposed to be paid when the invoice was posted (not when the invoice was paid). Not quite the issue that started this, but promising enough. So here is what I tested:

1. Enter invoice in SOP and post
2. Pay invoice with payment greater than invoice amount (so a portion of the payment is unapplied) and post
3. Move invoice to history using Paid Transaction Removal (this step will move the commission record from the RM10501 to the RM30501)
4. Transfer commissions to mark the commission as paid
5. Void the payment recorded in step 2 (this step will move the commission record back from the RM30501 to the RM10501, but more importantly, this is where two additional records are created for the commission in the RM10501 - one positive, one negative, so the net is correct but the detail is not)
6. Transfer commissions again, this time a negative commission will appear (even though we are paying commissions when posted not paid, so the voiding of an invoice should not matter)
7. Print Reports>>Sales>>Commission>>Commission Dist by Salesperson and you will see all three transactions for commissions: the original plus the two erroneous entries created by the void

Strange, huh? The commission distribution report is not a huge issue since it nets correctly, although it is troublesome that the detail is incorrect. A bigger issue is the transfer commissions journal, since that incorrectly shows a negative commission transferred and is often used as an entry list for paying commissions from payroll or payables.

Anyway, Microsoft is able to reproduce the issue, so we'll see what resolution they provide. I am a bit curious if the quality report came back after being resolved, which makes me wish I had a version 8 install to test on as well. I will definitely update this blog with any additional news. In the meantime, you may want to monitor things closely if you use the transfer commissions journal to pay commissions. And definitely share any stories you have of long lost quality reports rearing their ugly heads :) I will share any posts I get.

Have a great holiday weekend!