Friday, August 29, 2014

Dynamics GP PM transaction with BCHSOURC = XXPM_Trxent and BACHNUMB = sa

By Steve Endow

A customer received an error when trying to use my AP Payment Apply Library to apply a manual payment to an open vendor invoice.  My library said that the invoice was not found, but the customer sent me a screen shot of the invoice listed in the Payables Transaction Inquiry window.

It looked like this:


That seemed odd.  The invoice was obviously present.  We had the right vendor and the right invoice number.  But my apply library couldn't apply a manual payment to the invoice.

I asked the client to query the invoice record in PM20000, with a query like this.

SELECT * FROM PM20000 WHERE DOCTYPE = 1 AND VENDORID = 'ALLENSON0001' AND DOCNUMBR = 'C100'


Sure enough, the record was there and looked normal.  But I looked again at the data, and found two odd values.


Notice that the batch number is "sa", and the batch source is XXPM_Trxent.

What in the world does that mean?

I Googled "XXPM_Trxent", but only found the DynDeveloper reference on PM10300, which simply said "Real time", and that didn't mean anything to me.

So I then searched the TWO database on my development server and behold, I had one Payables invoice with the same symptoms.

I then inquired on that open invoice in GP and saw the issue.  The invoice was in a foreign currency.


It would seem that those two field values indicate a multi-currency transaction.  I don't know the details of why those values are set differently, but this is relevant to applying AP payments because Dynamics GP does not allow you to apply an unposted AP payment to an open invoice if multi-currency is involved.
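Based on this, it seems possible to detect the condition programmatically before attempting the apply. Here is a sketch of such a check; the connection string and parameter values are placeholders, and it assumes (from the symptoms above, not from any documentation) that the XXPM_Trxent batch source reliably flags these documents:

```csharp
// Hypothetical pre-apply validation for the AP Payment Apply Library.
// The connection string, vendor ID, and document number are placeholders,
// and the XXPM_Trxent check is based only on the symptoms observed above.
using System;
using System.Data.SqlClient;

class MultiCurrencyCheck
{
    static bool IsMultiCurrencyInvoice(string connStr, string vendorId, string docNumber)
    {
        const string sql =
            "SELECT RTRIM(BCHSOURC) FROM PM20000 " +
            "WHERE DOCTYPE = 1 AND VENDORID = @VendorID AND DOCNUMBR = @DocNumber";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@VendorID", vendorId);
            cmd.Parameters.AddWithValue("@DocNumber", docNumber);
            conn.Open();
            object result = cmd.ExecuteScalar();

            // A batch source of XXPM_Trxent (with batch number "sa") appears
            // to indicate a multi-currency document
            return result != null && (string)result == "XXPM_Trxent";
        }
    }
}
```

A more robust check might compare the invoice's currency to the company's functional currency, but I haven't verified that approach against enough data to recommend it.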

If you attempt to apply an AP payment to an invoice with a different currency, you get this message.


You have to post the payment first; you can then apply the open payment to the open invoice, and GP can process the multi-currency calculations and related transactions.

So it appears that I need to add an additional validation rule to my AP Apply Library to check for this situation and provide a specific message for it.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter





If you give a Dynamics GP consultant a new NAS...

By Steve Endow

There was once a Dynamics GP consultant who had a file server in his Global Data Center of his Global Corporate Headquarters.  It was a fine file server, but having worked hard for many years, it started to show warning signs of imminent failure.  Since the file server stored several terabytes of virtual machine backups, file backups, machine backups, and over a decade worth of MSDN and Dynamics GP software, it was pretty important.

So the consultant ordered a Synology DS1813+ NAS with six 4 TB drives to replace it.  And so the tale begins...

When the new Synology arrived, the consultant had to get it unboxed and install the drives.

He then had to make room in his server cabinet in the aforementioned Global Data Center for the new NAS and get it plugged in.

After getting the NAS configured and the drive array initialized, he wanted to enable link aggregation on a few of the NAS's four network ports.  But his gigabit network switch didn't support link aggregation.

So he had to order a new switch.  When the switch arrived, he discovered that it was a previously returned product and someone had put a refurbished switch, of a completely different model, in the new switch box. (People seriously do that kind of thing??)  So he had to return the switch to Amazon and get a replacement.

When the new, proper, switch arrived two days later (Go Prime!), he swapped out the switch in his server cabinet and got everything re-wired.

Now that the new switch was hooked up, he had to configure it.  Late on a weeknight, with the useless switch documentation, he spent several hours trying to figure out how to enable link aggregation.  After losing all network connectivity a few times, he finally found a friendly blog post that explained how to properly configure the switch to use link aggregation with the Synology NAS.  Victory.

With link aggregation enabled, he then had to push the terabytes of data from the old file server, move a few more terabytes of files from his desktop, reconfigure robocopy scripts, and change backup software to all point to the new NAS.  He briefly considered deleting a few terabytes of old files to make his life easier, but what fun would that be?

He was thrilled to do all of this fun stuff.  During his free time.  After 11pm on weekdays.

Everything worked great, but since he had link aggregation on the NAS, he figured he might as well set up NIC teaming on his HyperV server to speed up VHD backups and file transfers.

So he then had to shop for a second server network card.  When it arrived two days later (Bam! Prime!), in between developing Dynamics GP integrations, he went to install it in the HyperV server.  While installing the network card, he noticed that the inside of the server had a nice blanket of dust.

Since the machine was pulled out and opened up, he figured he might as well clean out the dust.  So he rolled out his air compressor and fired it up.

After unplugging the server, setting it outside, and donning a P100 respirator, he blew out the dust, creating an epic dust storm that contributed noticeably to the Los Angeles air pollution index.

With that done, he put the server back and plugged in all of the cables, crossing his fingers that nothing had been fried in the process (been there, done that).  He gave thanks to the computer gods when the server booted up just fine.

Now that the second NIC is installed, he still has to configure NIC teaming and set up another link aggregation group ("LAG") on the switch.

Once that is done, he should be able to sit back, watch data fly across his network, and...

...actually get some REAL work done.




 

Wednesday, August 20, 2014

The difficulties of synchronous integrations with Dynamics GP

By Steve Endow

Let's say that you have a custom "operational" system that runs much of your business.  Let's say it's a transactional system that is specific to your industry--suppose it's an Auto Insurance Policy management system.

So this system manages all of the crazy rules and regulations and requirements for quoting and issuing and managing auto insurance in multiple states, and it also issues invoices.  You have hundreds of insurance agents that sell your insurance in multiple states, so the agents are your customers, and you may invoice them for hundreds or thousands of customer policies per month.

When you issue invoices from the auto insurance system, you want the invoices automatically created in Dynamics GP.  So after you prepare the invoices for the agent in Atlanta, you want to be able to click a button in your operational system that sends all of his invoices to GP.

Your operational system was developed using a non-Microsoft development platform, so you can't directly invoke eConnect, and it can't communicate via web services.  But it can call a COM component.  So great, you create a COM visible .NET assembly that reads data from a database and imports transactions into Dynamics GP via eConnect.  Works great, and you are happy!  You click a button and an invoice magically appears in GP, instantly!  Yay!

This sounds good, except when you use it with real data.  What if that agent in Atlanta has 2,000 invoices?  The import into Dynamics GP will certainly take more than 30 seconds, and could take a few minutes.  So if you click on that magic button, your operational system may hang for several minutes as it waits for a response from the COM component.  Not a good experience for users.

And what if there is an error with 10 invoices and they fail to import into Dynamics GP?  Now you have 1,990 invoices in GP, but you are missing 10.  How do you handle that?  Do you then route the user to some additional window where they can review and potentially correct the issues?  Do you add some additional code to re-import the 10 that failed?  What if the error is due to a missing GL account or Class ID or Item in GP that will take some time to setup?  Is your operational system really the place for users to deal with the integration issues directly?

What if GP is offline or there is an issue that prevents the import from running?  You then get an error when you click on the magic button.

In short, it can get messy.

This scenario is what I call a "synchronous" system integration--sometimes people call it a "real time" integration.  The operational system is attempting to 'directly' import data into Dynamics GP and is waiting for the import to finish so that it can receive a response.  For some simple situations, this might work well--say creating customers or vendors or inventory items.  But for transaction imports, and especially high volume imports, it can become problematic.

I therefore recommend "asynchronous" system integrations.  Usually, this involves some type of data repository that acts as a queue.  The operational system might export a CSV file every hour, or perhaps a CSV file for each batch of invoices, or a file for each customer that needs to be invoiced.  The operational system could also write records to a staging table in SQL Server.

A separate process is then run to import the operational data into Dynamics GP.  It can be scheduled using Task Scheduler, or can be triggered via command line, or it could be user initiated through an EXE or GP AddIn.
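That separate process might look something like this sketch. The staging table, column names, and the ImportInvoice helper are all hypothetical, and the eConnect call itself is omitted; the point is the pattern of polling pending rows and recording each outcome instead of blocking the operational system:

```csharp
// Hypothetical asynchronous import loop: poll a staging table, import each
// pending row into GP, and record the outcome. Table/column names and the
// ImportInvoice helper are illustrative, not from a real schema.
using System;
using System.Data;
using System.Data.SqlClient;

class StagedInvoiceImporter
{
    static void Main()
    {
        // Placeholder connection string and staging schema
        string connStr = "Server=GPSQL;Database=STAGING;Integrated Security=SSPI;";
        DataTable pending = new DataTable();

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT StagingID, InvoiceXml FROM InvoiceStaging WHERE Status = 'Pending'", conn))
        {
            adapter.Fill(pending);
        }

        foreach (DataRow row in pending.Rows)
        {
            try
            {
                // ImportInvoice would wrap the eConnect import (not shown)
                ImportInvoice((string)row["InvoiceXml"]);
                SetStatus(connStr, (int)row["StagingID"], "Imported", null);
            }
            catch (Exception ex)
            {
                // Errors are recorded for later review instead of
                // interrupting the operational system
                SetStatus(connStr, (int)row["StagingID"], "Error", ex.Message);
            }
        }
    }

    static void ImportInvoice(string invoiceXml) { /* eConnect import, not shown */ }

    static void SetStatus(string connStr, int stagingId, string status, string error)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "UPDATE InvoiceStaging SET Status = @Status, ErrorMessage = @Error " +
            "WHERE StagingID = @ID", conn))
        {
            cmd.Parameters.AddWithValue("@Status", status);
            cmd.Parameters.AddWithValue("@Error", (object)error ?? DBNull.Value);
            cmd.Parameters.AddWithValue("@ID", stagingId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```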

This approach decouples the two systems and allows them to operate completely independently, with the integration being a discrete system and process that allows the operational system to interface with Dynamics GP.

One downside to asynchronous integrations is that they aren't "real time".  If you schedule the integration to run every 5 minutes, there is a 5 minute delay between the invoices being created in the operational system and them being imported into GP.  In my experience, there are very few customers that need integrations to run more frequently than every few minutes. (A common exception is with inventory--some customers need receipts and transfers processed immediately to complete a subsequent transaction with that inventory)  And even if you think you need something imported into GP every 30 seconds, don't forget that you have to post transaction batches in GP--so that has to be done manually or using a batch posting tool like Post Master Enterprise, another process which takes some time depending on the batch size.  So "real time" is a misnomer at best.

Yes, you still have to deal with errors and reprocessing failed transactions with asynchronous integrations, but they no longer hold up your operational system, and you have to deal with those errors in either case.

I recently went down this path with a customer; they had to experience it themselves to understand the complications and caveats of the synchronous approach.  I am now providing a new version of the integration that will allow them to trigger the GP import, which will run asynchronously.





Leveraging Extended Pricing For Multiple "Percent Of" Methods

For me, extended pricing is normally brought into the conversation when there is a need for date-specific or promotion-based pricing.  However, a co-worker recently asked about a scenario that made me consider extended pricing in a new way.

In the scenario, the client needed the following price structure:
  • Flat Retail Price
  • Flat Wholesale Price
  • Discounts based on Wholesale Price (% Off)
They did not want to maintain all three of these as flat amounts (which would be needed if using standard Dynamics GP pricing, since you can only have one price "method" per item price list).

So, to accomplish this in Extended Pricing, here is what we did...
  • Establish the Base Price Book to contain the Wholesale Prices of items
    • The Price Type in this case would be Net Price since it is a fixed dollar amount
    • This will allow it to be treated as the List Price of the item
  • Establish a Wholesale Price Sheet
    • The Price Type in this case would be Percent of List at 100%, with Base Adjusted Price On set to the Base Price Book
    • This may seem redundant, but I thought it would give better visibility into the pricing assigned to customers and items
  • Establish a Retail Price Sheet
    • The Price Type in this case would be Net Price, with the fixed dollar amount
  • Establish a Discount Price Sheet(s)
    • The Price Type in this case would be Percent of List, with Base Adjusted Price On Base Price Book
In this case, the client can update the Base Price Book and automatically update any number of discount price sheets at once.
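To make the cascade concrete, here is the arithmetic with made-up numbers.  The wholesale and discount sheets are both percent-of-list against the Base Price Book, so one base price change flows through to every sheet:

```csharp
// Illustrative math only -- the prices and percentages are hypothetical;
// only the percent-of-list relationship reflects the setup described above.
using System;

class ExtendedPricingExample
{
    static void Main()
    {
        decimal basePrice = 100.00m;   // Base Price Book (Net Price, wholesale)
        decimal wholesalePct = 1.00m;  // Wholesale sheet: 100% of list
        decimal discountPct = 0.80m;   // Discount sheet: 80% of list

        Console.WriteLine(basePrice * wholesalePct);  // wholesale price: 100
        Console.WriteLine(basePrice * discountPct);   // discount price: 80

        // Update only the Base Price Book...
        basePrice = 110.00m;

        // ...and every percent-of-list sheet follows automatically
        Console.WriteLine(basePrice * discountPct);   // discount price is now 88
    }
}
```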

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.



Thursday, August 14, 2014

Checking the number of SQL Server connections for SQL connection "Timeout period elapsed" error

By Steve Endow

I recently had to troubleshoot an issue with a Dynamics GP third party product that would hang with the error:
Timeout expired.  The timeout period elapsed prior to obtaining a connection from the pool.  This may have occurred because all pooled connections were in use and max pool size was reached.



Please note the phrase "prior to obtaining a connection from the pool".  This is not a command timeout, it is a connection timeout due to a lack of available connections in the application's connection pool.

To confirm this problem, I ran the following SQL query that I found on some helpful forum post (don't recall where):

SELECT
    DB_NAME(dbid) as DBName,
    COUNT(dbid) as NumberOfConnections,
    loginame as LoginName
FROM
    sys.sysprocesses
WHERE
    dbid > 0
GROUP BY
    dbid, loginame

This query produces a very nice result set that allows you to quickly and easily see active SQL connections.

In my case, the third party application was opening 100 connections to its database, so if you tried to process more than 100 records, it would hang and display the connection timeout error above.

This problem occurred because the application was not properly closing its connections as it looped through hundreds of records.
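The usual fix for this class of bug in ADO.NET code is to wrap each connection in a using block so it is closed and returned to the pool even when an exception occurs.  This is a generic sketch; the table and column names are placeholders, not from the actual third party product:

```csharp
// Sketch of the fix: dispose each connection deterministically so the pool
// never runs dry. Table/column names are hypothetical.
using System.Data.SqlClient;

class RecordProcessor
{
    static void ProcessRecord(string connStr, int recordId)
    {
        // The using blocks guarantee Dispose/Close runs, returning the
        // connection to the pool even if the command throws
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "UPDATE SomeTable SET Processed = 1 WHERE RecordID = @ID", conn))
        {
            cmd.Parameters.AddWithValue("@ID", recordId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }   // Connection released here -- no pool exhaustion after 100 loops
    }
}
```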

The query helped me quickly confirm the issue and point the developers to the source of the error.


UPDATE: Please check out David Musgrave's related post about how this problem can occur with VBA modifications in Dynamics GP:

http://blogs.msdn.com/b/developingfordynamicsgp/archive/2014/04/23/more-on-sql-server-connection-issues-with-microsoft-dynamics-gp.aspx








Wednesday, August 13, 2014

Microsoft eliminates exams and certifications for Dynamics GP, NAV, SL, and RMS

By Steve Endow

Two weeks ago I was coordinating with a Dynamics GP partner to update my Dynamics GP certifications in conjunction with them renewing their MPN certification requirements.

But when one of the consultants tried to schedule an exam on the Prometric web site, the web site would not let her complete the registration process, saying that only vouchers could be used to pay for the Dynamics GP exam.

At the same time, we found some discrepancies on the Microsoft web site regarding the Dynamics GP certification requirements and MPN certification requirements, so the partner inquired with Microsoft.  We were told that some changes were being made to the certification requirements, and that the updates would be communicated shortly.

Today we received an official announcement that Microsoft is eliminating the exams and certifications for most, but not all, of the Dynamics products.
We are announcing the elimination of certification/exam requirements for Microsoft Dynamics GP, Microsoft Dynamics NAV, Microsoft Dynamics SL, Microsoft Dynamics RMS and Microsoft Dynamics C5 effective August 13th. This includes pre-sales and sales assessments as well as implementation methodology and technical certifications covering both SPA and MPN.
Microsoft Dynamics AX and Microsoft Dynamics CRM are not impacted by this announcement.  Assessment and certification requirements remain for these product lines. 
A brief version of the announcement is posted on this Partnersource page:

https://mbs.microsoft.com/partnersource/northamerica/readiness-training/readiness-training-news/MSDexamreqchanges


I have mixed feelings about this change.  A few years ago, Microsoft made such a big push for requiring Dynamics GP certification, and requiring all partners to have several certified consultants on staff.  That produced dramatic changes in the partner channel that affected a lot of people.  Eventually things settled down and the exams and certifications became routine.

This announcement appears to be a 180 degree shift from that prior strategy, and completely abandons exams and certifications.  While this may open the market back up to smaller partners, I now wonder if consulting quality may decrease as a result.  But this assumes that the exams and certifications mattered and actually improved consulting quality--I don't know how we could measure or assess that.

So, welcome to a new phase in the Microsoft Dynamics strategy.

What do you think?  Is this a good thing?  A bad thing?  Neither?






Back to Basics- ODBC connections and third party products

We had an oddity a couple of months ago with a relatively well known third party product.  Here were the symptoms...

  1. User could run a billing generation process when logged in on the server
  2. User could not run process when logged in to their local install
  3. SA could not run process when logged in to their local install
  4. Set up new user, still could not run process
  5. Happened from multiple workstations
  6. Spent hours and hours on the phone with the third party vendor's support, as it was perceived to be a code issue
It seemed like we were just going to have to reinstall the workstation.  But even that didn't make a lot of sense, because we had tried copying over dictionaries and reinstalling the third party product to no avail.  So, duh, what could it be?  The ODBC connection.  We did not do the installation of the workstations, and it appeared that the ODBC connection was most likely set up manually, with settings marked that are not recommended.  Once we corrected the ODBC connection, all was good and right with the world.  So another reminder that the weirdest issues normally have the simplest explanations.

Here's a blog post on how to set up an ODBC connection properly for GP (or let the installation automatically create the ODBC connection to avoid issues):
https://community.dynamics.com/gp/b/azurecurve/archive/2012/09/25/how-to-create-an-odbc-for-microsoft-dynamics-gp-2010.aspx

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Management Reporter- Columns Not Shading Properly

Sometimes the devil is in the details.  Honestly, it seems like ALWAYS the devil is in the details.  There are quite a few community and blog posts floating around regarding column shading in Management Reporter.  So I thought I would summarize my recent experience with this, as it is/was a little trickier than expected.

First, yes, it really is as easy as highlighting the column in the column format and applying shading.

 
After you do that, yes, it looks funny (meaning the whole column does not appear shaded).  But that is okay, and NORMAL.

 
Now, here is where it gets tricky.  If you have applied ANY formatting to your row (including indenting, highlighting, font changes, etc.), then the shading will not take effect.  This is identified as a known quality issue with Microsoft, #352181.  And unfortunately, there is no scheduled fix date yet.  So the only workaround is to recreate the row format without the additional formatting.
 
Results when indenting is applied on the row format...note the rows that are indented are not shaded properly.
 
 
Row format with indenting...
 
 
 
So it seems, for now, you have to choose between formatting (and I mean ANY formatting, not just shading) on the row format -vs- shading on the column format based on your needs.
 
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

 
 


Management Reporter- Date Not Printing Correctly

You may have noticed that the default, in recent versions of Management Reporter, is for reports to open in the web viewer rather than the Report Viewer itself.  Of course, you can change this setting (so that it defaults to the Report Viewer instead) in Report Designer, using Tools>>Options.

You may have noticed, though, that the auto text for date (@DateLong) on a header or footer returns a different result depending on whether the report opens in the web viewer or the Report Viewer.


When opening a report in the web viewer, the date/time settings (that impact the format of the date presentation) will be pulled from the service account that runs the Management Reporter Process service.  But when you generate a report and it opens in the Report Viewer, it will use your local user account's date/time settings.

So, if you plan to use the web viewer extensively, you will want to make sure that the date/time settings are set in the profile of the account that runs the process service.  The default long date format generally includes the day of the week; this is often the element that users want to remove so that the report does not say "Tuesday, March 12, 2019".

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, August 8, 2014

Two gotchas when developing a COM component using C# and .NET 4

By Steve Endow

(Warning: This post is full-developer geek-out technical.  Unless C#, COM, default constructors, and RegAsm are in your vocabulary, you may want to save this one for a future case of insomnia.)


In the last year, I've had two customers request Dynamics GP integrations packaged up as COM DLLs so that their internal applications can call the DLL to import transactions into GP.

The benefit of this approach is that their internal application can control exactly when and how data is imported in GP.  They don't have to wait for a scheduled import or some other external process to trigger the import.

The customer's internal application is creating the data, so they want to start the import as soon as the data is ready, call the import, and have their application verify that everything imported successfully.  The result is a very tightly integrated and streamlined workflow.

Not a problem.  Many applications developed on platforms other than .NET support the older COM software interface standard, and .NET makes it relatively easy to expose .NET assemblies as COM components.  With newer versions of Visual Studio, it is just a single checkbox.


Very simple, right?

I previously developed a COM-visible .NET DLL for a customer, and that's all it took.  But one difference for that project is that the customer asked that I use Visual Basic.  That Visual Basic DLL registered fine, and I had zero COM issues, somewhat to my surprise.

But this time, when my customer tried to register the C# DLL as a COM component, they received this error.

RegAsm : error RA0000 : Failed to load 'C:\path\import.dll' because it is not a valid .NET assembly

So, what was different?  Why was the regasm command failing?

Well, after some puzzling, I realized that two things were different this time.  First, I realized that I developed this DLL using .NET 4.0, whereas the last one was .NET 3.5.  While that may not seem like a big deal, I now know that it makes all the difference.  It turns out that with .NET 4, there is a different version of RegAsm.

So instead of this path for RegAsm.exe

       C:\Windows\Microsoft.NET\Framework\v2.0.50727

You need to use this path with .NET 4:

       C:\Windows\Microsoft.NET\Framework\v4.0.30319


NOTE:  If your library is compiled for x64, you need to use the Framework64 directory:

       C:\Windows\Microsoft.NET\Framework64\v4.0.30319


Okay, so now that I had figured that out, and used the .NET 4 version of RegAsm, I received a different error.

RegAsm : warning RA0000 : No types were registered

Huh.  I tried a few different options for RegAsm, but just couldn't get it to recognize the COM interface.

After some moderately deep Googling, I found a helpful post on Stack Overflow.  The solution poster noted that with C#, you need to have a default constructor for COM.  Um, okay, so...wait a minute, what?  Huh?

And that's when I remembered.  My prior COM project used VB, whereas this one was C#.  Well, one of the things I learned during my transition from VB to C# was that VB essentially hides constructors from you.  And looking at my VB code, it looks like I did a fair amount of digging and added the "Public Sub New()" method, which is required for a COM visible class.

With C#, it is similar, but the syntax is slightly different.

    public class Import
    {

        //Default constructor needed for COM
        public Import()
        {
        }
    }


You just have to add an empty default constructor for the public class in order to make it COM visible.

Obvious, right?  Not so much.

Now that I know, it's no big deal, but when I'm asked to do another COM visible component in a few years, I'm sure I'll forget.  Hence this blog post!

Anyway, after adding the default constructor and recompiling, the DLL registered just fine.










Thursday, August 7, 2014

Useless Dynamics GP Error Messages, Episode #47

By Steve Endow


It's been quite a while since I've had to import package files into Dynamics GP, so when I got this error today when trying to import a large package file into GP 2013, it puzzled me for a few minutes.


Error:  "VBA cannot be initialized. Cannot import this package because it contains VBA components."

If you search for this error, you will likely come across a Microsoft KB article saying that this error means that your VBA install is damaged and you have to fix it.  Um, no, I don't think that's the issue, since I have three GP installs on this server, and all of the VBA customizations work fine, and I can import packages into those other GP installs.

So something else was up.

Thanks to the prolific Dynamics GP Community Forum, I found this thread from 2010 where a clever user recommended checking to see if Modifier was registered in GP.  Hmmmm.

https://community.dynamics.com/gp/f/32/p/31214/72654.aspx

Sure enough, this was a new GP 2013 R2 install, and I hadn't yet entered reg keys.  Once I entered the license keys and saved them, the package imported fine.

What a useless error message.






Wednesday, August 6, 2014

Importing Analytical Accounting data using eConnect

By Steve Endow

I was recently asked for assistance to troubleshoot an eConnect integration that involved importing Dynamics GP transactions with Analytical Accounting distributions.

The user was able to import the transaction, but the imported transaction did not have the associated Analytical Accounting data in GP.  There were a few mistakes in the import, and there were a few missing values that produced some misleading error messages from eConnect, so I thought I would briefly describe how to include Analytical Accounting data in an eConnect import.

First, make sure to fully review the information in the eConnect Programmer's Guide related to the taAnalyticsDistribution node.  There are a few misleading elements and one inconsistency that may trip you up.

Let's review a few key fields in the help file:

DOCNMBR:  This should be the Dynamics GP document number, such as the Journal Entry or Voucher Number.  In the case of a PM voucher, this should not be the vendor's invoice / document number.

DOCTYPE:  This is NOT the same as the numeric doc type value of the main transaction being imported.  If you scroll to the bottom of the taAnalyticsDistribution help page, you will see a list of AA DOCTYPE values that you should use depending on your main transaction type.


For example, if you are importing a PM Voucher, the value should be zero, which is different than the DOCTYPE=1 you would use in your taPMTransactionInsert node.


DistSequence:  If the GL accounts in your distributions are unique, this field is optional.  If you have duplicate accounts in your distribution, you must supply this value, which means you will likely also need to assign the DistSequence value in your transaction distributions so that you can know the values and pass them to AA.

ACTNUMST:  The GL account number for the distribution line related to this AA entry.  I don't understand why this value is marked as Not Required.  Obviously if you are sending in DistSequence, then ACTNUMST would be redundant, but I would recommend always populating this value.


As for the AA distribution values, you also need to ensure that you are passing in valid AA dimension and code values, that the combination is valid, and that the values can be used with the specified GL account.  This is one thing that can make importing AA data so tedious--if the customer is regularly adding AA codes or GL accounts, an import may fail if everything isn't maintained and updated in GP.  I validate all of the AA data prior to importing, as it is much easier for the user than getting eConnect errors.
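For example, validating the dimension/code pair up front might look like this sketch.  I'm assuming the standard AA master tables here (AAG00400 for transaction dimensions, AAG00401 for dimension codes), so verify the table and column names against your GP version before relying on this:

```csharp
// Sketch of pre-import AA validation. Assumes AAG00400 (dimension) and
// AAG00401 (dimension code) -- confirm these names against your GP version.
using System;
using System.Data.SqlClient;

class AAValidator
{
    // Returns true if the AA dimension/code combination exists in GP
    static bool IsValidAACode(string connStr, string dimension, string code)
    {
        const string sql =
            "SELECT COUNT(*) FROM AAG00401 c " +
            "JOIN AAG00400 d ON d.aaTrxDimID = c.aaTrxDimID " +
            "WHERE d.aaTrxDim = @Dimension AND c.aaTrxDimCode = @Code";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@Dimension", dimension);
            cmd.Parameters.AddWithValue("@Code", code);
            conn.Open();
            return (int)cmd.ExecuteScalar() > 0;
        }
    }
}
```

Validating the dimension-to-account assignments would take an additional check, but even this much lets you give the user a clear message instead of a raw eConnect error.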


Here is a snippet of the AA code from a GL JE import I developed.  As long as you properly assign the handful of AA values, it should be fairly straightforward.

// aaDist is an eConnect taAnalyticsDistribution node for the current
// distribution line; aaDists is the collection of AA nodes added to the
// transaction's serialization object
aaDist.DOCNMBR = trxJrnEntry.ToString();
aaDist.DOCTYPE = 0;  //0 = GL JE transaction (see the AA DOCTYPE list above)
aaDist.AMOUNT = Math.Abs(lineAmount);
aaDist.DistSequenceSpecified = true;
aaDist.DistSequence = lineSequence;
aaDist.ACTNUMST = lineAccountNumber;
aaDist.DistRef = lineDescription;
aaDist.NOTETEXT = longDescription;
aaDist.aaTrxDim = aaDimension;
aaDist.aaTrxDimCode = aaCode;
aaDists.Add(aaDist);

