Sunday, January 25, 2009
After a bit of tedious coding to get all of the bank transaction types mapped correctly, the import worked well. Because we did not want the transactions to affect the GL, I made sure to turn off Post To and Post Through for both Bank Deposits and Bank Transactions. I then ran the import into the new production company, we verified the counts and amounts, and all looked good.
But the next day the client noticed several unposted GL batches, with batch IDs corresponding to each of the companies that I imported. They appeared to be related to bank transactions, as the description on some of the JEs said "Bank Transaction", and they had a type of CMTRX. After researching the transactions, I confirmed that they were created by my eConnect import. But I had made absolutely sure to turn off Post To as well as Post Through!
So I did a test by turning off posting, and then manually entering a single bank transaction. As expected, no posting to GL, and no GL batch. Hmmm.
So then I re-ran the eConnect integration into a test company, making sure that Post To and Post Through were unchecked. Sure enough, a single batch with 99 JEs was created for the 99 bank transactions that were imported.
I then thought that it might be an implicit option driven by the field values in my code. So I commented out the taBRBankTransactionHeader.BACHNUMB field, thinking that perhaps providing a batch number caused eConnect to create the GL batch regardless of the posting settings. I recompiled and ran the import into the test company again.
This time I got 99 GL batches! Without a batch number, eConnect reverts to posting each bank transaction separately, thereby creating a separate batch for each bank transaction.
So, from what I have observed:
1) eConnect will always create a GL batch for bank deposits and bank transactions, even if Post To is unchecked in the posting settings in GP. If you are doing a data migration using eConnect, be on the lookout for those batches so that they can be deleted.
2) eConnect provides you with the option of specifying a batch number for bank transactions, or not, depending on how you want the transactions to be posted in the GL. I can't imagine anyone wanting to have separate GL batches for each bank transaction, but I guess anything is possible.
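For the data-migration cleanup mentioned in point 1, a query along these lines can help list the batches to review and delete. This is only a sketch: I am assuming SY00500 as the batch header table in the company database and its usual column names, so verify them against your own install before relying on it.

```sql
-- List batches in the company database so the CMTRX batches created by
-- the eConnect import can be identified and deleted.
-- (SY00500 as the batch header table is an assumption; check your install.)
SELECT BACHNUMB, BCHSOURC, SERIES, NUMOFTRX
FROM   SY00500
ORDER  BY BACHNUMB
```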
Thursday, January 22, 2009
After restoring a copy of your production database into a test database, you should run the script offered in KB 871973 (PartnerSource login required) (which I call the 'refresh test company script'). This SQL script supposedly updates all references to the database name so that everything points to your test database.
Recently while working with a test database in GP 10, I found one database reference that is not updated by the 'refresh' script. It starts with an error while trying to create or retrieve an Employee record:
A save operation on table 'UPR_MSTR' cannot find the table.
When I traced the SQL that was causing the error, I tested it and got this error message:
Msg 208, Level 16, State 1, Procedure zDT_UPR00100U, Line 2
Invalid object name 'PROD.dbo.UPR00100'.
It took me a while to realize that "zDT" means it is a trigger, but once I figured that out, I went to the UPR00100 table, opened the triggers folder, and modified the zDT_UPR00100U trigger.
Sure enough, there were references in the trigger that included the old production database name (PROD.dbo.UPR00100).
I simply removed the PROD.dbo reference, executed the Alter script, and everything worked fine after that.
ALTER TRIGGER [dbo].[zDT_UPR00100U] ON [TEST].[dbo].[UPR00100]
AFTER UPDATE AS /* 10.0.320.0 */
set nocount on
BEGIN
    UPDATE UPR00100
    SET DEX_ROW_TS = GETUTCDATE()
    FROM UPR00100, inserted
    WHERE UPR00100.EMPLOYID = inserted.EMPLOYID
END
set nocount off
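To catch anything else the refresh script may have missed, a query like the following (run in the test database; 'PROD' is the old production name from my example) will list every trigger, procedure, or view whose definition still references the old database name:

```sql
-- Find any module whose definition still mentions the old production
-- database name (run this in the TEST database; substitute your own name).
SELECT OBJECT_NAME(object_id) AS object_name
FROM   sys.sql_modules
WHERE  definition LIKE '%PROD.dbo%'
```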
So if you have to work with employee records in a test environment, something else to remember!
Tuesday, January 20, 2009
Ever run into trouble because you can't tell what service pack or hotfix has been applied at a client site? Ever have a machine drive you nuts because of GP version issues? Ever try to create a test environment with copies of client databases only to get error after error like:
"You are attempting to configure an older version of product 2992. Database 10.0.1061 Code 10.0.774.", or "Product 0 does not support upgrading from version 10.0.1237."
I used to print PDFs of every service pack web page on PartnerSource to try and have some record of the different versions (since 10.0.903 doesn't mean much to me), but that doesn't work for hotfixes and year end updates, which throw plenty of odd versions into the mix.
Finally, finally, finally Microsoft has answered my pleas. Today while trying to deal with such errors, I happened to notice that there is a shiny new comprehensive Dynamics GP 10 version list available in Excel format (PartnerSource login required). It is posted on the GP 10 service pack page, but only as a "Note", so I'm not sure how long it has been there and if I've missed it previously.
It lists every version number, for every module / dictionary, for every GP 10 release, from RTM to good old "KB 961354", aka SP3 with the 2nd version of the 2008 Year End and Canadian Payroll update!
With this list, I was able to immediately locate version 10.0.1237 and download the latest version of SP3 with the year end updates, thereby solving my version errors.
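As a related aside, the database-side version that shows up in these errors appears to live in the DYNAMICS system database. This is purely a sketch from my own poking around (the DU000020 / DB_Upgrade table name is an assumption on my part), so treat it as a starting point:

```sql
-- Database-side version info per product, stored in the DYNAMICS
-- system database. Product 0 is Dynamics GP itself, per the
-- "Product 0" error message above. Table name assumed; verify first.
SELECT *
FROM   DYNAMICS..DU000020
WHERE  PRODID = 0
```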
It truly is a Happy New Year.
(And thanks to the poor soul who had to compile all of those version numbers! Nice job!)
UPDATE: Doug Pitcher made the same discovery a few weeks before I did, and posted on his blog that the full version list for GP 9 is also available. Thanks for the tip!
I recently made a copy of a base VM with SQL Server, Visual Studio, and Office, but without GP. I then used Sysinternals NewSid on the VM to change the machine name and SID. After a reboot, I installed GP and attempted to run GP Utilities.
GP Utilities displayed several errors that didn't tell me much, but then I finally received a message that referenced "DEV1\GP10". DEV1 is the name of my base VM that I had copied. Even though I had renamed the machine, GP Utilities was still picking up the old machine name for the SQL Server.
This is a common issue that comes up when you change the name of a SQL Server machine. Even though the new name appears in SQL Server Management Studio, the @@SERVERNAME function still returns the old server name. (SELECT @@SERVERNAME)
I have dealt with this before, but I always forget the command to update the SQL Server name. Here for posterity are the simple commands to perform the update (example assumes an instance name of GP10):
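A sketch of those commands, assuming the old name DEV1 from my example and a placeholder new machine name of NEWDEV1:

```sql
-- Drop the stale server name, then register the current machine name
-- as the 'local' server. NEWDEV1 is a placeholder for the renamed machine;
-- GP10 is the instance name per the example above.
EXEC sp_dropserver 'DEV1\GP10'
EXEC sp_addserver 'NEWDEV1\GP10', 'local'
```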
Great, I thought, all done with that, and ahead with GP Utilities. But, to my surprise, I then received a new message indicating that the server had NO name at all, which was a new one for me.
After Googling a bit more, I found a tip that you need to restart the SQL Server service for a change to the 'local' server name to take effect.
After checking Books Online, I see that this caveat is mentioned under sp_addserver, but it is easy to miss.
Thursday, January 8, 2009
I completely understand the change from an inventory perspective, since it has always been a difficult task to reconcile inventory to the general ledger when distributions could be modified in summary, therefore "breaking" any association between the line items and the resulting distributions. But, for the many clients that use the POP module without using inventory, this change has created a more complicated process.
At first, the majority of the complaints we received about it were related to the fact that it is more time consuming to access the Item Detail expansion than to simply edit the distributions in summary. Annoying, yes. But, yesterday, I had a client with an even more exasperating situation. In their case, they have been routinely coding single items (non inventory, of course) to multiple accounts. So, their PO may have one item, a service contract as an example, that needs to post to a variety of accounts. And, of course, it is not a fixed thing that could be addressed with an allocation account.
There is a workaround provided by Microsoft, but of course it is not ideal as it creates extraneous debit/credit transactions in the GL for non inventory items:
For non inventory items:
- In Purchasing Distribution Entry, create a third distribution line with the same PURCH account but using an OTHER distribution type. Use this line to "wash" the original PURCH distribution.
- Enter desired distributions using the OTHER distribution type.
For inventory items, I think the complaints are less frequent because most users do have items posting to default accounts, but there are workarounds per Microsoft there as well.
For inventory items (three different options):
- Use fixed or variable allocation accounts
- Receive quantities on multiple shipments, changing account on Purchase Order as needed
- Change account on Item Maintenance or in Posting Account Setup to the appropriate default account
So, what to do? I would love to hear your thoughts, and the reactions you have gotten on this change (as necessary as it may have been from an inventory perspective).
UPDATE: Brenner from Willoware just let me know that they have included a fix for this in their GP PowerPack, www.WilloWare.com. "Not only will it allow you to again edit the PURCH distribution, but it also has a new "default" option which will calculate one distribution line per line-item received." Thanks Brenner!
Tuesday, January 6, 2009
2. Hedge your bets: make a backup of the reports.dic before applying the GP 10 SP3/year-end update, and then rebuild the reports dictionary after applying it.
3. Remember, your user date in Dynamics GP has to be in the new year to do a check run after closing 2008.
4. Backup, backup, and oh, yeah, backup. And keep them...one from pre-year end, and one from post-year end.
5. Allow enough time to apply the updates, AND check your year end wage file.
6. In a crunch, you can let one person in to run a payroll without applying the updates to other workstations. Of course, the un-updated workstations cannot be used until the update is applied. But this will help you in a crazy pinch. And, I know, I know, it is contrary to what the guide tells you.
Now, if I can only manage to remember these little tidbits NEXT YEAR :)
She showed me that on one machine, if she tried to run an integration, it would appear to start and then would display the error "Unable to load SanScript Library". Like any good consultant, I immediately checked the KnowledgeBase and found a few articles that discussed the following possible causes and resolutions...
1. Corrupt install or corrupt IM.dic: Uninstall, clean registry, and then reinstall
2. Corrupt integration: Recreate integration
I knew she had been through #1 several times, so I tried #2 and still received the error. I tried using a backup IM.MDB, and still received the error. Hmmm.
So I started to think about the corrupt IM.dic mentioned in the article. And I started to think about how many of the articles mentioned the error when opening an integration, but she did not get the error until she actually ran the integration (and it appeared to run at first before displaying the error). It was as if it was having trouble pushing the data through the Dynamics GP client, rather than an issue with the Integration Manager application itself. And then I remembered that she launched Integration Manager from the Start menu, because the option under Tools>>Integrate was not available to her.
I bet many of you already know where I am heading with this. I checked her Dynamics.set file (her launch file, through Tools>>Setup>>System>>Edit Launch File) and, sure enough, Integration Manager was not included in it. The IM.dic was where it should be, in the GP directory, but the Dynamics.set was not even looking for it. We modified the launch file to include Integration Manager (by copying from the Dynamics.set on a working workstation), and the integration now runs perfectly.
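For reference, a Dynamics.set file is just a plain-text list: a product count on the first line, then ID/name pairs, then the dictionary paths. The fragment below is only a sketch of the shape of the change; the IM product ID and the dictionary paths are placeholders, so copy the real lines from a working workstation (as we did) rather than typing them in by hand:

```
2
0
Microsoft Dynamics GP
<IM product ID>
Integration Manager
Windows
:C:Program Files/.../Dynamics.dic
:C:Program Files/.../FORMS.DIC
:C:Program Files/.../REPORTS.DIC
:C:Program Files/.../IM.dic
:C:Program Files/.../<IM forms dictionary>
:C:Program Files/.../<IM reports dictionary>
```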
This was on GP 9; I am not sure if others have run into it on GP 10. But I thought it was an interesting find that I should share with others, to spare you the troubleshooting should you come across it. Happy New Year!