Friday, October 30, 2015

Dynamics GP SOP Return transaction distributions: A Puzzling Observation

By Steve Endow

While updating an eConnect SOP integration, I observed something that I thought was strange.

My integration was originally designed to only import Invoice transactions, but the client now needed it to also handle Return transactions.  Okay, no problem.

So I made the small changes necessary to import SOP Returns as well.  The import ran just fine, importing both Invoices and Returns, and I thought all was well.

While testing the new integration, the customer noticed that the invoice distributions were not being calculated correctly--I had forgotten to reverse the distributions for the Returns.  Not a problem, simple fix.

But while testing the import and checking the corrected distributions, I noticed something that puzzled me.  I would open the Sales Distribution Entry window for the Return, and everything looked fine:  CR Receivables, DR Sales.


But when I clicked OK, I received this message.


Huh.  Okay.  So I'm not familiar enough with SOP distributions to understand the logic behind this, but for some reason, GP requires that SOP Returns have COGS and INV distribution lines prior to posting, while Invoices do not.

If I clicked on the Default button, GP populated the window with the distributions that it was expecting.

Okay, so I learned something, but that part made sense.  I could deal with that requirement and understand it.

But here's where things got strange.

When the client tested with this same transaction, they did not get the COGS distribution dialog.  They only needed to have two distribution lines:  RECV and SALES.  GP was not requiring them to have a COGS and INV distribution.


I checked with the client, and they didn't know why they didn't need the COGS and INV distributions.  So I checked with the client's GP partner.

The initial thought was that maybe I didn't have default accounts for my test inventory items in Fabrikam.  It turns out the COGS account was blank, so I set that default account.  But my eConnect import was still only set up to import Sales and AR distributions, so that didn't affect the imported distributions, and GP still insisted that I have the COGS and INV distributions.

The next thought was that maybe the Item accounts weren't defaulting.  But those were defaulting fine and were populated.

While I was searching for an answer, the client confirmed that their Return transactions do not require, and do not have, COGS and INV distributions--at all.

The GP partner then suggested that the distributions were simply not visible in the client's environment--GP was generating COGS and INV behind the scenes, but just didn't display them on the Inquiry windows or the Edit List reports.  This seemed unlikely to me, since my Edit List report was pretty explicit about those two distributions being missing.


The client even tracked down a GL JE from a posted Return and confirmed that AR, Sales Tax, and Revenue were the only three accounts affected.  So there were no hidden distributions.

I didn't buy any of the explanations so far.  They just didn't seem to make sense given the symptoms I was seeing.

So I manually entered a simple Return transaction in Fabrikam and took a look at the Distributions.  My Return with a single line using a test item had a COGS and INV distribution amount of $0.01.


So why were those distribution lines $0.01?  Because the item had a Standard Cost of $0.01.


So this told me that GP was requiring the Return to have COGS and INV equal to the Standard Cost of the item.  But the client didn't have COGS or INV distributions....

Okay, so what if the Standard Cost of the item was $0.00?  That's a great question--let's give that a try.

In GP 2010, you can just change the Standard Cost, but in GP 2013 R2, you have to use the Change Item Standard Cost utility.


And before you change the Standard Cost, this window will require that the item not be present on any purchase orders or unposted receipts/invoices.  Okay, done.

So I then changed the Standard Cost to $0.00.

And behold.  When I then entered a SOP Return, Dynamics GP no longer required me to have COGS and INV distributions.


So why would the client have a standard cost of $0.00 for inventory items?  That actually makes sense, and seems obvious in hindsight, as they sell a subscription service that does not have COGS or inventory value.  (Why they track those subscriptions as inventory items, I don't know...that's a different research project.)

I'm now assuming that all of the client's inventory items are zero cost and will therefore never need COGS and INV distributions.
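If that assumption needs verifying, a quick query against the item master should show any items that still carry a cost (a sketch assuming the standard IV00101 Item Master table and its STNDCOST column):

```sql
-- List any inventory items that carry a nonzero Standard Cost
-- (IV00101 is the GP Item Master table; STNDCOST is the Standard Cost)
SELECT ITEMNMBR, ITEMDESC, STNDCOST
FROM IV00101
WHERE STNDCOST <> 0
ORDER BY ITEMNMBR;
```

If this returns no rows, SOP Returns for those items shouldn't require COGS and INV distribution lines.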

So that was a multi-hour journey of puzzlement over a $0.01 Standard Cost.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter






Missing buttons on the SOP Transaction window? Turn off the Action Pane OR Use the Go To button

By Steve Endow

UPDATE:  Thanks to Ian Grieve for letting me know that when the Action Pane is enabled, the buttons on the SOP Transaction Entry window will be listed under the Go To button on the Action Pane.  Mystery solved.













Thursday, October 29, 2015

If you add your SQL files to Visual Studio, you may want to remove the sql exclusion from gitignore

By Steve Endow

I'm not sure why I haven't run across this issue before--perhaps it has always happened and I didn't notice?

I recently migrated an old Visual Studio solution from SVN to Git in Visual Studio 2013.  After committing the project, I noticed that there were two files still listed under "Included Changes" of the Team Explorer tab.


Puzzled, I checked the files in my solution.  They had the small blue plus sign next to them, indicating they had been added.

I tried to commit the solution again, but Visual Studio told me:

An error occurred. Detailed message: No changes; nothing to commit.

I Googled this, and not surprisingly, I ended up on a Stack Overflow thread discussing this issue.

It pointed me to the gitignore settings file and sure enough, there was the culprit.


And that explains why my SQL files were not committed.  Here is what my project looks like:


I have a folder named SQL where I store the SQL scripts related to the project.  I started doing this so that I could have a centralized place where all code, including SQL, could be tracked, versioned, and backed up.

So while I found the problem, it makes me wonder--why is sql/ excluded by default in the gitignore file?  This thread mentions that the SQL exclusion is apparently a "best practice", but there is no such reference in the docs, and I find that puzzling, at least in my situation.  Am I doing something wrong?  I can't imagine that it's some taboo practice to have a SQL folder in my project to store my SQL files.
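For reference, the fix is just a gitignore edit--a sketch below, noting that the exact pattern in your generated gitignore file may differ, so check what's actually there before editing:

```
# In .gitignore (or the global ignore file), find the line that
# excludes the SQL folder, for example:
#
#   sql/
#
# The simplest fix is to delete that line.  Alternatively, a negation
# rule added after it re-includes the folder:
!sql/
```

Note that already-ignored files won't show up as changes until the ignore rule is removed and the files are staged again.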

What do other people do to manage their SQL scripts related to a project?

So now I have to go back through my other projects that I've set up with Git and see if the SQL folder and all of the SQL files have been ignored.  And I need to remember to modify my global gitignore on every one of my dev servers in my Hyper-V stable.

The learning never stops.  Maybe I need a Git sensei.


May you have many happy commits in your future.





Visual Studio 2013 Git source control meltdown

By Steve Endow

I admit that I'm probably at least 50% responsible for my own mess in this case, but it was an interesting and educational journey (aka a harrowing few hours).  This is my cautionary tale.

I created a project in Visual Studio 2010 back in April for a client that is on GP 2010.  Since Visual Studio 2010 doesn't have native Git support, I jumped through all of the hoops and somehow figured out all of the convoluted steps to get Git working with VS 2010.  I was proud of myself and thought I had it working well.  Let's just say I wasn't quite the superhero I thought I was.


6 months later, the client is now upgrading to GP 2015, so I needed to upgrade the project to GP 2015.  No problem, that should take maybe an hour tops, right?

Except that this is a Project Accounting integration, and I didn't want to install and configure GP 2015 with PA on my old GP 2010 server.  No problem, I've got a GP 2015 server with PA configured, so I'll just use the magic of Git to clone my source code repository and start development on the GP 2015 PA machine, which also has Visual Studio 2013.

I clone my repository and I'm feeling confident.  I update the references for GP 2015, update the project version, and I'm feeling proud.  Until I try and compile.  I get an error that a method isn't valid.  Hmmm.

I check the method, and see that it's in a different class.  I go to open the class...and...it isn't in my project. Ummmm.  That's odd.

I double and triple check, and the class is definitely not in the project.  I login to Bitbucket and check the code online, and the class isn't there either.  So how could I have written code, that works, that references a method in a class that doesn't exist?  Seems unlikely.

I then login to my old GP 2010 PA server and check the old project.  Behold, the class is in the project.  That's pretty strange, since I'm pretty good about committing my code regularly, and definitely when I prepare a release.

So I attempt to re-commit the old GP 2010 project using the Git Extensions tools with Visual Studio 2010.  No matter what I try, the class will not show up on Bitbucket.  Since I haven't yet learned Git command line kung fu, I'm stuck.  With VS 2013, I've seen an issue where a file randomly ended up in the "Untracked Files" section, but I wasn't able to find anything similar in this case on VS 2010.

So I go with Plan B: Copy the class from the old server to my new server, and rely on VS 2013 to get it committed to the branch.  Easy peasy, right?

So I do that, and it gets checked in and things are looking peachy.  I then create a new GP 2015 branch and start re-upgrading the project.  All done, for the second time, with my updates, I commit and push my changes.  I then try and switch back to my master branch...and it all goes downhill again.

I received an error while switching between branches.  Unfortunately I didn't screen shot that error, but after that happened, I was stuck in git branch purgatory.  VS thought I was in my GP 2015 branch, but the code was all from my old master branch.  And if I tried to switch branches, I would get errors like these.



After fighting these for an hour, I vented on Twitter and much to my surprise, I received a response from Buck Hodges, the Microsoft Director of Engineering for Visual Studio Online.  He recommended making sure I was on VS 2013 Update 5, and also strongly recommended VS 2015 for its improved TFS and Git functionality.

I downloaded and installed VS 2013 Update 5 and was eventually able to get around the "there are uncommitted changes" error, but I just wasn't able to get back to a valid version of my GP 2015 branch.

So, after several hours wasted, I did what I should have done earlier and started from scratch.  I copied the entire VS 2010 project from my old server, created a brand new Git repository, checked everything in using VS 2013, created a new GP 2015 branch, and made all of my changes...for the third time.

The new repository is now fine--just like every other new repo I've created with VS 2013.  I don't know what all I did wrong, but I am pretty sure that my problems started with Git on VS 2010.  I'll never do that again since I know that I just don't have the Git "skillz" to dig myself out of a hole.

The only small consolation is that I'm not the only person who has experienced this issue.  I found a few threads on Stack Overflow that let me know I wasn't the first person to blaze this trail.

Thankfully my projects are pretty simple, so as long as I have a good copy of the code, I'm usually able to scrap a repository and start from scratch.

Good luck, and always try to have a backup of that source code...




Friday, October 23, 2015

GP 2015 DAG.exe crashes: Could not load type 'Microsoft.Dynamics.GP.BusinessObjects.Payroll.Timecard'

By Steve Endow

I've been aware of the Dynamics GP Visual Studio Tools "Dictionary Assembly Generator" (dag.exe) for quite a while, but surprisingly I've never had to use it before.  I haven't had to use VS Tools with modified windows or third party dictionaries, so I've always been able to just access the standard GP forms with VS Tools.

But I have a new project where I need to access some modified GP windows, so I need to use the DAG to get to the custom fields.

The client is on GP 2015, so I get the VS Tools SDK installed on my GP 2015 server, set up the command line, and then run the DAG.

And I get a dag-splosion.  (dag-plosion?)

It starts up okay and looks like it's working fine, but then at the "Generating IL code" step, around 69%, kaboom.


I had GP 2015 R2 (14.00.0725) installed from the R2 iso, and had installed the VS Tools SDK from that iso.  I tried installing the latest October hotfix (14.00.0817), but that didn't help.

My next random test will be to downgrade and try installing the VS Tools SDK and perhaps also GP 2015 from the RTM iso.

Otherwise, I've got nothin'.

If anyone has any suggestions, I'm all ears...


UPDATE: I found a few forum posts with a similar DAG problem (but different GP versions), and those threads mentioned having to copy some files from the GP or AddIns directory into the VS Tools SDK folder to get the DAG to work.  David Musgrave also indicated that in his case, he had to copy the BusinessObjects.dll file.

After trial and error, selectively copying individual DLLs, I found that I had to copy 3 DLL files from my GP 2015 folder to the VS Tools SDK folder to get the DAG to work:

Microsoft.Dynamics.GP.BusinessObjects.dll
Microsoft.Dynamics.GP.ManagementReporterServices.dll
Microsoft.Dynamics.GP.Svc.Application.dll

With those files copied, the DAG no longer crashes for me during the Generating IL code stage.

However, when it gets to the "Creating Application.MicrosoftDynamicsGp.dll" step, the process stays at 99% for several minutes.  It seems like it has hung, but it did eventually finish.


So I hope this DAG-tastic solution helps some other poor soul.



http://www.precipioservices.com

Friday Fun: Dynamics GP lets you delete a Salesperson ID that is assigned to customers, causing eConnect error

By Steve Endow

I thought this was interesting.

A client received the following puzzling eConnect error when updating a customer address:

Error Number = 249  
Stored Procedure= taUpdateCreateCustomerRcd  
Error Description = The Salesperson does not exist in the Salesperson Master Table
Node Identifier Parameters: taUpdateCreateCustomerRcd
CUSTNMBR = SUPERCUSTOMER
SLPRSNID = Note: This parameter was not passed in, no value for the parameter will be returned.



<taUpdateCreateCustomerRcd>
  <CUSTNMBR>SUPERCUSTOMER</CUSTNMBR>
  <CUSTNAME>Super Duper Customer</CUSTNAME>
  <ADRSCODE>MAILING</ADRSCODE>
  <ADDRESS1>111 Super Duper Lane</ADDRESS1>
  <CITY>Los Angeles</CITY>
  <STATE>CA</STATE>
  <ZIPCODE>90001</ZIPCODE>
  <CCode>USA</CCode>
  <COUNTRY>United States</COUNTRY>
  <PHNUMBR1>1231231234</PHNUMBR1>
</taUpdateCreateCustomerRcd>



In the error, eConnect is complaining that the Salesperson does not exist, yet indicates that the Salesperson ID was not submitted as part of the customer update.

What gives?

The client explained that a salesperson had two Salesperson ID records in GP, so they deleted one.  Well, the deleted Salesperson ID was assigned to customers.  When eConnect attempted to update the customer with the now deleted Salesperson ID, the update failed, because eConnect validates the Salesperson ID regardless of which customer fields are updated, and that Salesperson ID no longer existed.

Surprised by this whole situation, I created a new Salesperson in GP, assigned it to a customer, then deleted the Salesperson.

Sure enough, GP let me delete the Salesperson without any warning or error.  And that left my customer record with an invalid Salesperson ID assigned.

In this screen shot, notice that STEVE2 is assigned to my customer, but no longer exists in the salesperson list.


I then made a small change to the customer and saved it, keeping the invalid Salesperson ID.  It saved fine, telling me that the Customer Maintenance window does not validate the Salesperson ID, so it saved the customer with the invalid Salesperson ID value.  Lovely.

So now my client needs to run a query that checks all of the Salesperson ID values assigned to customers to see if any others are invalid values.
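A query along these lines should find them (a sketch assuming the standard RM00101 Customer Master and RM00301 Salesperson Master tables):

```sql
-- Customers assigned a Salesperson ID that no longer exists
-- in the Salesperson Master table
SELECT c.CUSTNMBR, c.CUSTNAME, c.SLPRSNID
FROM RM00101 c
WHERE c.SLPRSNID <> ''
    AND NOT EXISTS (SELECT 1 FROM RM00301 s
                    WHERE s.SLPRSNID = c.SLPRSNID)
ORDER BY c.CUSTNMBR;
```

Any rows returned would need their Salesperson ID cleared or reassigned before eConnect customer updates will succeed.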

Fun stuff.

Happy Friday.



Tuesday, October 20, 2015

eConnect Error: Voucher Number already exists in PM Keys Master Table PM00400

By Steve Endow

At GPUG Summit 2015, I gave a presentation, "5 Tips for Developing GP Integrations".  After the presentation, two attendees indicated that they were receiving these types of errors when importing AP vouchers using eConnect:

Error Number = 305  Stored Procedure= taPMTransactionInsert  Error Description = Document Number (DOCNUMBR) already exists in the PM00400, PM10000, PM20000, PM30200, POP10300 or the MC020103 Table

Error Number = 306  Stored Procedure= taPMTransactionInsert  Error Description = Voucher Number (VCHNUMWK) already exists in PM Keys Master Table - PM00400


I'm very familiar with this error, as it occurred on my first eConnect integration, which I believe was with Great Plains 7.5, not too long after eConnect was released.

I could be wrong, since I was new to eConnect back then, but my recollection at the time was that there was no eConnect method or stored procedure to get the next voucher number.  I attempted to reverse engineer and reproduce the code that the Great Plains client used to get the next voucher number, but I apparently didn't have the SQL skills at the time to prevent duplicate numbers from being issued to my voucher integrations while 15 accounts payable clerks were pushing invoices into Great Plains all day.

However, in the years since, and now that I'm using the taGetPMNextVoucherNumber procedure, I haven't seen that error.  Perhaps all of my AP integrations since then have been in environments where there isn't much other AP import or entry activity, but my impression was that the taGetPMNextVoucherNumber stored procedure is pretty robust.

However, since I had two people in my GPUG Summit session tell me they were getting this issue, out of curiosity, I just set up 3 instances of an AP voucher import and had each of them import 1,000 vouchers at the same time.  They were running fine for a while, but then it happened.


Sure enough, having 3 simultaneous eConnect voucher imports produced the error pretty quickly.  So I opened up the procedure to see if I could tell how this could occur.

If you look at the main statement in the stored procedure, it would appear to be locking the PM next number table:

SELECT @O_vCNTRLNUM = NTVCHNUM
FROM PM40100 WITH (TABLOCKX HOLDLOCK);

However, looking at the rest of the procedure, I noticed that something was missing: The stored procedure doesn't use a SQL transaction!

So the procedure locks the table when selecting the next voucher number from PM40100, but after retrieving that next number it moves on to validate the number.  When it moves on, without an explicit transaction, this unlocks the PM40100 table, allowing another process to grab the same number from PM40100 before the incremented number can be written to the table.

As a test, I modified the stored procedure to add a transaction, along with appropriate rollbacks and a commit.
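Conceptually, the modified procedure looks something like this (a simplified sketch, not the actual procedure code--@NextNumber stands in for the procedure's real variables, and the validation logic between the select and the update is omitted):

```sql
BEGIN TRANSACTION;

-- Grab the next voucher number while holding an exclusive table lock
SELECT @O_vCNTRLNUM = NTVCHNUM
FROM PM40100 WITH (TABLOCKX HOLDLOCK);

-- ...validate the number and calculate the incremented value...

-- Write the incremented number back before releasing the lock
UPDATE PM40100 SET NTVCHNUM = @NextNumber;

IF @@ERROR <> 0
BEGIN
    ROLLBACK TRANSACTION;
    RETURN;
END

-- With an explicit transaction, the TABLOCKX HOLDLOCK is held until
-- the commit, so no other process can read the same voucher number
-- before it is incremented
COMMIT TRANSACTION;
```

The key point is that HOLDLOCK only holds the lock for the duration of the enclosing transaction; without an explicit transaction, the lock is released as soon as the SELECT statement completes.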

With the modified taGetPMNextVoucherNumber procedure in place, I launched 6 instances of my eConnect Batch Load Test tool and imported a total of 30,000 AP vouchers over 5 different runs.  The error didn't occur.



So it appears that adding a proper SQL transaction to the taGetPMNextVoucherNumber stored procedure will resolve the issue of duplicate voucher numbers during an eConnect import.





Monday, October 19, 2015

Thousands of orphaned records in SY00800 and SY00801 cause batch posting to be very slow

By Steve Endow

I just finished a call with a Post Master Enterprise customer who was experiencing posting issues and errors in one of their Dynamics GP companies.

Batches would post fine in Company 1, but when Post Master tried to post a batch in Company 2, the batch would take 5 minutes to complete and conclude with a strange error, when it should take less than 20 seconds.

The customer's environment is moderately complex, with numerous complex system integrations running constantly, so we initially thought that maybe the integration was locking some tables.  But the same integration was importing data into both Company 1 and Company 2, so that didn't seem to explain why only one company was slow when posting.  We even tried turning off the integration, but batches still posted slowly.

I then ran a diagnostic database trace utility that is included in Post Master, and I happened to notice that there were a ton of records in the SY00800 table.  We queried the table and found that there were almost 2,500 orphaned records in the SY00800 related to Company 2.  We then checked SY00801, and similarly, there were 2,500 orphaned records in that table as well.



We deleted the Company 2 records from both SY00800 and SY00801 and performance instantly improved.  A few test batches posted in under 20 seconds.  No more errors.  Deleting those orphaned records appears to have resolved the issue.

After working with Post Master and learning about the Dynamics GP posting process for over 6 years now, I still run into very strange situations with client environments that leave me baffled.

For example, how is it possible to get 2,500 orphaned records in both SY00800 and SY00801?  The mind boggles.

Anyway, it's probably a good idea to occasionally check those tables and clear them out if you see a lot of orphaned records.
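If you want to check your own system, both tables live in the DYNAMICS system database, and the record counts can be grouped by company (a sketch assuming the standard CMPANYID column, which maps to the companies in SY01500):

```sql
-- SY00800 = Batch Activity, SY00801 = Resource Activity
-- Records for a company with no users logged in and no posting in
-- progress are candidates for cleanup (back up before deleting!)
SELECT CMPANYID, COUNT(*) AS Records
FROM DYNAMICS..SY00800
GROUP BY CMPANYID;

SELECT CMPANYID, COUNT(*) AS Records
FROM DYNAMICS..SY00801
GROUP BY CMPANYID;
```

As always with the system tables, verify that no users are logged in and no posting is in progress before clearing anything.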

Happy hunting.




Thursday, October 8, 2015

Dynamics GP "new code" message appears every time you launch GP

By Steve Endow

This week I upgraded and messed with my GP 2015 installation as part of a support case.  At some point, I started to get the "New code must be added" dialog box every time I launched GP.


I clicked on Yes and then logged into GP, but it appeared the next time I launched GP.  I tried Run as Administrator, clicked Yes, and logged in.  But it still kept appearing.

I finally stopped and thought for a second.  I searched my GP 2015 directory for *.cnk files.  And there I found tautil.cnk.  I don't know how it got there or why it was there.  I already had the TAUTIL.DIC dictionary in the directory, and had it listed in the Dynamics.set file.

My only guess is that perhaps the dictionary was updated as part of the hotfix I installed, and for some reason it failed to unchunk.

I deleted the tautil.cnk file and the New code message went away.

Problem solved, mystery remains.

UPDATE: A reader makes a good point, which I also pondered later: the better approach is probably to rename the existing dictionary and let GP unchunk the cnk file, in case the dictionary was not successfully updated.  In my case, since it was only PSTL / TAUTIL, and it was on my development machine, it wasn't a concern, but in a real environment, renaming the existing dictionary is probably the more prudent approach.






Monday, October 5, 2015

GP 2015 VBA UserInfoGet "Permission Denied" error resolved in hotfix and 2015 R2

By Steve Endow

I don't do much Modifier & VBA anymore (mostly VS Tools for customizations), and have only had a handful of customers upgrade to GP 2015, which is probably why I haven't run into this error until today.

While testing an upgraded VBA customization on GP 2015 today, I received this error.

Run-time error '70':  Permission Denied

When I clicked on Debug, I was taken to the call to UserInfoGet.CreateADOConnection.


Thankfully there is a Community Forum post on this error, and the discussion indicates that this was an issue with GP 2015 RTM.  A post on that thread indicates that GP 2015 version 14.00.0661 resolves the issue, and I see that I even posted to the thread confirming that the problem was resolved with 661.

But today the problem occurred for me while running 0661, and I just confirmed that my client is running 0661 and is receiving the error.

After upgrading to the October 2015 Hotfix (KB 3089874, 14.00.0817, post R2 update), the error went away and my VBA code and UserInfoGet calls work fine.

So it seems that somehow the problem can still occur with version 14.00.0661, but is resolved in a newer hotfix.





Oddities: Missing Project Accounting Accounts

So we all know (or are constantly reminded) that the simplest answer is usually correct.  So when a client recently called to report that their project accounting accounts were missing (PA43001), we immediately started brainstorming "simple" explanations...

1. Maybe they aren't really "gone" (no, they really are gone)
2. Maybe somebody removed them (but how, I mean it's just the COGS accounts and they are simply gone)
3. Maybe someone working on another support case inadvertently removed them (not likely, as they were there at 2pm and gone at 4pm, and no cases were being worked in that time)

To be noted, the client is on GP2013 R2/YE 2014 (12.00.1801).

My coworkers sometimes give me a hard time because the last thing I tend to consider is an actual bug in the software.  The reason I avoid that explanation is that it is too easy and often is not the case.  And particularly when data just disappears with no other process, issue, or explanation, it doesn't seem likely that the software just decided to dump the data without provocation.  So I like to make sure we exhaust other routes.  We went about fixing the issue in this case, restoring the accounts, but it was still bothersome that we could offer no explanation as to why it happened (again, racking my brain for a simple answer).

So we started a case, and found out that this is indeed a quality report (#9120, to be exact)--an apparent bug in GP that several clients have reported but Microsoft has been unable to replicate.  Odd.  Very odd.  The good news is twofold: first, of the clients who have reported it, no one has had a second occurrence.  And second, Microsoft GP support has a script you can run to create a shadow table that will track the project posting accounts table, so you can monitor whether it recurs.

What's the lesson in this?  The simplest explanation may just be that it is indeed a software bug.  I guess that's it?

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, October 2, 2015

Finding batches with errors across all Dynamics GP company databases

By Steve Endow

A customer contacted me with an interesting problem.  They utilize Post Master Enterprise to automatically post batches in over 150 company databases.  The automatic batch posting is working fine, but they occasionally have some batches with errors that go into batch recovery.  Post Master sends them an email message for each batch that goes into recovery, but with over 150 company databases, they wanted a way to generate a list of all problem batches across all of their company databases.

I haven't done a ton of research into batch recovery and how GP detects which batches to list in the batch recovery window, but based on a quick check of some failed batches in Fabrikam, it looks like the BCHSTTUS field in the SY00500 table was a good place to start.

This KB article lists the different values for the BCHSTTUS field.

https://support.microsoft.com/en-us/kb/852420


So now we can create a query like this:

SELECT * FROM TWO..SY00500 WHERE BCHSTTUS > 6


That's a start, but it isn't a great solution if we need to run the query in over 150 different databases.

So after some digging, I found this StackOverflow thread and used the last suggestion on the thread.


CREATE TABLE #tempgpquery
(
[DB] VARCHAR(50), 
[Records] INT
)

DECLARE @db_name varchar(10)
DECLARE c_db_names CURSOR FOR
SELECT INTERID FROM DYNAMICS..SY01500

OPEN c_db_names

FETCH c_db_names INTO @db_name

WHILE @@Fetch_Status = 0
BEGIN
  EXEC('
    INSERT INTO #tempgpquery
    SELECT ''' + @db_name + ''',COUNT(*) FROM ' + @db_name + '..SY00500 WHERE BCHSTTUS > 6
  ')
  FETCH c_db_names INTO @db_name
END

CLOSE c_db_names
DEALLOCATE c_db_names

SELECT * FROM #tempgpquery

DROP TABLE #tempgpquery


It looks complex due to the temp table and cursor, but it's actually a fairly straightforward query.

You can modify the query to do whatever you need; in this case it just does a count of records in SY00500 where the batch status is greater than 6.  I queried the list of valid company database names from the SY01500 table, and used that list in a cursor to loop through each database and query it.

It seems to work very well.

But like I said, I'm not 100% sure if the Batch Status field is the only indicator of batch recovery, so if anyone has more info on the query to properly detect batches that have gone to recovery, please let me know.


UPDATE:  The very clever Tim Wappat took up the challenge to find a simpler and cleaner way to perform the query.  He uses the novel approach of building a UNION ALL statement that is replicated through a join against sys.databases.  The results are the same, but his query avoids the use of both the temp table and cursor.  It is rather compact, which makes it a little more difficult to decipher, but it is pretty elegant.  For his superior submission, Tim wins 100 Internet Points.


DECLARE @sql NVARCHAR(MAX);

SET @sql = N'DECLARE @cmd NVARCHAR(MAX); SET @cmd = N'''';';

SELECT @sql = @sql + N'SELECT @cmd = @cmd + N''UNION ALL
SELECT ''''' + QUOTENAME(name) + ''''', COUNT(*) FROM ' 
  + QUOTENAME(name) + '.dbo.SY00500 WHERE BCHSTTUS > 6 ''
WHERE  EXISTS (SELECT 1 FROM ' + QUOTENAME(name) 
 + '.sys.tables AS t
 INNER JOIN ' + QUOTENAME(name) + '.sys.schemas AS s
 ON t.[schema_id] = s.[schema_id]
 WHERE t.name  = N''SY00500''
 AND s.name  = N''dbo'');'
FROM sys.databases WHERE database_id > 4 AND state = 0;

SET @sql = @sql + N';
SET @cmd = STUFF(@cmd, 1, 10, '''');
PRINT @cmd;
EXEC sp_executesql @cmd;';

PRINT @sql;
EXEC sp_executesql @sql;


