Tuesday, September 28, 2010

Dynamics Database. Company Information.

So, first, I have to say that Steve has been putting me to shame with his prolific blog posts the past few weeks. Believe it or not, I am alive and working. Busy but not crazy, just enjoying the changing of the seasons here. We have had some lovely fall weather the past week, a nice little break before the reality of winter sets in.

In an attempt to challenge Steve's dominance of the blog lately, I thought I would share a reminder of sorts as it relates to the Dynamics database. I know I am guilty of sometimes minimizing the significance of the Dynamics database, particularly when it comes to what version of the database is restored if there is a system failure. Of course, I remind the client that if security settings, yadda, yadda, yadda have changed since the backup then they will lose those changes. But, generally, security settings are not changed all that often and the difference between restoring a Dynamics database from this week versus last week is often minimal if a difference exists at all.

But today I received a reminder of the importance of the Dynamics database when it comes to company setup. Yes, I said company setup. Of course, the vast majority of company setup information is stored in each company's database. But there are a few critical pieces that are actually stored in the Dynamics database. So, if your restore point is prior to when those setup pieces were completed for a company, then you will indeed have issues with that company.

  • Company Master (Setup>>Company>>Company): If you restore the Dynamics database to a point prior to when the company was actually created, it will not be part of the Company Master table and therefore will not be available for selection when logging in or defining security. The best route here (in my humble opinion) is to run through GP Utilities to create the company, and then restore a backup of the company database over the new empty company. This ensures that all system tables are properly populated with the company information. However, this approach does not recover your original company settings stored in the Dynamics database. You would still need to return to Setup>>Company>>Company to complete the setup, and don't forget the critical Options window. The company setup options can have a significant impact on how the company functions with regard to taxes, distributions, and other items.
  • Multicurrency Access (Setup>>System>>Multicurrency Access): This stores the company's access to specific currencies and exchange tables. You might have all of your multicurrency setup completed, but if this piece does not exist in the Dynamics database you will run into multiple errors trying to use multicurrency. As this is a pretty simple access window, setting it up manually if the data is missing is usually the simplest solution.
  • Intercompany (Setup>>System>>Intercompany): If used, this defines the intercompany relationships for Dynamics GP. Again, this is a pretty simple setup window, so the quickest solution is to re-enter the configuration manually if the data is missing for the company.
  • Security (Setup>>System>>Security Roles, Security Tasks, Alt/Mod Forms and Reports, User Security): As I mentioned above, any security defined for the new company will be missing if you restore the Dynamics database to a point prior to the setup being completed.

Well, that's my list of items that are sometimes forgotten when restoring a Dynamics database to a point earlier than the creation/setup of a new company. Of course, let me know if I missed anything or if you have any tips/tricks to share.

Friday, September 24, 2010

Dynamics GP Visual Studio Tools AddIns Won't Load

I have been working with a client over the last week to troubleshoot a problem where VS Tools AddIn DLLs will not load on their Dynamics GP 9 server.

Normally the "installation" of a Dynamics GP add-in is very simple--just copy the DLL to the AddIns directory, then launch GP, and it works.  It's a wonderfully easy and reliable deployment process.

However, with Dynamics GP 9, there are two small caveats.  First, the Visual Studio Tools runtime must be installed.  Second, the AddIns directory does not exist by default, so it must be created.  Pretty simple tasks.

The client dutifully installed the VS Tools runtime, then created the AddIns sub-directory in his GP application directory.  He then installed the third party product, which copied three DLLs to the AddIns directory.

But when he launched GP, the AddIn didn't load.

He then tried uninstalling the add-in, uninstalling the VS Tools Runtime, and then reinstalling both.  Nada.

Thinking that maybe there was a problem with the third party add-in, we tried the sample Freight Estimator add-in.  But even that wouldn't load.

We then checked permissions, but he was logged in as a local Administrator on the Server 2003 machine, and he was using the sa login for GP.  We even checked directory permissions on the Dynamics GP application directory and the AddIns directory, and opened up permissions on both of those to see if that helped.  Still wouldn't work.

We then checked the path of the icons used to launch GP to verify that it was the same directory where the AddIns were installed.  All looked good.


Having exhausted all of the standard troubleshooting steps, we submitted a support case.  The support engineer had us verify all of the same tests to make sure we didn't miss anything.  He then sent us a simple test add-in DLL that would display a message box as soon as it loaded, but even that didn't work.

The client finally had a streaming session with the support engineer who eventually used the Process Monitor tool to trace the behind the scenes GP activity.  In the 137,000 records that scrolled by, the engineer eventually found a strange error.


Dynamics GP was getting a path not found error when trying to read and write to the AddIns folder.  What was strange was that the path that was listed in the Process Monitor log was \\remoteserver\data\Apps\GP9\Dictionaries\AddIns.

The very sharp engineer then took a look at the client's Dynamics.set file.  Everything looked normal, except for one line.  The Dynamics.dic file path was set to the same network share as the Reports.dic and Forms.dic.  For some reason, the client's IT staff had made the Dynamics.dic a shared dictionary.

Putting the two together, he noticed that Dynamics GP was looking for an AddIns directory in the same location as the shared Dynamics.dic file.  He and I thought (assumed?) that GP looked for the AddIns subdirectory in the same location as Dynamics.exe and Dynamics.set, but apparently this is not the case.  It seems that GP expects the AddIns subdirectory to be in the same directory as the Dynamics.dic file.  Normally, the Dynamics dictionary is located in the local GP application directory, and there is no reason to share it, so that assumption usually holds--but only by coincidence.
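To make the lesson concrete, here is a small Python sketch of the rule we stumbled on: resolve the AddIns folder relative to the directory holding Dynamics.dic, not the one holding Dynamics.exe.  This grossly simplifies the Dynamics.set launch file format (it just scans for a line ending in Dynamics.dic), so treat it as an illustration, not a real .set parser:

```python
import ntpath

def addins_path_from_set_text(set_file_text):
    """Return the folder where GP appears to scan for add-in DLLs:
    the directory containing Dynamics.dic, NOT the directory
    containing Dynamics.exe.  (Simplified .set "parsing": we just
    look for a line ending in Dynamics.dic.)"""
    for line in set_file_text.splitlines():
        line = line.strip()
        if line.lower().endswith("dynamics.dic"):
            return ntpath.join(ntpath.dirname(line), "AddIns")
    return None

# A Dynamics.dic that was (unusually) shared on the network, as in
# this client's Dynamics.set:
shared = r"\\remoteserver\data\Apps\GP9\Dictionaries\Dynamics.dic"
```

Run against the client's shared dictionary path, this yields `\\remoteserver\data\Apps\GP9\Dictionaries\AddIns`, which matches the path-not-found errors in the Process Monitor log.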

Having deduced this much, they modified the Dynamics.set file to point to the local Dynamics.dic, and voilà, the AddIns suddenly started working.

Kudos to the Dynamics GP developer tools support team for tracking this one down!

Thursday, September 23, 2010

eConnect 2010 Service Pack 1 Released!

If you haven't heard yet, Dynamics GP 2010 Service Pack 1 was just released.  Make sure to read all that fun fine print before you dive in, as always.


https://mbs.microsoft.com/customersource/downloads/servicepacks/mdgp2010_patchreleases.htm


But of slightly more interest to me at the moment is the news that eConnect 2010 Service Pack 1 was also just released.

https://mbs.microsoft.com/customersource/downloads/servicepacks/MDGP2010_eConnectservicepackshotfixes

There are separate msp files for x86 and x64.

The fix list hasn't been published as of this post, but I'll likely be applying it to my GP 2010 development servers soon.

Partial Explanation of eConnect 2010 "Distributed Transaction" Warnings

In my last post, I discussed an issue where my Windows Event Log filled up with thousands of eConnect 2010 warning messages saying "Distributed transaction was used."

Many thanks to guru Louis for responding to the post and giving me some great info explaining what the eConnect 2010 "Distributed Transaction was used" warning meant and why it might be occurring.

First, just a reminder that eConnect 2010 is working great, and my data was importing into GP 2010 just fine--I'm only receiving warnings, not errors, so they are not interfering with my integration.  I just happened to notice the hundreds of warnings because my Event Log filled up.

Louis explained that for performance reasons, distributed transactions are not used or enabled by default with eConnect 2010, so what I am seeing is eConnect 2010 issuing a warning to let me know that my import is incurring the extra overhead of a distributed transaction.  He then offered some pointers to track down why my import was invoking distributed transactions.

In the case of my customer import, I wasn't intentionally doing anything related to a distributed transaction, and honestly, before today I didn't know enough about them to know why or when they might be used.  I checked my connection string to confirm that it wasn't changing during my import, and I confirmed that I hadn't enabled any settings to turn them on.

I then went ahead and upgraded the eConnect 9 SOP invoice and cash receipt imports that would accompany the customer import.  I was afraid they would have the same problem, but to my surprise, neither wrote a single entry to the eConnect event log.  They worked perfectly.  Hmmm.

So it was only my customer import that issued the warnings, which seemed really strange.  The customer record is quite simple, so I couldn't imagine what would require a transaction.

And then I remembered something.  As part of my customer import, I am importing e-mail addresses.  And as all good students of GP and eConnect know, there is no handy e-mail address field on the customer window.  It's tucked back in the Internet Information window in GP, and in the taCreateInternetAddresses node and CMPInternetAddressType in eConnect.  So...what if sending these with the customer record was somehow causing eConnect to invoke a distributed transaction?

So I commented out the code that adds the internet address type to my customer XML, and re-ran my integration.  Voilà!  No entries in my eConnect event log!  Hmmm, so for some reason the presence of the customer internet addresses is triggering a distributed transaction on one of my servers.
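For readers who want to reproduce the experiment, here is a hedged Python sketch of building the customer document with and without the internet address node.  The element names (taUpdateCreateCustomerRcd, taCreateInternetAddresses, a Master_Type of "CUS") are my reading of the eConnect schema from memory, so treat them as illustrative rather than authoritative--my real integration is .NET, not Python:

```python
import xml.etree.ElementTree as ET

def build_customer_xml(custnmbr, custname, email=None):
    """Build a simplified eConnect-style customer document.  When
    `email` is None, the taCreateInternetAddresses node is omitted
    entirely -- the change that made the warnings stop in my test."""
    root = ET.Element("eConnect")
    cust_type = ET.SubElement(root, "RMCustomerMasterType")
    cust = ET.SubElement(cust_type, "taUpdateCreateCustomerRcd")
    ET.SubElement(cust, "CUSTNMBR").text = custnmbr
    ET.SubElement(cust, "CUSTNAME").text = custname
    if email is not None:
        # E-mail lives in the Internet Information data, not on the
        # customer record itself (field names are assumptions here).
        inet = ET.SubElement(cust_type, "taCreateInternetAddresses")
        ET.SubElement(inet, "Master_ID").text = custnmbr
        ET.SubElement(inet, "Master_Type").text = "CUS"
        ET.SubElement(inet, "INET1").text = email
    return ET.tostring(root, encoding="unicode")
```

Submitting the two variants while watching the eConnect event log is how I isolated the internet address node as the trigger.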

This is a great find, but as the title says, it's only a partial explanation.  This problem only occurs on my 32-bit server, and doesn't occur on my 64-bit server or the client's 64-bit server.  So my guess at this point is that it might be a configuration issue with my 32-bit machine, or a quirk in the 32-bit version of eConnect 2010.

Fortunately it isn't a show stopper, just a head scratcher, so it won't interfere with my client's GP 2010 upgrade schedule. And in the process, I learned a lot more about eConnect 2010 a lot faster than I would have if everything had worked perfectly.

From that learning, I have a few more ideas for eConnect 2010 posts on deck...

eConnect 2010 Fills Event Log with Warning "Distributed Transaction was used"

After converting a Dynamics GP eConnect 9 integration to eConnect 2010, I was doing some testing, and everything looked fine.  Customers were being created in GP 2010, and everything was working smoothly. However, after several rounds of testing, I received an error:

The event log file is full

It took me a few seconds to understand what this meant--was this referring to the Windows Event Log?  This was a bit puzzling, since my integration isn't writing to the Event Log directly.  I opened the Event Viewer and poked around, but didn't see much.  But when I then selected the eConnect event log, I saw hundreds of Warning messages that had filled the log to its default capacity of 512K (set to overwrite events older than 7 days).


All of these eConnect warnings had no Event ID, and all had a Description that starts with this annoying message:

The description for Event ID ( 0 ) in Source ( Microsoft.Dynamics.GP.eConnect ) cannot be found. The local computer may not have the necessary registry information or message DLL files to display messages from a remote computer. You may be able to use the /AUXSOURCE= flag to retrieve this description; see Help and Support for details. 

Finally, halfway down the Description, this information was provided:


The following information is part of the event: Distributed Transaction was used

This could be caused by new connection strings used within each xml document, but reusing the base transaction scope.

Configuration Setting 'ReuseBaseTransaction' is by default FALSE. Remove this configuration setting, or set it to FALSE if this was not the expected behavior.


Uhhhh, okay.  I am not doing anything special regarding "Distributed Transactions", so this didn't mean much to me, and after reviewing the eConnect 2010 Installation guide and Developer's guide, I couldn't find any reference to Distributed Transactions or the ReuseBaseTransaction parameter.

I even checked the Microsoft.Dynamics.GP.eConnect.Service.exe.config, and ReuseBaseTransaction is not a default parameter, so I'm not yet sure where that might be located.

After doing more testing, it appears that for every customer record that I submit to the eConnect 2010 CreateEntity method, over 300 of these warning messages are recorded in the eConnect event log.  Pretty haywire.  To keep the Warnings from causing problems with the integration, I increased the size of the eConnect log and set it to overwrite events as needed.

At 10pm, lacking any other bright ideas, I compose the text for a support case and forward it to the GP partner that I'm working with for submission.  After a few e-mail exchanges, he asks if I've tried the integration on another server.  A brilliant and painfully obvious idea that just didn't occur to me late at night.

So at 11pm, I install a fresh copy of GP 2010 and eConnect 2010 on another server, get my Visual Studio project set up, and then run the integration.  No Warnings.  In fact, no activity at all in the eConnect Event Log.  Clean as a whistle, as my grandmother says.

So, the good news is that the error appears to only occur on one of my servers.  The not so good news is that Server A is a 32-bit Server 2003 machine with SQL 2005, and Server B is a 64-bit Server 2008 machine with SQL 2008.  Since the servers are so different, and since there are separate 32-bit and 64-bit eConnect 2010 installs, it is quite possible that the different configurations played a role in Warning vs. No Warning.  And the bad news is that I don't yet know what the Warning means or what is causing it.


I'll be converting two other integrations to eConnect 2010 and deploying them on the client's test server, so we'll see if the Event Log Warnings occur there.  I'll post an update if I learn more.


UPDATE:  Please see my follow-up post on this topic.


Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

http://www.precipioservices.com

Wednesday, September 22, 2010

eConnect 2010 Error: There was no endpoint listening at net.pipe

If you are working with eConnect 2010, you may run into the following error at some point:

There was no endpoint listening at net.pipe://localhost/Microsoft/Dynamics/GP/eConnect/EntityOperations that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.

There are probably several possible causes of this error, but one of the simple causes is that the eConnect 2010 Integration Service is not running on your server.

On my GP 2010 server, it seems that the eConnect 2010 service installed with a Startup Type of Manual.  I installed it several months ago, but I don't recall changing the setting from Automatic to Manual.

In any event, if you notice that the "eConnect for Microsoft Dynamics GP 2010 Integration Service" is not running, that would likely explain the "no endpoint listening" error that you are receiving.

Just start the eConnect 2010 Integration Service up, and then try your integration again.

If the Integration Service is running on your server, then you can check the Configuration.ServiceAddress value that you are using in your integration and confirm that you are using a valid URL.

Other than that, I'm going to be looking into the details of the protocols and ports required for the eConnect 2010 / WCF communication.  Since I ran into internal firewall issues with eConnect 10 and DTC, I want to be prepared for similar situations with eConnect 2010.

I'm in the process of migrating some GP 9 eConnect integrations to GP 2010, so I'll probably be learning more, and posting more, about eConnect 2010 in the near future.


Compliments to the Dynamics GP Documentation Team

At the 2009 Dynamics GP Technical Conference in Fargo, I met a member of the Dynamics GP team that maintained the technical documentation, such as the eConnect and Visual Studio Tools help files.  (Sorry, it was a year ago and I don't remember his name...although I probably have his card somewhere...)

I told him that I used the eConnect Programmer's Guide help file constantly as a reference, so he told me that if I found any issues or had any suggestions, I should definitely use the "Documentation Feedback" link at the bottom of every page in the help file.  He assured me that they do receive the feedback, and that every submission is reviewed.

Since then, I've found a few small typos or discrepancies in the GP 10 documentation, and I've submitted feedback for those to get corrected.  And sure enough, I have received a response from someone on the documentation team for every submission.

Well, today I was finally digging into the eConnect 2010 Programmer's Guide help file, and I was thrilled to see that they had reorganized the XML Schema Reference section.  The help file now lists the Type objects in the Content pane on the left, effectively adding an additional level of navigation to the Content pane.



This seemingly small change is a huge improvement, making it much easier to quickly find Types and Nodes, whereas in the eConnect 10 file, I had to page down through the list of Types and Nodes.

I really appreciate this change, and the fact that they are obviously paying attention to the details and looking to improve the documentation.  So, I just clicked on that Documentation Feedback link at the bottom of the help file and sent them my thanks!

Give it up for the documentation team!  Yeah!  (And the crowd goes wild...)

Tuesday, September 21, 2010

Archiving Integration Source Data

Yesterday I received an e-mail from a client inquiring about an AP credit memo that did not import into Dynamics GP properly--it imported into GP as an AP invoice.  After looking into it, I confirmed the problem and then looked at my eConnect integration code to determine the cause.

It turns out that I had made a change to the code back in May during testing, and that change caused the problem that was identified this week--all credit memos were importing as invoices.  Oh boy.  So that means that the integration has been running for 5 months with the problem.  The bug wasn't detected during testing, and somehow wasn't detected in production either.

Fortunately, the bug only affected AP credit memos, and it seems the client has processed very few of those, which explains why it wasn't caught earlier.

So now that I know the cause of the problem, other than fixing my code, how do we correct the credit memos that were imported into GP as invoices?  Well, in this case, the data in GP has no additional clues to tell us that an invoice should have been a credit memo (the invoices and credit memos look the same, only the transaction type is different), so my only option was to go back to the source data.

Fortunately, whenever I develop an integration, I keep a copy of every single transaction that I import.  If the source data is in SQL, I keep a transaction log table to record all of the data I received from the source system, when the data was imported, and the corresponding GP transaction that was created.  If the source data is Excel, I write back a status field to the Excel file, and then archive the file.  And if the source data is CSV, I just archive the file.

For archiving source data files, I use a third party .NET library to create zip files, so after a file has been imported, I move the original Excel or CSV file to a monthly archive zip file.
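The monthly-zip idea can be sketched in a few lines.  This is illustrative Python using the standard zipfile module--my actual integrations use a third party .NET zip library, so the names here are mine, not the production code's:

```python
import os
import zipfile
from datetime import date

def archive_source_file(source_path, archive_dir):
    """Move an imported source file into a monthly zip archive,
    e.g. archive_dir/archive-2010-09.zip.  Appending to an existing
    zip (mode "a") lets one archive accumulate a month's files."""
    zip_name = "archive-{:%Y-%m}.zip".format(date.today())
    zip_path = os.path.join(archive_dir, zip_name)
    with zipfile.ZipFile(zip_path, "a", zipfile.ZIP_DEFLATED) as zf:
        zf.write(source_path, arcname=os.path.basename(source_path))
    os.remove(source_path)  # original is now safely archived
    return zip_path
```

Called once per file after a successful import, this leaves you with one zip per month containing every source file ever received--exactly what made the credit memo investigation below possible.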

In this particular case, I had a copy of every source CSV file that had been provided by the external system.  To find all of the credit memos, I simply unzipped all of the CSV files to a single directory, then ran this interesting DOS command (that I had to look up):

copy *.csv AllFiles.csv

This automagically combined the hundreds of source CSV files into one CSV file.  And once I opened that single CSV file in Excel, I sorted the data to identify all of the credit memos that had ever been sent to GP by the external system.
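If you prefer a scripted version of that DOS trick, the same combine step is easy in Python.  This is a generic sketch (file names and paths are hypothetical); like the `copy` command, it keeps every row from every file, including any repeated header rows, so you still sort and filter the combined output afterward:

```python
import csv
import glob

def combine_csv_files(pattern, out_path):
    """Concatenate every CSV matching `pattern` into a single file --
    a Python take on `copy *.csv AllFiles.csv`.  Rows are copied
    as-is; sorting/filtering happens later (e.g. in Excel)."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in sorted(glob.glob(pattern)):
            with open(path, newline="") as f:
                for row in csv.reader(f):
                    writer.writerow(row)
    return out_path
```

One advantage over the DOS command: `sorted()` gives a predictable file order, and it is a short step from here to filtering for just the credit memo rows in the same pass.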

In just a few minutes, I found that there were only four transactions that were affected, all of which were imported in September, and I was able to quickly document their voucher numbers so that the client could void them.  Quick and simple.  (Well, except for the embarrassment of finding the bug in my code.)

Previously, I've had situations where the vendor for the external system claimed that a file was sent when it wasn't, or that the file contained certain data when it didn't.  So this approach of keeping copies of all integration source data has been very valuable over the years.

Friday, September 17, 2010

eConnect Requester and eConnect_Out problem prevents entry of Purchase Orders

Today I tried to enter a purchase order in my client's GP 9 Test database.  As soon as I tabbed to the PO number field, GP would hang, and after a while would come back saying that "a unique po number couldn't be assigned".

I checked the POP setup window and verified that there was a next PO number set up, and even tried changing the number.  Still no luck.

I found this old newsgroup thread where the brilliant Mohammad Daoud and Mariano Gomez offered some suggestions to this problem.  After trying Mohammad's suggestion (add and remove the alpha prefix), and even Mariano's scripts to reset the next number, the error still persisted.

The next step was to pull out my good buddy SQL Profiler.  After starting a trace, I tabbed to the PO number field, and 6,000 SQL statements whizzed by in Profiler.  Fun.

Fortunately, as Mariano explains in his excellent blog post, the 6,000 statements were just GP trying to get the next document number 1,000 times. 

I looked at the different statements, and tested them, and everything looked fine--GP would get the next number from the POP setup table, make sure it wasn't in use, and then proceed to use it to create a new record in POP10100.  But then it would do the same thing again.  And again.
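To picture what the Profiler trace showed, here is an illustrative Python sketch of the retry loop GP appeared to be running.  Real PO numbers are alphanumeric with an incrementing suffix; a plain integer stands in for that here, so this is a simplification of the observed behavior, not GP's actual code:

```python
def get_next_po_number(setup_next, numbers_in_use, max_attempts=1000):
    """Illustration of the loop seen in SQL Profiler: read the next
    PO number from setup, check whether it is already in use, and
    try the following number if so -- up to 1,000 times before GP
    gives up with 'a unique PO number could not be assigned'."""
    number = setup_next
    for _ in range(max_attempts):
        if number not in numbers_in_use:
            return number
        number += 1  # real PO numbers increment an alphanumeric suffix
    raise RuntimeError("A unique PO number could not be assigned")
```

In my case every attempt "failed" not because the number was in use, but because the trial insert into POP10100 kept erroring out behind the scenes, as the next step revealed.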

So I finally took the POP10100 insert statement from SQL Profiler and tried to run it manually.  Voilà, there was my error.  It was an error on the eConnect_Out table, indicating that there was no default value for the DATE1 field.  When the POP10100 insert occurred, the eConnect Out trigger was causing the POP insert to fail.  This, along with a few other weird errors, gave me a clue that something was wrong with the eConnect setup on this test database.

I can't begin to explain the exact cause of the problem or how the Test database broke, but to resolve it, I first renamed the eConnect_Out, eConnect_Out_Setup, and eConnectOutTemp tables to eConnect_Out1, eConnect_Out_Setup1, etc.

I then ran the eConnect 9 setup to install eConnect in a new company database.  I specified the Test database in the connection string and clicked install.  The eConnect_Out tables were recreated as part of the setup process.

I then went back into eConnect Requester Setup, configured it to track Purchase Order inserts and updates, and then tried entering a new PO.

Fortunately, that did the trick and GP was able to grab the next PO number and insert a record into POP10100.

After thinking about it, one thing that concerns me is what would have happened on a GP 10 or GP 2010 install.   With GP 9, the eConnect install is separate, but with GP 10 and 2010, it's included in the GP setup, so I'm not sure how I would recreate the tables.  There may be a way, but I've never had to consider it.

Whew!  Is the day over yet?

eConnect Requester Crashes with "Run-time error 3265: Item cannot be found in the collection..."

Today I was trying to set up an eConnect Requester integration on a client's test database.  I launched eConnect Requester, changed the connection to point to the Test database, and when I clicked OK, I received this error:


Run-time error '3265':  Item cannot be found in the collection corresponding to the requested name or ordinal.

eConnect Requester works fine with the production database, and lists all of the available eConnect Out objects, but as soon as I pointed it to the test DB, it would crash.

After puzzling over it for a while and trying a few different things with no success, I went back to the Requester and looked at the connection string settings.  When the connection window opened, for some reason the Provider tab defaulted to the Microsoft OLE DB Provider for ODBC Drivers. 

I saw this at the time, and paused for a second, but I figured that the ODBC provider should work.  Well, since the Requester clearly wasn't working, I decided to try the more familiar Microsoft OLE DB Provider for SQL Server.


Once I switched the provider, and entered the SQL Server Name instead of the ODBC DSN, eConnect Requester worked fine.

Sure enough, the documentation for eConnect 9 does state that you should choose the OLE DB Provider for SQL Server.

It seems that there is a good reason for this!

Thursday, September 16, 2010

Is The New PartnerSource Search a Complete Dud?

Is there some secret that I don't know about actually finding something using the new PartnerSource Search?

In the last few days, I've tried to find two KB articles that I know exist.  I know their KB number, and I even know their title and several keywords.  But no matter how I try and search for them, they don't appear in the search results.

Even after filtering for Dynamics GP only, I get Solomon articles in my results.  And I'm getting a lot of hits on Great Plains 6.5 and c-tree.  But the two articles I searched for are nowhere to be found in the results.  

I eventually Googled each of the articles, and thankfully, there have been either forum posts or blog posts that reference the articles and provide a direct link to the KB.  And sure enough, the articles are still available, but only if you have the direct URL.

Is anyone else having problems finding anything using the new search?

Does anyone know of a way to revert to or access the old search that only covered support articles?


PS:  I realized that there is one way to access a KB article if you know the current KB number.  Using the following URL format, you can just change the KB number parameter.  I don't know if it will work for all KBs, but it seems to work for the two that I was looking for.

https://mbs2.microsoft.com/Knowledgebase/KBDisplay.aspx?scid=kb;en-us;855361
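Since the pattern is easy to mistype, here's a trivial helper that builds the link from a KB number.  Purely for illustration, and as noted above, there's no guarantee this URL pattern works for every article:

```python
def kb_url(kb_number, lang="en-us"):
    """Build a direct PartnerSource KB link from a KB number, using
    the URL pattern shown above (scid=kb;<language>;<number>)."""
    return ("https://mbs2.microsoft.com/Knowledgebase/KBDisplay.aspx"
            "?scid=kb;{0};{1}".format(lang, kb_number))
```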



PPS:   Reader Perry offered this excellent tip:  In PartnerSource, you can go to Deployment --> KnowledgeBase to access the old KnowledgeBase search feature.

The direct URL is:

https://mbs2.microsoft.com/Knowledgebase/search.aspx


Miraculously, when I type in the KB number I'm looking for, it finds it right away.  Now was that so hard?  I'm learning to appreciate the little things in life...

Identifying Orphaned SQL Server Users

Scenario 1:  Your client has a production Dynamics GP server, and decides that they want to have a separate test server for Dynamics GP.  You install SQL Server and Dynamics GP on the test server, and then you restore copies of the Dynamics database and company databases, and voilà, your test server is ready.

Scenario 2:  Your client calls and frantically tells you that their Dynamics GP SQL server has died.  Fortunately, and to your surprise, they actually had good database backups from the prior night.  You replace the failed hard drives, rebuild the server, install SQL, and restore the Dynamics and company databases.

In both of these scenarios, what is a common administrative hassle that you will likely eventually run into?

Well, when you restore a database from SQL Server A onto SQL Server B, the database users will not have corresponding SQL Server logins.

For Scenario 1, there is a very handy "Capture Logins" script provided in KB 878449 that allows you to transfer the SQL logins to a new server, saving a lot of time and hassle deleting and recreating users.

For Scenario 2, if you or the client don't regularly run the Capture Logins script (I haven't met anyone who does, although it probably isn't a bad idea), then you will have a fresh SQL Server with no logins, and databases with a bunch of orphaned users.

This isn't earth shattering news or even a big deal, just a bit of a hassle.

But while studying for the painful SQL 2008 exam, I came across a system stored procedure that identifies such orphaned users.


sp_change_users_login @Action='Report';

And contrary to what I recently wrote about the SQL 2008 exam content being largely useless to the GP crowd, I actually used this script today!  A client's SQL server had died and been rebuilt, and we were having to reconstruct a very complex, horribly designed custom integration that required numerous SQL Server logins.  While trying to figure out which logins were missing on the rebuilt server, I remembered this command and gave it a try.

Sure enough, it produced a nice clean list of orphaned users, and I saw the one that I was looking for.

Sure, you could compare the SQL logins with the database users in each DB, but this command is very simple and very easy.  And if you have security conscious clients, they might want to run this to remove any orphaned users so that they have clean databases.

And there you have it.  The first semi-usable nugget that I've gleaned from studying for the SQL exam!

Wednesday, September 15, 2010

The Software Testing Disconnect

So you've taken some exams and become "certified"? So how much better prepared are you to handle those complex client requirements that invariably come up?

I've been in conference rooms with teams of very smart people, and all the certifications on the planet wouldn't have helped us with certain basic, fundamental requirements that were both challenging, and which pushed well past the limits of certain areas of Dynamics GP.

While I am a big proponent of certification, the limitations of the exam process are still frustrating to me.

I bring this up because I'm studying for my third exam to update my GP certifications.  GP 2010 Install & Config felt pretty relevant.  There are a lot of valuable real-world topics and knowledge that aren't included in the exams, but there's only so much you can cover in 75 questions based on the instructor-led training materials.

The GP 2010 Financials exam felt, well, a lot like a GP 10 Financials exam.  I wasn't able to tell if the exam had even been updated, and it felt surprisingly easy.  It did cover all of the basics, but it seemed a little too basic. 

And now, the reason why I am writing about this topic:  The SQL Server 2008 Implementation and Maintenance (70-432) exam.  This exam, like its SQL 2005 equivalent, is a doozie.  For the average GP consultant, I think this exam is out in left field.  There are certainly sections that are very relevant to the GP world (configuration, backups, maintenance plans, security, etc.), but then there is a lot of content that is so irrelevant to the GP crowd that it's painful to waste brain capacity memorizing it just for the test.

The emphasis on enterprise features is predictable--those tend to be the fancier, cooler sounding, and newer features that allow Microsoft to compete against the big competitors.  But an 8 member SQL cluster just isn't a common occurrence for GP customers.  And honestly, if one of my clients had a server cluster for GP, I would want a dedicated SQL Server expert by my side to handle those complexities.  I don't want my family physician performing my brain surgery.

Here are a few examples of questions that make my eyes roll:

  • What is the command to extend the expiration period for a merge replication subscription?
  • What @sync_type value should you use for new peer-to-peer replication nodes?
  • When would you use an On change: prevent policy?  When would you use On demand?
  • What is the exact syntax of the BACKUP DATABASE T-SQL command?
  • When do you want to configure the cost threshold for parallelism, and what is the syntax of the T-SQL to do so?
  • What is the exact syntax of the sp_add_notification command and its parameter values?
If you know the answer to any of these, well, move to the front of the line--you can be my DBA any day.  Most of these are not applicable to GP, and for those that are, well, that is what Books Online is for.  Memorizing the syntax of obscure or rarely used T-SQL commands that can be performed in SQL Management Studio isn't really the value that I bring to my clients.

For the rest of us that focus more on GP, and the SQL Server features that are relevant to GP, many of these questions seem like a pedantic waste of time.

I understand that there is no longer a "SQL for GP" exam, and I can understand the desire to point everyone to a single, uniform SQL exam track.  But the old SQL exam for GP consultants did a great job of covering the topics that were relevant to GP folks without battering us with features that either aren't used, or can't be used, in GP environments.

Studying for the SQL 2005 exam, I did learn about some SQL Server features that I never knew existed, but since that last exam, there have been exactly zero times that I have needed to know about those features.  Kind of interesting, but completely irrelevant.

This is an age-old question that has been debated for years, but having to study for the SQL exam brought it back up in a very specific way.

For those other poor souls out there also having to take the SQL 2008 exam, I feel your pain...

Jealous Analytical Accounting

While doing some testing with Purchase Orders that have Analytical Accounting codes, this message popped up when I tried to save an existing PO.

Uhhh, seems like AA is a little insecure.  "even once"?  Is that like "you don't call or write anymore"?  Someone apparently needs some attention...

Tuesday, September 14, 2010

Removing Dynamics GP companies that no longer exist

This is a common problem and easy to fix, but I had a very hard time finding the KB article and corresponding SQL script on PartnerSource, so I'm posting this for my own reference so that I don't have to dig around using the often unreliable, horrible PartnerSource search.

I had a call today with a client that has over 120 GP company databases.  Because there are so many databases, and so much data, they don't maintain all of the databases in their multiple test environments.

In the test environment we were working in, there were 22 company databases set up, but naturally GP still thinks there are 120, and displays them all in the company listing during login.  Normally that wouldn't be an issue, but we were working with a 3rd party product that scanned all databases, and if a database wasn't present, an error would occur.

So this scenario is essentially the same as if a GP company database had been deleted through SQL Server Management Studio, instead of through GP.  The records for the company database exist in the GP tables, but the corresponding master..sysdatabases records and physical databases "no longer" exist.
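As a quick sanity check before running the KB script, a query along these lines will list the leftovers — this is a sketch that assumes the standard DYNAMICS system database name and the SY01500 Company Master table:

```sql
-- GP company records whose SQL database no longer exists on this server
SELECT c.INTERID, c.CMPNYNAM
FROM DYNAMICS..SY01500 AS c
LEFT JOIN master.sys.databases AS d
    ON d.name = c.INTERID
WHERE d.name IS NULL;
```

Any rows returned are companies GP will still offer at login even though the database behind them is gone.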

The KB article is titled "Deleting a company in Microsoft Dynamics GP", and is supposedly KB Article 855361.  At least that's what it says on the version I found--there might be more than one version, as the client had the same script, but with a different KB number in his notes.

After more searching, I found this article on Customer Source, and this apparently identical article on PartnerSource that appears to reference the same script.  But I can't find a KB Article number.  Here is a File Exchange link to download the SQL script referenced in these two articles.

Despite my repeated attempts to search for 855361, KB855361, and even "Deleting a company", I couldn't seem to find the article through the PartnerSource search.  Fortunately a Google search turned up a link to the article, and I was able to access it directly via the URL linked above.

The Basics: Deductions in Arrears

New functionality often comes out, and I find it is ignored, or users think it's "too complicated" to spend time or money on configuring. So I thought I would break down one of my favorites in recent years: Deductions in Arrears (DIA). This was part of Payroll Extensions in GP 10.0, and it makes life so much easier for payroll folks who have to deal with part time or seasonal employees who do not earn enough to cover their deductions.

DIA will track when a deduction is not taken completely due to a shortage of wages. It will then attempt to collect the deduction until the obligation is met in subsequent payrolls. You can manage the arrears, and make manual adjustments (for example, if a portion of the arrears is forgiven).

First, to view the DIA options, you must grant access to the alternate HRM Solution Series windows for Payroll using Microsoft Dynamics GP>>Setup>>System>>Alternate/Modified Forms and Reports. These steps are covered in detail in the Payroll Extensions user guide.

In the spirit of keeping it simple, here are the fields available in Deduction Setup for DIA (Microsoft Dynamics GP>>Setup>>Payroll>>Deductions):
  • Allow Arrears: Mark if arrears transactions should be tracked for this deduction. For example, DIA might be used for health insurance deductions but not for voluntary United Way contributions.
  • Mandatory Deductions: These work in tandem with Transactions>>Payroll>>Mandatory Arrears, to create arrears transactions for an employee who was not paid in a payroll. For example, a part time employee has no hours to be paid but still needs to pay their health insurance premium. You can use mandatory arrears to create the arrears transaction for the employee.
  • Collect When Possible: Mark this if you want the arrears to be collected even if the deduction is not included in the payroll build. For example, if an employee has an arrears transaction for health insurance, should it be collected during a bonus payroll which does not include a health insurance deduction?

Although all of these options are available in deduction setup, they can also be modified per employee in Cards>>Payroll>>Deduction.

So, during a payroll, arrears transactions will automatically be created for deductions that are marked to allow arrears and cannot be fully collected from the employee during the pay run. You can then view the arrears transactions using Cards>>Payroll>>Arrear Transactions. By employee, you can view the number of arrears pending including the original, collected, and remaining amounts. There is also an adjusted column, which tracks any manual increases or decreases to the arrears amount. You can use the Trx # expansion to view detailed transaction information for the selected arrears, including amounts collected and dates.

Those are the basics. In most cases, it takes less than 15 minutes to configure DIA and it saves a tremendous amount of time and manual tracking. Plus, it increases the odds that correct amounts are collected from employees as quickly as possible!

Friday, September 3, 2010

Errors from Dynamics GP Local Temp Tables

This week I received an e-mail from a client regarding an error they received when posting SOP Invoice batches.

A hundred batches a day post fine, but recently, two batches have failed with the following errors:

An open operation on table 'IV_Cost_Variance_TEMP' failed because the path does not exist.

An open operation on table 'IV_Cost_Variance_TEMP' has an incorrect record length.

My first thought was that maybe there is an issue with a third party module or GP version where GP is somehow trying to write data to a temp table that has a slightly different schema.  But this is a vanilla GP install with the proper service pack.

So today I Googled the error, and I found some puzzling blog articles that mentioned C-Tree files.  When I saw dates of 2004, I thought okay, those are just old.  But then I saw more from 2009 and 2010, still mentioning C-Tree files.

In a comment posted in 2010 to one of his 2009 blog posts, David Musgrave directs someone with a similar error to these two Microsoft blog posts:

http://blogs.msdn.com/b/developingfordynamicsgp/archive/2009/08/10/an-open-operation-on-table-temp-table-errors.aspx

http://blogs.msdn.com/b/developingfordynamicsgp/archive/2010/04/21/unexplained-temp-table-errors.aspx

As David explains in the first post, "Not all temporary tables used by Microsoft Dynamics GP are SQL tables created in the tempdb system database.  Many temporary tables used for quick processing in windows use local ctree temporary tables."

Say what???  This may be common knowledge for a lot of folks, but I naively thought that GP had been fully transitioned over to SQL Server.  Apparently there are still some vestigial pieces of code that rely on local file system temp tables.  I would have thought that the MCP at Microsoft would have squashed those rogue programs, but apparently they are still riding their Light Cycles around in the GP code.  Naturally, there are probably some performance benefits to being able to perform temp table operations locally, but I am hoping that there are plans to wean GP off of the C-Tree implementation of those tables using the file system.

As the second article by Allan Cahill explains, they found that a local anti-virus application was taking just enough time to scan the temp files that it was locking them, preventing GP from deleting them.  Excluding those files from the scan seemed to resolve the issue.

I have forwarded this information to my client, and I don't yet know whether this is the cause of my client's errors, but I found this pretty interesting.

So keep those Temp folders nice and clean!  (feels like advice for a Windows 3.1 user...)

Speed Certification Exams!

Things have been very busy lately, so I apologize for my absence (in case there was actually someone out there that noticed).

My in-laws arrived from China in July, and my wife delivered our second daughter, Sarah, on August 9th.  Let's just say that those things, on top of an ever growing backlog of work have kept me more than busy lately.

So as many of you know, Microsoft has made some changes to the partner program and partner certification requirements, and they have also released the Dynamics GP 2010 certification exams.  So this month I have to take at least 3 exams to get current on the latest versions:  GP 2010 Install & Config, GP 2010 Financials, and SQL 2008.

This morning while driving to the exam center, I happened to be on a call with my blog-mate Christina, and she mentioned that when she takes certification exams she has taken several times before, she tries to complete them as fast as she can.  Apparently this is how a long time GP guru spices up her workday?

So, finding that an interesting challenge, I thought I would give it a try.  Unlike Christina, who can ace the exams in her sleep without studying, I actually studied a few hours for the Install & Config exam since there is a fair amount of content covering features that I rarely use.  Since I was pretty confident I wouldn't fail the exam, I figured that a race against the clock might be an interesting challenge.

I sat down at the exam computer at 9:46am and tried to get past the annoying 97 pages of legal babble they make you click through before starting the actual exam.  I then jumped right in, discarding the "distractors" and finding the valid answers.  Most of the questions I knew instantly, but a few made me stop and think to try and figure out the correct answer, which made me self-conscious, as I could see the clock on the wall ticking away.

The questions and answers whizzed by, and in no time, I was on question 75.  Whatever that question was (something in the "configuring reporting" section), I remember that it was one where I had to stop and try and figure it out.  So close, yet thwarted by a final question.  As soon as I got through it, I glanced at the review screen to finish up one question I had apparently missed (I had only checked 1 of the 2 required answers), pushed through the "I'm Done!!!" screens, and glanced up at the clock:  10:12am.

So that was 75 questions in 26 minutes, or an average of 20.8 seconds per question.  Looking back at it, I probably could have done it faster without affecting my score much, but as it was my first Speed Exam session, I was a bit nervous that going too fast would result in stupid mistakes.

I ended up with a 93%, so I passed with flying colors (or as someone I know would say:  "You studied way too much, or you took way too long, since you only need XX% to pass.").  I do wonder which 5 questions I got wrong though, something that always annoys me with the certification exams--they provide a "score" for each section, but even if I reviewed the section materials afterward, since I don't remember the specific questions (they are all a complete blur when you do them quickly), it's difficult to know which ones I answered incorrectly.

Anyway, if any of you are taking your Dynamics GP Certification Exams, let me know how you did.  And if you are a long time certified Great Plains / Dynamics GP consultant and have taken the exams many times before, give the Speed Certification game a try and let us know your time and your score!