Wednesday, August 26, 2015

Revenue Accruals in Project Accounting

The conversation about revenue recognition in project accounting for Microsoft Dynamics GP often goes like this...

"How do you do revenue recognition?"

"Oh, we don't.  Just hit revenue when we bill the expenses and fees."

"Oh, okay, so no need for revenue recognition?"

"No.  But we need to accrue at month end for unbilled."

Or, sometimes, it devolves into a conversation about how to use the revenue recognition module to handle the accrual.  Which is neither fun, nor simple. Particularly because GP revenue recognition effectively calculates one of three ways...

1. Based on contract/project/cost category progress (either by cost, or by hours)
2. When the project is complete
3. Over time when dealing with a service fee (based on duration of service fee)




But many folks don't realize that GP Project Accounting has built-in functionality to deal with accruing unbilled revenue (and therefore the offset, unbilled AR, as well).




In this case, you would use a Time and Materials project, but with the When Performed accounting method.  The When Performed accounting method generates a credit to Unbilled Project Revenue and a debit to Unbilled Accounts Receivable when a cost is posted to the project.  The amounts posted to these accounts are determined by the budgeted profit type and project amount/percent.  Then, when the expenses are billed, the amounts move from unbilled to actual revenue and AR.  At any point, the unbilled AR and unbilled revenue balances represent your accrual.
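As a simplified sketch (the amounts and account names here are hypothetical, and the actual billing amount depends on the budgeted profit type), a $1,000 cost that bills at $1,500 would flow like this:

```
When the cost posts to the project:
    DR  Unbilled Accounts Receivable      1,500
        CR  Unbilled Project Revenue          1,500

When the expense is billed:
    DR  Accounts Receivable               1,500
        CR  Unbilled Accounts Receivable      1,500
    DR  Unbilled Project Revenue          1,500
        CR  Project Revenue                   1,500
```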




Of course, this works great for expenses, but what about fees?  And the need to accrue them?  You can use this same logic, but instead of using a fee, we use a zero cost miscellaneous log.  You set up the miscellaneous log cost category with no cost, a quantity equal to the total amount of the fee (if known), a profit type of Billing Rate, and a billing rate of $1.  Yes, $1.  Then, at month end, you can enter a miscellaneous log with zero cost, where the quantity is equal to the amount of the fee you want to accrue.  It will then calculate the accrued revenue and accrued AR for the GL posting for you.  This is particularly helpful when the accrual of the fee is not predetermined, but rather calculated based on criteria outside of the system (project milestones, etc.).
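To make the setup concrete, here is a hypothetical example with a $10,000 total fee, of which $2,500 is being accrued this month:

```
Miscellaneous log cost category setup:
    Cost           $0
    Quantity       10,000   (total fee amount, if known)
    Profit type    Billing Rate
    Billing rate   $1

Month-end entry:
    Miscellaneous log with zero cost, quantity = 2,500
    Accrued revenue (and accrued AR) = 2,500 x $1 = $2,500
```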



The neat part of handling the fees as zero cost miscellaneous logs is that they then go into the WIP queue for billing at the amount you want to bill.


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.



Saturday, August 22, 2015

Using the Dynamics GP PM00400 table for Payables Imports

By Steve Endow

I'm developing two different custom Dynamics GP Payables eConnect integrations for two different customers.  One of the shared requirements is to validate the data for each Payables transaction before it is imported.

The import will confirm, among other things, that the vendor exists in GP, the GL distribution accounts are valid, the amounts are valid, and lastly, that the transaction does not already exist in GP.  So if Invoice 12345 for vendor ACME is already present in GP, the import should flag the record as a duplicate and skip it.

To check for a duplicate PM transaction, you could check the PM transaction tables directly.  While definitely possible, the downside to this approach is that an invoice or credit memo may be in one of three different tables:  Work, Open, and History (PM10000, PM20000, and PM30200).

All three of the tables could be queried with a single SQL statement, but there is an alternative.  Instead of querying those three tables, you can query the PM00400 "PM Key Master" table instead.

PM00400 should contain every voucher number and vendor document number processed by the Payables module, so you should be able to query it to determine whether a transaction already exists, and the table should tell you the document status (work, open, or history).
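A minimal sketch of that duplicate check from a .NET integration follows (the table and field names are as described above; the connection string and the vendor/document parameters are placeholders for illustration):

```csharp
using System.Data.SqlClient;

public static class PmDuplicateCheck
{
    // Returns true if the vendor/document number pair already exists in
    // the PM00400 PM Key Master table (work, open, or history).
    public static bool DocumentExists(string connectionString, string vendorId, string documentNumber)
    {
        const string sql =
            "SELECT COUNT(*) FROM PM00400 " +
            "WHERE VENDORID = @vendorId AND DOCNUMBR = @docNumber";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@vendorId", vendorId);
            cmd.Parameters.AddWithValue("@docNumber", documentNumber);
            conn.Open();
            return (int)cmd.ExecuteScalar() > 0;
        }
    }
}
```

If you also want to report where the document sits, you could select the CDSTATUS field instead of a count.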


Above is a sample transaction, with each row being a snapshot of the data in PM00400 as the invoice moved from work to open to history.  Note that the CDSTATUS field changes from 1 to 2 to 3, and the TRXSOURCE value is populated once the transaction is posted.

So PM00400 can be a handy option to check whether a Payables transaction already exists in GP, and a quick way to verify the status of the document.

The Receivables module has a similar "RM Key" table, RM00401.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Wednesday, August 12, 2015

Adding a Visual Studio snippet for a C# "public string" property

By Steve Endow

When I am developing in Visual Studio and adding properties to a class, by far the most common property I add is a string type property.  When developing for GP, over 90% of my properties are strings.

While I have been using the "prop" snippet to automatically type out most of the property declaration, when you have a few dozen properties, it gets tedious and repetitive having to specify "string" over and over.


Typing "prop" and pressing tab twice will automatically create a property line ("automatically implemented property").

But, the default data type is "int", so I usually have to press TAB and then type "string", then press TAB again to name the property.

So my typing laziness finally overcame my research laziness and I actually looked up how to create a custom snippet in Visual Studio.  It's pretty easy.  Should have done it years ago.

The location of the snippets may vary by Visual Studio version and other factors, but on my machine, the snippet files are located at:

C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC#\Snippets\1033\Visual C#

I located the "prop.snippet" file, copied it to my desktop, and edited it.

Here is the original prop.snippet.


<?xml version="1.0" encoding="utf-8" ?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>prop</Title>
      <Shortcut>prop</Shortcut>
      <Description>Code snippet for an automatically implemented property
Language Version: C# 3.0 or higher</Description>
      <Author>Microsoft Corporation</Author>
      <SnippetTypes>
        <SnippetType>Expansion</SnippetType>
      </SnippetTypes>
    </Header>
    <Snippet>
      <Declarations>
        <Literal>
          <ID>type</ID>
          <ToolTip>Property type</ToolTip>
          <Default>int</Default>
        </Literal>
        <Literal>
          <ID>property</ID>
          <ToolTip>Property name</ToolTip>
          <Default>MyProperty</Default>
        </Literal>
      </Declarations>
      <Code Language="csharp"><![CDATA[public $type$ $property$ { get; set; }$end$]]></Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>




I edited a handful of items: the Title and Shortcut (prop to props), the Author, and the Code element, where I hard-coded string in place of the $type$ placeholder.


<?xml version="1.0" encoding="utf-8" ?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>props</Title>
      <Shortcut>props</Shortcut>
      <Description>Code snippet for an automatically implemented property
Language Version: C# 3.0 or higher</Description>
      <Author>Steve Endow</Author>
      <SnippetTypes>
        <SnippetType>Expansion</SnippetType>
      </SnippetTypes>
    </Header>
    <Snippet>
      <Declarations>
        <Literal>
          <ID>type</ID>
          <ToolTip>Property type</ToolTip>
          <Default>int</Default>
        </Literal>
        <Literal>
          <ID>property</ID>
          <ToolTip>Property name</ToolTip>
          <Default>MyProperty</Default>
        </Literal>
      </Declarations>
      <Code Language="csharp"><![CDATA[public string $property$ { get; set; }$end$]]></Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>


Once you have your new snippet, you can go to Tools -> Code Snippets Manager -> Import.  One recommendation when you import--only select one category.  For example, either use only My Snippets, or select only C#, etc., otherwise Intellisense will detect multiple snippets with the same name.


UPDATE: While trying to perform this process on a new Server 2012 machine, I was only offered the Location choices of "My Code Snippets" and "ASP.NET MVC 4". I was unable to choose "Visual C#" like on my other server.  I figured out that I had to launch Visual Studio 2012 using Run As Administrator.  Once I did that, I saw the entire list of Location options.



With my new "props" snippet imported, I save a press of the TAB key and don't have to type "string" every time.  The word "string" is now hard coded in the output and it jumps straight to the property name.


If you code properties, I recommend looking into customizing your snippets.  And I'm sure there are tons of other cool things you can do with snippets, but for now, my typing laziness is content.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




SQL Server name limitation with GPConnNet and VS Tools

By Steve Endow

I previously wrote about a situation where the Dynamics GP GPConnNet library is unable to connect to a SQL Server instance if a port number must be specified.

This week I encountered a new limitation with GPConnNet and VS Tools.  A customer has been successfully using a Dynamics GP AddIn for several years, and they are now upgrading to GP 2015.  When they tried to test my AddIn on their GP 2015 test server, they received this error.


The error message says GP login failed.  But since this is a VS Tools AddIn that runs inside of GP after a user has logged in, that message doesn't make much sense.  We know that the username and password are correct.  Very odd.

We then noticed that the server name was incomplete.  The final "T" in the name was missing, and the value displayed is exactly 15 characters--more than a coincidence.  So it looks like the 16 character server name is being truncated to 15 characters, and that is likely the cause of the problem.

But wait!  If the server name is being truncated, then the server name would be incorrect.  And when the AddIn attempted to connect to that non-existent server to authenticate, the connection attempt would fail, right?  The error message would be different for a connection failure.

So back to the original error message.  It says "GP login failed", not "failed to connect" or something similar.  So this would seem to tell us that the connection was successful, but that the login subsequently failed.

What in the world?

But it gets better.

If the customer logs in to Dynamics GP using the 'sa' login, the AddIn works and does not give the "GP login failed" message.


So the sa account works, but GP users don't work.  What does that tell us?  In theory, it is a confirmation that the AddIn connection process is working, but that there is something about the GP logins that is failing.

So why would sa work, but not a GP user login?

My guess is Dynamics GP password encryption.

When you create a new user in Dynamics GP, it "encrypts" the password before sending it to SQL Server.  This prevents a user from connecting directly to the SQL Server.

My guess is that GPConnNet uses the SQL Server name in the "encryption" process, but it is truncating the server name at 15 characters for some reason, and that is the cause of this issue.  Presumably Dynamics GP does not do this, since my client is able to login to GP just fine.

So how do you work around this issue?

The best option is to make sure that your SQL Server instance names are no more than 15 characters.

The only other option I was able to come up with was to have the client create a shorter SQL Server Alias.  I then had to hard-code that shorter alias name in my AddIn.  Once I hard coded the shorter alias for the server name, the AddIn worked fine.

Why hard code, you ask?

Well, VS Tools uses the Dynamics GP Backup / Restore form to get the name of the SQL Server.  Even if the Dynamics GP ODBC DSN is set to use a short alias name, the Backup / Restore window will return the actual SQL Server name.  So even after the alias was set up and the GP ODBC DSN was using it, my AddIn was still receiving a SQL Server name of MCCGP15DB01-TEST, and the login would still fail.  Fortunately, they only have this issue with their Test database server--their GP 2015 production SQL Server has a shorter name.

So, like I said, just make sure your SQL Server instance names are 15 characters or less if you are using GPConnNet.
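Since the failure mode is so confusing, a defensive check in the AddIn can at least surface a clearer message before the connection attempt.  This is a hypothetical guard of my own, not part of GPConnNet:

```csharp
// Hypothetical guard (not part of GPConnNet): warn before attempting the
// connection if the server name exceeds the 15 characters that survive
// the truncation described above.
private static void WarnIfServerNameTooLong(string sqlServerName)
{
    if (sqlServerName.Length > 15)
    {
        System.Diagnostics.Trace.TraceWarning(
            "SQL Server name '{0}' is {1} characters; GPConnNet appears to " +
            "truncate names longer than 15 characters, which can cause " +
            "'GP login failed' errors.",
            sqlServerName, sqlServerName.Length);
    }
}

// Example: "MCCGP15DB01-TEST" is 16 characters and would trigger the warning.
```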


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Wednesday, July 22, 2015

Dynamics GP VS Tools Reference Quirk: Microsoft.Dexterity.Bridge

By Steve Endow

I'm developing a Dynamics GP VS Tools AddIn and noticed an odd behavior.

I was trying to access the Dynamics Globals object properties to get the current company ID and system database name and store it in my Model object.

Controller.Instance.Model.GPCompanyDatabase = Dynamics.Globals.IntercompanyId;
Controller.Instance.Model.GPSystemDatabase = Dynamics.Globals.SystemDatabaseName;

I have these two lines in another project, so I copied them to my current project.


Intellisense recognized Dynamics.Globals and let me choose the two properties, but I was getting an error about type conversion to a string.

Since I have used these exact lines previously, I suspected something wasn't right with my current project.

I had a reference to Application.Dynamics, and I had this using statement:

using Microsoft.Dexterity.Applications;

Since Dynamics.Globals was being picked up by Intellisense, it seemed like my references were okay, but obviously something wasn't quite right.

Another odd thing I noticed was that if I typed a period after SystemDatabaseName or IntercompanyId, I wasn't getting an Intellisense pop up of options.


So something was wrong--clearly Visual Studio wasn't able to determine the data types for those properties.  I was able to use Convert.ToString to bypass the error, but it bugged me.  It seemed like there was some type of issue with my Application.Dynamics reference.

After checking my other code and trying various things, I finally stumbled across the solution.

I needed to add a reference to Microsoft.Dexterity.Bridge.


Once I added the Bridge reference, Intellisense stopped complaining about the type conversion, and I was able to get Intellisense values for SystemDatabaseName and IntercompanyId.


Only after looking at it again today did I realize that a big clue was staring right at me.


The error was indicating that it was a Dexterity Bridge data type, but I didn't think to look at that detail, and probably only in hindsight was this clue helpful.  But it explains why Bridge is required, and now I know to reference both libraries!

Happy bug hunting!


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Friday, June 5, 2015

Mekorma MICR, Where Did My Stubs Go?

Another interesting case this past week related to Mekorma MICR checks.  A lot of our clients use this ISV solution, which works great (admittedly, we often sell it for the benefits beyond the MICR line, in some cases to clients who don't even want to print the MICR line).

This particular case involved a version change: we were applying the latest service pack for GP 2013 to a client, which would move them to GP 2013 R2.  With this came a change in how Mekorma MICR stores the path to the stubs library.  In the original versions, paths were specific to a workstation/install.  But in GP 2013 R2 and later, the path is a global setting (which is definitely a good thing).

In this case, the client had two machines: a SQL server and a terminal server.  And there was a GP shared location where all of the normal reports and forms dictionaries, amongst other shared resources, were stored.  The update was applied to the SQL server first, whose stubs library was pointed to the GP share.

The issue came after we updated the terminal server and found that our modified stubs were missing!  As it turns out, the terminal server was actually pointed to a local stubs library.  So when the global path was set to the shared location (where there were stubs, since the SQL server was pointed there previously), the terminal server was no longer seeing the previously modified stubs.

Fortunately, it was an easy fix: we located the stubs on the terminal server and copied them over to the share.  But it is an important point to note, especially if you have numerous workstation installs.  We are careful to check for local reports and forms dictionaries, and we are now going to check for local stubs libraries as well when applicable.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Interrupting Printing Edit Lists

Once is a fluke. Twice makes you think about it.  Last month we had a client who had a large batch of sales transactions (think 1,000s of transactions).  They printed an edit list, and the system locked up.  So what did they do?  Something they thought would be benign...they used Ctrl-Alt-Delete to end their GP session.  And here is where it goes awry...


When they logged back in to GP, it told them that there was a batch in batch recovery. Okay.  Fine.  So they go to recover the batch, and are MORTIFIED when it goes ahead and POSTS THE BATCH.  Before they were ready, before they had completed some additional steps to their own process.  Ugh.  Double Ugh.  Triple Ugh.


So we worked through restoring a backup, and I admittedly did not think too much about it other than it being an odd fluke.  Until it happened again, to a different client.  And then I got to thinking how it makes PERFECT SENSE.  Yes, perfect sense.


Printing an edit list uses the same report as printing a posting journal, so presumably it changes the batch's status when printing.  So it would make sense that if that process is interrupted, the status in the SY00500 table for the batch might indicate that it was indeed in the posting process.  Batch recovery would then take that information and act on it.
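If you want to see what state the interrupted batch was left in before deciding how to recover, you can look at the batch record directly (the field names come from the SY00500 Posting Definitions Master table; the batch number here is a placeholder):

```sql
-- Check the status flags on the interrupted batch
SELECT BACHNUMB, BCHSOURC, BCHSTTUS, MKDTOPST, NUMOFTRX
FROM SY00500
WHERE BACHNUMB = 'YOURBATCH'
```

A nonzero BCHSTTUS generally indicates that the batch is marked as being in some stage of the posting process.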


So what to do?  Rather than use batch recovery in this instance, I would recommend using the batch stuck scripts available in this KB article:


https://support.microsoft.com/en-us/kb/850289


Make sure to use the "Let Me Fix It Myself" option of actually running the scripts to reset the batch status and make it available for continued review/editing.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a senior managing consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.