Monday, December 29, 2014

Dynamics GP DPS: Error initializing Process Server

By Steve Endow

I know that very few customers use the Dexterity Process Server (also known as the Distributed Process Server--let's just call it DPS), but I thought I would document this in case some other lucky person runs into this issue.

When I launch DPS.exe and open a Dynamics.set file, I get the message "Error initializing Process Server".


DPS then crashes with this "BEX" error.


I tried rebooting the server, but no luck.

After scratching my head for several minutes, I was almost ready to give up and try reinstalling GP when I thought of one last thing.

The server I was using has User Account Control (UAC) enabled at the default value.  So I tried launching DPS.exe using Run As Administrator.


Fortunately, that resolved the error!

I have seen so many strange errors and unexplained symptoms that were ultimately resolved by using Run As Administrator or disabling User Account Control.

While I understand the concept of UAC, the implementation and side effects are far from ideal.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter.




Monday, December 22, 2014

Why I replace my computers every 3-4 years, whether they need it or not

By Steve Endow

In August, my trusty old file server started acting flaky.  Given the symptoms (losing BIOS settings after reboot), I assumed the motherboard was dying.  Since the computer was several years old and had served several roles, I wasn't surprised it started to have issues.  Rather than bothering to diagnose the problem, I decided to retire the computer and replace it with a Synology DS1813+.  That turned out to be a very good decision.  The Synology has been fantastic: it has many more features than the old Windows file server, it's incredibly fast, and its maintenance is virtually zero, with automatic software updates and reliable backups to external drives.

Fast forward to Thanksgiving 2014.  While having a family dinner at my grandparents' house, my grandfather's ancient Dell Optiplex (purchased in 2008!) decides to die.  After reviewing the symptoms, my guess is that the power supply died, with a possible side dish of a bad BIOS battery.  So I get the "honor" of bringing the computer home with me to repair, along with Thanksgiving leftovers.  Lucky me.

Having been down this road before, I happen to have a power supply tester.  Sure enough, the PSU on the Dell was dead and would only blink power for a fraction of a second before dying.  I cannibalize the PSU from my old file server, which tests out fine, and install it in the Dell.  That gets the Dell to boot up, but it fails to boot into Windows--a blue screen flashes for a millisecond, and then the machine immediately reboots.  The bad power supply apparently caused an issue with the drive, and no matter what I tried, I couldn't get Windows XP to reinstall or repair or boot.  So I have to install Windows 7 on a new drive.  Fortunately I'm able to read the drive and recover all of the data, but at that point I had wasted several hours trying to get the old drive and Windows XP to work.



Since I had pulled the replacement power supply out of my old file server, I started to look at the innards of that older machine, which was having problems booting up and losing its BIOS settings.

I happened to glance at the BIOS battery on the motherboard and noticed something strange.


Sure enough, there was corrosion on the BIOS battery!  The battery was completely dead, which would explain why the machine would lose its settings with every reboot.  Apparently the motherboard does not have a feature to detect a bad BIOS battery.

So a $2 battery caused my file server to become unreliable.  The machine was pretty old, so it was due for replacement, and this is a great demonstration of why old computers are often not worth the potential risk and hassle.

My current Hyper-V server is 3 years old now and is running fine, but I have several mechanical drives in that machine that are over 6 years old.  I'm now replacing those mechanical drives with Samsung 850 Pro solid state drives, since I don't want to have to deal with a drive failure.

And I suspect that I will be replacing that Hyper-V server next year when it turns 4, just to avoid potential issues.









Friday, December 19, 2014

Versioning your Dynamics GP Customizations and Software

By Steve Endow

Earlier this year I worked with a third party product for Dynamics GP.  I believe it is all .NET, so it had a few DLLs in the GP AddIns folder, and a few additional DLLs that were used by an external application independent of GP.

While testing, we identified a few bugs.  While communicating with the ISV, we were asked which version we had installed.  I checked the DLLs and found the following version number.

10.0.0.0

Hmm, that's a little unusual, but okay, I told them we were using version 10.

Their reply was not THAT version number, but the date time stamp on the DLL files that we had installed.

Say what?

They asked us to ignore the actual version number on their software and report the date and time that was showing for their DLL files.  For their commercial software product.  That they sell.  As a software development firm.

Let's just say I was surprised.  Any developer that has used Visual Studio should be familiar with the window that lets you set and increment version numbers.  And having developed, installed, and used hundreds and hundreds of software applications, I thought that software versioning was a no brainer and one of the most basic organizational practices that a developer should be using.

To this day, that ISV still refuses to use version numbers on their software.  Dozens of releases and updates and bug fixes and changes later, their files are steadfastly stamped with version 10.0.0.0.  We never know what "version" we are using, and every time we have a support case with them, they ask us what time stamp we have, and their only solution is to try and send us files with a newer time stamp.  I wish I were kidding, because it's a confusing mess to deal with updates from them.

So that's a great lesson in how not to handle versioning for your Dynamics GP customizations, integrations, VS Tools AddIns, and software.

So how should you version your Dynamics GP customizations or software?

First, let's agree that you should definitely version your software.  Any method is better than no method.

As for version numbering, different people and organizations have different practices, but here's the system I've adopted.

I typically use a format of "1.23".  Even though Visual Studio has 4 separate values, I combine values 2 and 3 to make the number more readable, and don't typically use the 4th value.   I like 1.23 much better than 1.2.3, but as you'll see shortly, this has a potential downside.

I start with a version 1.00.  If I have some minor bug fixes, I'll increment to 1.01, 1.02, 1.03, etc.

If I add a new feature or fundamentally change some functionality, I'll increment the second digit, such as 1.10, 1.20.

As the version numbers increase, I personally try and avoid going above 9 for the second and third digit.  I did it once, and it ended up being quite confusing.  Since I combine the 2nd and 3rd version segments, how do I represent version 1.1.0 vs. 1.10.0?  If I use 1.10, do I use 1.100 for version 10?  It doesn't work well with my typical version numbering.

When the customer upgrades to a new version of GP, I'll typically increase the major version number, so I'll go from 1.23 to a fresh 2.00.

However, in cases where I am developing software for use with multiple GP versions, like the software that I sell, I now include the GP version in my software version number.  So I'll have 10.1.23, 11.1.23, and 12.1.23.  That way I can know that the customer has the correct version for their version of GP, and during support calls, I can see my version number and know which version of GP the customer is using without having to ask or look it up.

In a new twist, last week I released a web service that integrates with GP.  As the customer tested, they wanted rapid bug fixes, so I would fix a bug or two and push a new release multiple times in a single day.  My version numbers were quickly escalating, so I enlisted the fourth version number in Visual Studio.  So for minor updates or bug fixes, my version numbers would increment like 2.23.1, 2.23.2, etc.  It has worked well so far and hasn't been inconvenient or confusing.
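For reference, these version values live in AssemblyInfo.cs (or the Assembly Information dialog in the Visual Studio project properties).  The numbers below are just illustrative:

```csharp
// AssemblyInfo.cs -- the version numbers here are illustrative only
using System.Reflection;

// GP 2013 (12), my version 1.2x, bug fix release 5
[assembly: AssemblyVersion("12.1.2.5")]
[assembly: AssemblyFileVersion("12.1.2.5")]  // keep the file version in sync
```

Incrementing these two attributes with each release is all it takes to keep the DLL properties honest.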

But even with these frequent updates, I always increment the version number.  It keeps things organized, avoids confusion, and every customer and consultant I have worked with understands the concept of version numbers, so the practice makes communication about updates and new versions easier.

For customizations or integrations with a visible window, I always display the version number on the form.  Never make your customer search for a version number when they need support or when they need to check that they are on the latest version.  Just always add it to at least one of your forms or to Help -> About--it takes 2 minutes and it will make life easier for everyone.


In this case, the form is displaying 12.1.2.5 because that is the actual version value from the .NET application, but I would refer to it as version 1.25 for GP 2013.
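One low-effort way to display that value is to read it straight from the compiled assembly, so the label can never drift out of sync with the actual build.  A minimal sketch (the class name is mine, and the label name in the comment is hypothetical):

```csharp
using System.Reflection;

public static class VersionInfo
{
    // Returns the version compiled into the DLL as a display string like "12.1.2"
    public static string DisplayVersion()
    {
        System.Version v = Assembly.GetExecutingAssembly().GetName().Version;
        return v.ToString(3);  // major.minor.build
    }
}

// In the form's Load event (label name is hypothetical):
//   lblVersion.Text = "Version " + VersionInfo.DisplayVersion();
```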

For Modifier and VBA customizations, I have sometimes added the version to the GP windows.


So far I've accomplished this with a simple text label in Modifier that I have to manually change with each release. It isn't fancy, but it's simple and easy.

Whatever you do, please, please, please do not rely on file date time stamps to version your software.




eConnect mystery: Imported PM Manual Payments suddenly have no distributions

By Steve Endow

I have a Dynamics GP customer that uses my AP Payment and Apply Import.  It has been working fine for over two months, but this week the customer suddenly noticed a problem.

The PM Manual Payment transactions would import without error, but the transactions didn't have any distributions.  When they viewed the transaction distributions in GP, the distribution scrolling window was completely blank.

Hmmm.

The import doesn't explicitly assign distributions to the manual payments--it lets eConnect default the cash and payables distributions--so the missing distributions aren't a bug in the import.

The GP partner said that the only recent event they could think of was that the company database was restored.  That alone shouldn't cause any issues, and I couldn't think of how it might relate to the issue, but it's a potential clue.

We reviewed the default GL accounts for the vendor, checkbook, and company, and the default accounts were all present in GP.  The partner also checked the eConnect event log, but there were no errors or clues related to the missing distributions.

As a test, the partner entered a manual payment into Dynamics GP to mimic the imported transaction for the same vendor and checkbook, and that transaction was fine.  The distributions defaulted properly.

They then tested the import on another company.  The imported payments were fine--they had distributions.

We then turned off the eConnect service and ran the import.  This forces an error to occur and has the import display the resulting error message, which includes the full XML document that was sent to eConnect.

The XML looked just fine, and was identical to the values I got when I tested successfully on my TWO / Fabrikam environment.

So at this point, we believe that the issue is specific to one company and believe that the issue is not related to the import application.  It seems to be an eConnect issue in the company database, but we haven't yet tracked down the cause.

My next recommendation is to query the PM10100 table to see if any distribution records are being created by eConnect.  I'm wondering if eConnect is importing invalid distributions, which are not displaying in GP even though they are present.
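If it helps anyone troubleshoot something similar, a diagnostic along those lines might look like this.  PM10100 and VCHRNMBR are the standard GP payables distribution work table and column, but verify against your schema; the connection string and voucher number are placeholders:

```csharp
using System.Data.SqlClient;

public static class DistributionCheck
{
    // Counts the distribution rows eConnect created for a given payment,
    // even if the GP distribution window isn't displaying them
    public static int GetDistributionCount(string connectionString, string voucherNumber)
    {
        string sql = "SELECT COUNT(*) FROM PM10100 WHERE VCHRNMBR = @VoucherNumber";
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@VoucherNumber", voucherNumber);
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
```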

Another step they can take is to recreate the eConnect stored procedures, which Mariano discussed on his blog.  But I don't know if that will replace a modified procedure or resolve whatever issue might exist with eConnect.

My last option is to review the taPMManualCheck and taPMDistribution stored procedures to figure out how default distributions are generated.  We can then query the same default account values in GP to see if there is some bad value in the company database.

In the 8 or 9 years that I've worked with eConnect, I haven't seen anything like this, so I'm puzzled and intrigued.

I'm sure there is an explanation, but it looks like we're going to have to work for the answer to this one.  My guess is that there is some issue with the default account values, but I just can't imagine what scenario would cause this problem.

I'll post an update if I learn more or if we figure out the cause.




Tuesday, December 16, 2014

C# vs. VB for Dynamics GP .NET Development

By Steve Endow

At the reIMAGINE 2014 conference, Andrew Dean and I gave a presentation on VS Tools and .NET development for Dynamics GP.

At the beginning of the session, we asked the crowd several questions to get an idea of who was a consultant vs. a developer, who used Dex and/or .NET, their level of experience, etc.

One question I asked out of curiosity was, for the .NET developers, who used C#, and who used VB?

A vast majority of the room raised their hands to indicate that they used C#.  It seemed that almost all of the attendees had previously indicated that they developed with Dex, so it surprised me to see that nearly all of them also developed in C#.

When I asked about VB, I believe only 3 people indicated that they were solely VB developers.  This also surprised me.  I have known several people who were VB-only developers, so I expected more of a mix, but based on our very informal survey, perhaps 95% of the Dynamics GP .NET developers focused on C#.

Interesting...




Buttons mysteriously resize on VS Tools Dynamics GP Forms -- VST template bug?

By Steve Endow

Sometimes strange things happen when you are developing software that can be quite puzzling.  A window may behave oddly, data may get updated in an unexpected way, or an application may crash.  I tell myself that there is always an explanation, but sometimes these things may seem nearly impossible to figure out.

Last year I developed a VS Tools AddIn in Visual Studio 2010 that worked just fine.  I later added a button to a "Dynamics GP Form" in that AddIn, and it seemed to work fine.  But after some testing, the client noted that the text on one of the buttons was cut off, and asked me to make the button wider.  I thought I had already resized that button, but when I checked it in Visual Studio, sure enough, the button wasn't wide enough.

I made the button wider, confirmed that the button text displayed correctly, and sent the customer a new version of the AddIn DLL.  A few days later, they said the button text was still being cut off.  When I opened Visual Studio, the button size had changed and was again cutting off the text.  What in the world?

After trying various things, I discovered that whenever I recompiled the VST project, two buttons on the form would resize to the default button width.  One button was wider and would shrink.  The second button was narrower, and it would get wider.  No matter what I tried, the buttons would resize after being recompiled.

I finally gave up and added some run time code that would resize and reposition the buttons.  That worked, but clearly there was some underlying issue.
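For anyone fighting the same bug, the workaround looked roughly like this--the button names and dimensions here are hypothetical:

```csharp
// Run-time workaround: force the button geometry back in the form's Load event,
// overriding whatever the designer file has drifted to after a recompile
private void GPAddInForm_Load(object sender, System.EventArgs e)
{
    buttonProcess.Size = new System.Drawing.Size(120, 28);
    buttonProcess.Location = new System.Drawing.Point(12, 310);
    buttonCancel.Size = new System.Drawing.Size(90, 28);
}
```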

Fast forward to December 2014, and I'm now seeing the same symptoms in a brand new AddIn project using Visual Studio 2013.

This is what the form looked like when I originally created it.


After doing some testing, I noticed that the button had resized and looked like this.


The first time I thought I had just forgotten to resize the button, but the second time I realized that I wasn't imagining things.  The button size was definitely changing.

But with this new project in VS 2013, the button was not resizing when I recompiled, like what I saw with VS 2010.

After trying a few things, I discovered that when I closed Visual Studio 2013 and reopened the solution, the button would resize.


While this may seem trivial, try designing a complex window with a dozen buttons that all resize every time you reopen your project.  It's a nightmare.

My only guess is that this is some type of obscure bug in the Dynamics GP VS Tools form templates, as I have only seen this issue with the Dynamics GP Form template windows.

Since I have observed that the Dynamics GP Form template has a bug with the window startup position property, I'm assuming that this is another small, but highly annoying bug in the form template.

What I need to try next is working with this project on a different server to see if it might be some problem with this development server.

Has anyone else experienced this issue?


UPDATE:  I set up VS 2013 on a separate server and am able to reproduce the issue.  I'll size the buttons, save the project, then close Visual Studio.  When I open the project and view the form, the buttons are resized.

After further testing, I found that if I just close the form designer window and then reopen it, the buttons resize.

And this issue only occurs with the Dynamics GP Form objects.  If I add a standard Windows Form to the AddIn, the buttons do not resize.






Friday, December 12, 2014

Executing SQL DDL scripts from a Dynamics GP VS Tools AddIn

By Steve Endow

I've developed many Dynamics GP Visual Studio Tools AddIns that depended on one or more SQL scripts.

Simple single-line SQL statements are easy to include in your code, but what about more complex queries?  If I have large queries that are more than a few lines, I typically try and push those out to stored procedures.  Stored procs are easier to maintain outside of code, and calling a stored proc from a VSTools AddIn is cleaner and simpler than maintaining a large query in Visual Studio.

Okay, so that may be fairly obvious, but how do you deploy those stored procedures?  Previously, I've included them as SQL scripts that had to be manually executed as part of the deployment process.  While this was simple and low tech, it is easy to forget a script or two if you have a complex deployment.

And what about Data Definition Language (DDL) scripts?  What if you need a custom table, or several, for your customization or integration?  What if you don't want to have the client manually run 10 SQL scripts as part of your AddIn deployment?  And what if your AddIn needs those tables and objects to be set up in every Dynamics GP company database?  And what if the client has 80 company databases?  Not a fun deployment.



After all, the great thing about AddIns is that the ideal deployment can be as simple as copying a single DLL file to the GP AddIns subdirectory.  If you have to manually run SQL scripts you start to lose that benefit.

This week I am working on a new VS Tools AddIn that will have at least 2 custom tables.  I was tired of dealing with SQL scripts as part of my deployment, so I was determined to fully automate all SQL scripts.  I want my AddIn DLL to automatically detect whether the required stored procedures and tables are present, and if not, automatically create them.  I've done this once in the past, but it was messy and a nightmare to maintain.  I wanted a better, cleaner method that was easy to maintain.  Fortunately, I found one.

I first wrote this simple method to check if a table exists:

public bool PSV10000Exists()
{
    //Check if table exists
    string commandText = "SELECT COUNT(*) AS TablePresent FROM DYNAMICS.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'PSV10000'";
    SqlParameter[] sqlParameters = null;

    string result = DataAccess.ExecuteScalar(CommandType.Text, commandText, sqlParameters);

    if (result == "1")
    {
        return true;
    }
    else
    {
        return false;
    }
}


You could abstract this to be "TableExists" and have the method accept a database name and table name as parameters, but this was my prototype.
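A generalized version might look like this--a sketch that reuses the same DataAccess.ExecuteScalar helper shown above:

```csharp
public bool TableExists(string databaseName, string tableName)
{
    //Parameterize the table name; the database name still has to be concatenated,
    //so only pass it trusted values
    string commandText = "SELECT COUNT(*) AS TablePresent FROM [" + databaseName + "].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @TableName";
    SqlParameter[] sqlParameters = new SqlParameter[1];
    sqlParameters[0] = new SqlParameter("@TableName", SqlDbType.NVarChar, 128);
    sqlParameters[0].Value = tableName;

    string result = DataAccess.ExecuteScalar(CommandType.Text, commandText, sqlParameters);
    return result == "1";
}
```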

Great, so now you know whether the table exists.  What if it doesn't?

Well, then you just execute the DDL script to create the table, right?

Nope.  Not that easy.

ADO.NET is great for straight queries or individual / discrete DDL commands, but "GO" is not actually a T-SQL statement--it's a batch separator recognized by client tools--so if you send a script containing several GO commands, SQL Server can't parse it and ADO.NET will throw an error.

My table creation script has several commands, each followed by a GO command.

CREATE TABLE
CREATE NONCLUSTERED INDEX
ALTER TABLE ADD  CONSTRAINT  (several of these)
GRANT


So how do you run a multi-step script with numerous GO commands?

It turns out that there is a different "ExecuteNonQuery" method in .NET, tucked away in the SQL Server Management Objects (SMO) assemblies.  SMO allows you to execute complex multi-step SQL scripts, including DDL scripts with multiple GO commands.
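As an aside, if you'd rather not take the SMO dependency, you can split the script on its GO separators yourself and execute each batch with plain ADO.NET.  A sketch of the splitting half (not the approach used below):

```csharp
using System;
using System.Text.RegularExpressions;

public static class ScriptSplitter
{
    // Splits a script into batches on GO lines; GO only acts as a
    // batch separator when it appears on its own line
    public static string[] SplitBatches(string script)
    {
        string[] pieces = Regex.Split(script, @"^\s*GO\s*$",
            RegexOptions.Multiline | RegexOptions.IgnoreCase);
        return Array.FindAll(pieces, p => p.Trim().Length > 0);
    }
}

// Then run each batch with a separate cmd.ExecuteNonQuery() call
```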

To use SMO, you first need to add four SMO references to your project.  The DLLs are available at:

C:\Program Files\Microsoft SQL Server\120\SDK\Assemblies\

And you'll need to add these references:

Microsoft.SqlServer.ConnectionInfo.dll
Microsoft.SqlServer.Management.Sdk.Sfc.dll
Microsoft.SqlServer.Smo.dll
Microsoft.SqlServer.SqlEnum.dll

Once you have those references set up, you'll need to add these using statements to your Data Access class.

using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;


With that taken care of, you can create a simple method to call the SMO version of ExecuteNonQuery.  Here is mine, which relies on the Connection property of my DataAccess class to get a connection to the GP SQL server via GPConnNet.

public static int ExecuteDDLScript(string commandText) 
{
    SqlConnection gpConn = new SqlConnection();
    try
    {
        gpConn = Connection;  //Get connection from GPConnNet
        Server server = new Server(new ServerConnection(gpConn));
        int result = server.ConnectionContext.ExecuteNonQuery(commandText);
        return result;
    }
    catch
    {
        throw;  //Rethrow, preserving the stack trace ("throw ex" would reset it)
    }
    finally
    {
        gpConn.Close();
    }
}


So now you have all of the plumbing to execute a DDL script.  Now what?

Let's start by getting our DDL script into Visual Studio so that our code can access it.  But where do you store a 20, 30, 50 line SQL script?

I am a huge fan of the Visual Studio Resources feature, which is a fantastic place for storing and managing large SQL scripts.


You define a resource name, paste in your entire SQL script, and add some comments if desired.  I like to note the version number of the scripts, as I often have to refine them during development.

Once you have your Resources set up, you can then put it all together in a simple method that actually executes the script.

I decided to create a new SQLObjects class, separate from my DataAccess class.

In my SQLObjects class, I created this new method.

public bool CreatePSV10000()
{
    string commandText = Properties.Resources.sqlCreatePSV10000;
    int result = DataAccess.ExecuteDDLScript(commandText);

    if (result == -13)
    {
        return true;
    }
    else
    {
        return false;
    }
}

Look at how simple that is!  It's essentially two lines to create your table if it doesn't exist.


No manual SQL scripts required, and your AddIn can do all of the work automatically, greatly simplifying your deployment.
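Wired together at AddIn startup, it's just a guard clause.  This sketch assumes the existence check and the create method both live on the SQLObjects class:

```csharp
public void EnsureSQLObjects()
{
    SQLObjects sqlObjects = new SQLObjects();

    //Create the custom table on first run; subsequent launches skip straight through
    if (!sqlObjects.PSV10000Exists())
    {
        if (!sqlObjects.CreatePSV10000())
        {
            //Log or alert--the AddIn can't function without its table
        }
    }
}
```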

You may have noticed the -13 value, which is a bit odd.  In my testing, that appears to be the integer returned by the SMO ExecuteNonQuery method when the script is run successfully.  I haven't yet been able to find a list of the return codes for the SMO ExecuteNonQuery method, so I'm still researching to see if -13 is an unconditional success response, and what other values I need to look out for.

Now that I have this method set up, I look forward to never having to include SQL scripts in my deployment guides again!


UPDATE:  I looked again at the documentation for the SMO ExecuteNonQuery--specifically the ServerConnection.ExecuteNonQuery method.  Regarding the integer return value, it says:

"The return value specifies the total number of rows affected by the Transact-SQL command for UPDATE, INSERT, and DELETE statements. For all other types of statements, the return value is -1."

Since I am running a DDL script, the return value should be -1, whereas I was getting -13.  Then I realized that my DDL script consisted of numerous statements separated by the GO command.  Which made me wonder if I happened to have 13 statements in my script.  I did a test where I removed all of the GO commands--my script still worked fine, and the return value was -1.  When I added 4 GO commands, the return value was -4.

So I learned why I got a return value of -13, and I also happened to learn that the GO commands in a DDL script generated by SQL Server Management Studio are not always required.









Tuesday, December 9, 2014

Fun with eConnect and .NET decimal rounding

By Steve Endow

There are some situations where you need to import transactions into Dynamics GP, where the source data contains line item totals that need to be broken down before being imported into Dynamics GP.

Since that probably didn't make a whole lot of sense, let me give an example.

One of my clients needs to import SOP Invoices.  Due to the nature of their business, the invoice data to be imported into Dynamics GP consists of item number, quantity, and extended price.  There is no unit price in the data.

For example:

Item:  A100
Quantity: 231
Amount:  $53,981.87

To import the line item using eConnect, you need to calculate the unit price.  If you calculate the unit price for this example, you get $233.69.  Pretty simple, right?

Well, if you verify the unit price:  231 x $233.69 = $53,982.39.  Note that this is $0.52 more than the actual line amount.

Fortunately, and surprisingly, eConnect (and the Dynamics GP client) is okay with this scenario.  You can import the calculated unit price and the extended price, even though there is a discrepancy due to rounding.

Easy peasy lemon squeezy, right?

Well, if your integration was developed with .NET, there is a caveat.  When you perform the calculation to get the unit price, you will need to round the resulting decimal value.  And this can cause a problem.

Let's use a different example for this one.

Item:  A200
Quantity: 8
Amount:  $2,089

So to get unit price:  $2,089 / 8 = $261.125

And then you dutifully round that value to two decimals with this code:

sopLine.UNITPRCE = Math.Round(lineAmount / lineQuantity, 2);

And you get a unit price of:

$261.12

Say what?

Every kid knows that .125 should round up to .13, right?

So what in the world is going on?

Well, the .NET Math.Round method has two different "rounding conventions", as this article explains.

The default rounding convention is "To Even", also known as banker's rounding.  This means that a midpoint value like .125 is rounded so that the last retained digit is even, which gives .12.

If you send a quantity of 8, with a unit price of $261.12, and an extended price of $2,089, eConnect will return the following error:

Error Number = 722  Stored Procedure= taSopLineIvcInsert  Error Description = Unit Price calculation does not match out to Extended price

It took me a while to figure this out, but eConnect uses "conventional" rounding, and in this case expects $261.13.

So how do we fix this?  Well, the Math.Round method has a parameter to change the rounding convention.

Math.Round(lineAmount / lineQuantity, 2, MidpointRounding.AwayFromZero);

The MidpointRounding option allows you to tell .NET to use the more conventional rounding method which Microsoft calls "Away From Zero".

This will produce the desired result of a unit price of $261.13 and eliminate the somewhat perplexing eConnect Error Number 722.
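Here's a minimal before-and-after of the two conventions, using the example values from above:

```csharp
using System;

class RoundingDemo
{
    static void Main()
    {
        decimal lineAmount = 2089m;
        decimal lineQuantity = 8m;
        decimal unitPrice = lineAmount / lineQuantity;  // 261.125 exactly (decimal math)

        // Default "To Even" (banker's) rounding -- what tripped up the import
        Console.WriteLine(Math.Round(unitPrice, 2));                                // 261.12

        // "Away From Zero" -- the conventional rounding eConnect expects
        Console.WriteLine(Math.Round(unitPrice, 2, MidpointRounding.AwayFromZero)); // 261.13
    }
}
```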








Wednesday, December 3, 2014

Diagnosis: Developer's Amnesia

By Steve Endow

Many years ago I was part of a corporate web development team that maintained the internal and external web sites of a Fortune 1000 corporation.  We did classic ASP web development, with projects like product catalog pages on the external web site, registration web sites for customer conferences, and internal document management and knowledge base sites.

One Monday, one of the business analysts I had been working with stopped by my gray corporate cubicle and asked if I had completed the XYZ web page. (I don't recall what exactly it was)

"The what?", I replied.

"The XYZ web page.", she repeated.

I furrowed my brow, looking at her like she might be talking to the wrong person.

"XYZ?", I asked, clearly puzzled.

"Ya, what you worked on last Friday", she slowly said, picking up that I had no idea what she was talking about.

"When?", I asked, starting to get concerned that one of us might be crazy.

"Friday.  Remember Friday?", she said, starting to get concerned.

"Um...", I said as I paused and thought.  "Um, Friday...ya, I've got nothing" I admitted.

She then proceeded to explain that she had talked to me on Friday afternoon and asked me to make some change to a web page.  It was a last minute thing, and I was the only one around, so we apparently talked about the change that she needed done ASAP.

After hearing the story, I turned to my computer, looked up the web page, checked the code, and sure enough, there were my changes, just as she had requested.  I had apparently completed the change that she requested.

I stared at the code, which I clearly wrote, with zero recollection of making the change and without recognizing the code.  I made the change, but as far as I was concerned on that Monday, I was staring at someone else's code.

Slowly it came back to me.  Friday afternoon, looking to go home, and I had to quickly make a change to a web page.  I then vaguely recalled the conversation with her just over 2 days ago.  I recalled the general request, but had no memory of any of the details.

At the time, I chalked it up to doing the work on a Friday afternoon and then having a great weekend where I didn't think about work for 2 seconds.  (Oh, how I miss those...)

But I now think that scenario is a symptom of what I'll call "Developer's Amnesia".

This came up because I was working on a Dynamics GP integration yesterday.  It was a modification to an existing integration I had developed between a web site and Dynamics GP, and I needed to import an AR cash receipt.

I emailed my colleague on the project, asking what values should be used for the cash receipt type, credit card ID, and checkbook ID.  She replied, "Why don't you just use the same values that you used on the other cash receipt import that we did for the client?".

"Huh?"  I re-read her email, thought about it for a few seconds, and a few details sluggishly started to bounce around my brain.  "Um, ya, sure, that makes sense, forgot about that!" I sheepishly replied.

Then I connected to my development server and pulled up the Visual Studio code for that project.  I checked the date on the files, and saw that I had created the last project 7 weeks ago, and last updated it 5 weeks ago.  Somehow, in those 5 weeks, my memory of the project had been wiped.  Totally wiped.

I definitely remembered the project--the name and what it did in general, but somehow I forgot that it involved an AR cash receipt.  I pulled up the code and reviewed it, and I felt like I was having an out-of-body experience.

I was looking at the code, but I didn't recognize it.  At all.  Did I write this?

I slowly remembered some of the objects and methods, but it was a general vague recollection.  When I reviewed some specific lines and object properties, I had zero recollection of the details.

The funny part was that the code was beautiful!  "Wow, this guy can code!  Who is this C# genius?"  It had clean objects, clean methods, and an elegant design.  But there was enough complexity that in a few areas it wasn't obvious how some things worked.  "Hmm, this guy needs to comment his code better!" I thought ironically.

Apparently the code was all pretty obvious at the time, but since then, I clearly had been stricken with Developer's Amnesia.  After 30 minutes of working with the code, I'm still piecing it all back together.

I emailed a developer friend, explaining what just happened and asking him if he has ever had such an experience.  He quickly replied that he's had the exact same experience, several times, and that he knows exactly how I feel.  Well, at least that is somewhat comforting, knowing it isn't just me.

I then recalled that conversation I had over 10 years ago with that business analyst, which made me wonder how I could forget something that involved thousands of lines of code so completely.  10 years ago I was that much younger, so I can't blame age.  And there are tons of useless trivial details that I remember from other projects I worked on years ago, so I'm pretty sure I'm not forgetting everything.

My only guess is that I experience Developer's Amnesia when I complete a project quickly and have little or no follow up work.  My theory is that if I only spend several intense days working on something and slam it out, the project never transfers from my short term memory to my longer term memory.  Once I release the project and it is deployed smoothly without any follow up work, I move on to the next project, and my short term memory of that project gets wiped.

For projects that I work on part time over the course of a few weeks, more context develops and there is more time for the project to get transferred to my long term memory.  While I obviously can't remember every detail, I can recall most of the project.  There are long term projects I've worked on where I remember a surprising number of details, to the point where the customer emails me to ask how an integration works, and I explain the details that we discussed 3 or 4 years ago.

Whatever the reason, it seems that there are times when developers just completely forget some project or some code that they've written.  So when you ask a developer a question and they look at you funny and act like they think you're making things up, they might just have Developer's Amnesia.


Wednesday, November 26, 2014

Dynamics GP JSON Integration Issue: null object property values for JSON parsed by MVC controller

By Steve Endow

I haven't yet seen code samples or API documentation for the new Dynamics GP 2015 Service Based Architecture (SBA), but I have a hunch that this might be relevant when we start working with GP 2015 SBA and use JSON to communicate with the GP SBA interface.

WARNING:  If the title wasn't warning enough, this post is a full-on geek fest, so if you aren't a developer that loves Riddles of the Code, flee now.

I'm currently working on a web service application that allows a Linux web server to integrate to Dynamics GP.  After considering various options, the client and I agreed that the simplest approach would be for me to develop a custom web application that could receive and send JSON.  The Linux developers were very comfortable with this, and after getting some help from an expert with experience developing JSON web apps in Visual Studio using MVC, I was able to create an integration that worked great.

After the release of the initial version of the web integration, I noticed a quirk about how blank JSON values were being received by my code.

If the client sent a JSON parameter value that was blank, that particular property on my object would be null.  So if he did not supply an Address2 value, I would get Customer.Address2 == null.

I thought that it was due to how I was defining my objects in Visual Studio using the simple property definition style, such as:

   public string Address2 { get; set; }

Since I wasn't defining a default value for the property, I thought it made sense that I was getting null, assuming that the .NET MVC JSON parser was skipping the empty JSON parameters.  Seemed to make perfect sense at the time.

I didn't think much about it, and just added some workaround code to check for null values and set them to empty strings.  In hindsight, I now know that was sloppy, but since I didn't realize the underlying problem, it was a simple fix and seemed to work.
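The workaround was just a null-coalescing pass over the object before handing it to the GP integration code.  A rough sketch of the idea--the "customer" object and its property names are illustrative, from my project, not from any GP API:

```csharp
// Normalize nulls produced by the MVC JSON model binder back to empty
// strings before the object reaches the GP integration code.
// (The object and property names here are illustrative.)
customer.Address1 = customer.Address1 ?? string.Empty;
customer.Address2 = customer.Address2 ?? string.Empty;
customer.Address3 = customer.Address3 ?? string.Empty;
```

It works, but it has to be repeated for every string property on every object, which is why fixing the model binder (discussed below in this post's resolution) is the better approach.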

Now, I'm adding some additional functionality to the web application, as the customer wants to have new features, such as the ability to search for GP customers.  This time around, I'm more familiar with web app development and MVC, so I was a little more observant about the null property issue.

While testing my recent enhancements, I once again noticed that when I submitted a request to my web app, blank JSON values were being translated to null property values on my objects.  Still thinking that this was due to my object property definitions, I modified my object to use a "full" property definition, like so:

private string address = string.Empty;
public string Address
{
    get { return address; }
    set { address = value; }
}

When I walked through the code in debug mode, what I found surprised me.  Even though my property was initializing properly to an empty string, the MVC JSON parser was subsequently setting the property value to null.

So the null values weren't due to my property definitions!  Why was I getting null, and how could I fix it?

I didn't even know how to phrase a search, but after a few random search attempts in Google, I found this Stack Overflow thread that appeared to exactly describe my scenario:

http://stackoverflow.com/questions/12734083/string-empty-converted-to-null-when-passing-json-object-to-mvc-controller

It appears that with MVC version 2.0, Microsoft changed the way the model binder handles empty string values.  Instead of assigning empty strings, the newer MVC versions assign null.  Here is a post that covers some of that history:

http://brianreiter.org/2010/09/16/asp-net-mvc-2-0-undocumented-model-string-property-breaking-change/

Okay, great, so how do I "fix" this "problem"?

The Stack Overflow post has several suggestions, none of which I really understand.  I understand the concept that it changes how the JSON is parsed, but the actual code is currently Greek to me.

I ended up trying the suggestion to create a new class called EmptyStringDataAnnotationsModelMetadataProvider.  I then modified my Global.asax file and added a single line to the Application_Start() method:

ModelMetadataProviders.Current = new EmptyStringDataAnnotationsModelMetadataProvider();


I am using MVC 4.0, and this solution worked for me.
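For reference, here is roughly what that class looks like--a thin subclass of DataAnnotationsModelMetadataProvider, adapted from the Stack Overflow answer (verify the override signature against your own MVC version):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Mvc;

// Tells the MVC model binder to keep empty strings as "" rather than
// converting them to null (adapted from the Stack Overflow suggestion).
public class EmptyStringDataAnnotationsModelMetadataProvider : DataAnnotationsModelMetadataProvider
{
    protected override ModelMetadata CreateMetadata(IEnumerable<Attribute> attributes,
        Type containerType, Func<object> modelAccessor, Type modelType, string propertyName)
    {
        ModelMetadata metadata = base.CreateMetadata(attributes, containerType,
            modelAccessor, modelType, propertyName);
        metadata.ConvertEmptyStringToNull = false;
        return metadata;
    }
}
```

Registering it in Application_Start() (as shown above) replaces the default metadata provider application-wide, so every bound string property gets the new behavior.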

Now, when I pass in JSON with blank parameter values, my string properties are empty strings and not nulls.  You know you are a geek when you get excited about an empty string.

It was a bit of a journey, but this obscure change should make my life easier.

And I suspect that once we start developing against the GP 2015 Service Based Architecture, this may come in handy if GP returns empty JSON parameters.


Monday, November 17, 2014

Unable to read unsaved Cash Receipt Document Number using VS Tools

By Steve Endow

I have a customer that issues thousands of invoices to a handful of customers.  So when they receive a payment from a customer, they have to apply the payment to hundreds or thousands of invoices.

To accommodate this high volume / bulk payment apply process, I developed a custom AR apply window.  The window works well on its own, but the customer wanted the ability to launch the custom apply window from the Cash Receipts Entry window, immediately after they enter the payment.

The customer wanted the customer number, payment document number, check number, and payment amount from GP all pre-populated when the user launches my custom apply window, so that they could immediately start applying the payment--which makes complete sense.

No problem, I thought.  Easy peasy lemon squeezy, right?

Unfortunately, no.

In my VS Tools AddIn project, I added the Menu Handler to the Cash Receipt window so the user could open my custom window.  Very straightforward and worked fine.

I then added some code to read the field values from the Cash Receipt window so that I could populate the fields on my custom apply window.  That was simple as well and everything looked good.
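For context, the pattern in the AddIn looked roughly like this.  The VS Tools form and field names below are as generated from the GP dictionary for my project--treat them as illustrative and verify them with IntelliSense in your own AddIn:

```csharp
public void Initialize()
{
    // Add a menu item to the Cash Receipts Entry window that opens the custom apply window
    Dynamics.Forms.RmCashReceiptsEntry.AddMenuHandler(OpenCustomApply, "Bulk Apply Payment", "");
}

private void OpenCustomApply(object sender, EventArgs e)
{
    var rmCashReceiptWindow = Dynamics.Forms.RmCashReceiptsEntry.RmCashReceiptsEntry;

    // Read the current values off the GP window to pre-populate the custom apply window
    string customerNumber = rmCashReceiptWindow.CustomerNumber.Value;
    string documentNumber = rmCashReceiptWindow.DocumentNumber.Value;
    string checkNumber = rmCashReceiptWindow.CheckNumber.Value;
    decimal amountReceived = rmCashReceiptWindow.AmountReceived.Value;
}
```

Nothing exotic--just the standard VS Tools menu handler registration and window field reads.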

But when I tested the code, it didn't work.  My window would launch, but the fields on my form weren't populating.

In Visual Studio, I attached to the Dynamics.exe process and started debugging.  I found that an If statement was being skipped:

if (rmCashReceiptWindow.CustomerNumber.Value != string.Empty
    && rmCashReceiptWindow.DocumentNumber.Value != string.Empty
    && rmCashReceiptWindow.CheckNumber.Value != string.Empty)

That was odd, since I had populated all of the fields on the Cash Receipt window.


The customer number, document number, and check number were all populated.

I checked the field values in Visual Studio, and to my surprise, the DocumentNumber field value was empty.


I could read the Check Number and Customer Number just fine, but the Document Number field had no value, even though it was displayed in GP.

Um, what gives?

Just to make sure I wasn't making a silly mistake, I fired up Andrew Dean's very cool VSTools Object Explorer that we demonstrated at our reIMAGINE session last week.

Sure enough, the Object Explorer confirmed that the Document Number field had no value, and that the value wasn't stored in any of the other fields accessible through VS Tools.



This confirmed that with an UNSAVED cash receipt, the Document Number field value was not accessible in VS Tools.

But after I saved the cash receipt, then pulled the receipt back up in the Cash Receipt window, I could read the Document Number value in VS Tools.


Not sure what is going on.  My two wild guesses are that during data entry, the field value is being stored in some temporary field--which seems really strange.  Or, the VS Tools field value is somehow wired up to the underlying Dex table buffer, and that table buffer value is not populated until the transaction is saved.  Which seems like a stretch as well.

I'm going to dig into it further and see if other windows have this issue, but this definitely complicates things if you are trying to trigger events in VS Tools that depend on the GP document number during data entry.


Saturday, November 15, 2014

Change the name of a Dynamics GP Web Client server

By Steve Endow

Several months ago I set up the GP Web Client on a spare GP 2013 R2 server that I had used to test Scribe.  The server was named Scribe, but I wasn't using Scribe anymore and would be using this server to work with the Web Client, so I wanted to change the server's name.

I had a feeling that changing the server name would be a bit of an off-roading adventure, and sure enough it was.  But with some fiddling, I think I eventually figured out the process to change the server name and reconfigure all of the GP Web Client components to get everything working again.  At a minimum, it was a fun exploration that helped me learn a little more about the innards of the GP Web Client server configuration.

I assume this will be a very rare situation, but since I went through the process, I figured I would document it in case any other poor soul decides to change the name of their GP Web Client server.

The one caveat about this process is that I have a single-server deployment.  If you have a multi-server deployment, you may need to complete a few additional steps on the additional servers.

And I would only recommend this process for a test server.  I haven't yet vetted the process, so I would not recommend this for a production server.


  1. Change the computer name and reboot server
  2. Update the Dynamics GP ODBC DSN with new server name
  3. If using a self-signed or server-specific SSL cert, remove the SSL cert from the IIS site Bindings window, then create and install a new SSL cert
  4. Modify the server name in Microsoft.Dynamics.GP.Web.ConfigurationWizard.config file (C:\Program Files\Microsoft Dynamics\GP Web Client)
  5. Launch SQL Server Management Studio and login using the new server name  (I don't know why, but this seemed to resolve a subsequent SQL connection issue during the repair process)
  6. Open Control Panel -> Programs
  7. Select Microsoft Dynamics GP Web Client and click on the Change button
  8. Click the Repair button
  9. Complete the GP Web Client repair / configuration windows
  10. Open a command prompt and run iisreset to restart IIS
  11. Restart the GP Session Central Service and GP Session Service
  12. Open a web browser and try logging in to the GP Web Client to confirm the web client works
  13. Connect to the Web Management Console (https://ServerName:PortNumber/WebManagementConsole) and update the server name for the Session Central Service 
  14. In SQL Management Studio, navigate to the GPWEBCLIENTSESSIONCENTRAL database
  15. Remove any records from the SessionStatus table
  16. Remove the old server record from the SessionHostStatus table
  17. Remove the old account records from the ServiceSecurity table
  18. After these changes, my old Dynamics GP logins would no longer work.  I had to re-enter my GP registration keys and then reset the passwords of my additional GP users.  I don't know if this was due to changing the DSN, or due to other aspects of the changes.


If anyone else is lucky enough to have to complete this procedure, please let me know if I missed any steps or if you have to do things differently than me.


Consultant Toolkit: How to make friends at a tech conference (USB Battery Chargers)

By Steve Endow

Once a year, Fargo, North Dakota becomes the epicenter of the technology world as the smartest (and best looking) people on the planet all converge at the most scenic and bucolic city in America for a Dynamics GP partner conference.  This year, it was for the reIMAGINE 2014 conference.

Each time I've flown from Los Angeles to Fargo over the last 3 years, I've had a stop in Minneapolis-St. Paul--an airport that must be going for a world record in walking distance between terminals.  Since many of the Fargo flights are routed through the MSP airport, it's become a spot for me to meet up with colleagues on the way to Fargo.  Last year, my blog buddy Christina Phillips and I coordinated our Fargo flights, so we were able to grab lunch and catch up at MSP and then take the same plane into Fargo.

This year, while I was waiting at the gate, my friend Tanya from S2 Technology just happened to walk by.  Sure enough, we were on the same flight to Fargo.  As soon as she sat down, she had to find an outlet and start the famous airport charging routine.  Yup, she was a wall hugger.  She had to charge her laptop, and then she had to plug her phone into her laptop to charge that too.

I admit I've been there before.  But earlier this year I was introduced to a nifty gadget:  the USB battery charger.


It's very simple and somewhat obvious, but brilliant.  Just a battery that you can charge via USB, and then use to charge your phone, tablet, or other devices that charge via a USB port.

They come in all sorts of shapes and sizes depending on what you need to charge and how much capacity you need or want to carry around.

A friend gave me a small version like the one pictured above, and after seeing how handy it was, I bought a slightly larger version made by Jackery.


I dug out my Jackery charger and gave it to Tanya to test.  During our wait for the flight, it did a pretty good job of charging up her phone and had plenty of capacity left.  I think she was sold on the idea.

During our mind-blowingly awesome Pushing the Boundaries of .NET Development presentation, Andrew Dean and I gave them away as prizes to the attendees of our session--who I'm told were the smartest people at the conference.

And throughout the conference, I was able to charge my phone and tablet without having to ever look for a wall socket.

And on the last night of the conference, Andrew's USB phone charger stopped working, leaving him with a dead cell phone.  I gave him my USB charger and his phone got fully charged overnight--a tech rescue.


So if you want to stop being a wall hugger, check out USB battery chargers.  They are inexpensive and can help you make friends at tech conferences.

Troubleshooting GP Web Client Error: An unexpected error has occurred. Press the Sign-in button to reconnect to the application.

By Steve Endow

Others have posted articles about troubleshooting issues with the GP Web Client (such as Mariano's comprehensive series here), but since this is the first time I've done the troubleshooting, and since my problem was pretty simple, I thought I would document my experience.

I have a server with the Dynamics GP Web Client installed and configured.  It worked fine back when I performed the initial install, but it has been a few months since I've used the Web Client.

While trying to set up the new GP 2013 R2 workflow features today, I tested the Web Client and received this error in Internet Explorer:

Unexpected Error
An unexpected error has occurred. Press the Sign-in button to reconnect to the application.
Correlation ID: a238ed59-ea0e-46f7-b4ef-482f2e370372 

This is a generic error, so the error message itself doesn't offer any clues.

One thing I recently learned at the reIMAGINE 2014 conference is what a "Correlation ID" is.  That unique ID is assigned to a web client process and allows you to trace a process in Event Viewer and other logs when testing or troubleshooting.  If you have 20 people using the Web Client and need to troubleshoot an error for one user, the ID lets you identify errors or messages for that particular user.  So while that ID can be helpful in troubleshooting, it doesn't have any meaning or explain what caused the error.

To learn more about the error, I opened Event Viewer and navigated to the Dynamics application log.  There I found a pile of shiny red errors.


Starting at the top, I reviewed this error.  Notice the GUID at the beginning of the error--this matches the Correlation ID I saw in Internet Explorer, telling me that this error is related to my browser session.
a238ed59-ea0e-46f7-b4ef-482f2e370372:An unexpected error has occurred.  Press the Sign-in button to reconnect to the application.::System.Web.HttpUnhandledException (0x80004005): Exception of type 'System.Web.HttpUnhandledException' was thrown. ---> System.ServiceModel.EndpointNotFoundException: There was no endpoint listening at https://webclient:48650/SessionCentralService that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details. ---> System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it [2001:0:5ef5:79fb:8fb:201f:3f57:e65b]:48650
   at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)
   at System.Net.ServicePoint.ConnectSocketInternal(Boolean connectFailure, Socket s4, Socket s6, Socket& socket, IPAddress& address, ConnectSocketState state, IAsyncResult asyncResult, Exception& exception)

Next, I saw the message "There was no endpoint listening".  If you've worked with eConnect, you may have seen this error, which can be caused when the eConnect service is not running.  The message helpfully indicated that the SessionCentralService was not available, so that led me to check Windows Services.


Sure enough, when I found the GP Session Central Service, it was not running.  Great, just start it up, right?  Not so fast--apparently there was a reason why it wasn't running.


And there it is.  The service can't start due to a password issue.  I hadn't set the "Password never expires" option for the service account, so the password had expired.  I fixed the account and the service started up properly.

With the service started, the web client started working again.


Fortunately, this particular web client error was very simple and easy to track down, but I think it's a nice simple example of web client troubleshooting.

Off to configure Workflow!

Who actually uses the Dynamics GP Multi-Tenant feature?

By Steve Endow

When Microsoft announced that Dynamics GP 2013 would include Multi-Tenant support, there was some buzz.  The ability to have multiple separate Dynamics GP installs on the same SQL Server instance seemed like an obviously beneficial feature, and during presentations at Tech Airlift, it was explained that the functionality was largely designed for partners looking to host Dynamics GP.

But even for GP partners who needed multiple demo environments, or consultants who needed different GP configurations on a laptop, multi-instance would seem to have some benefits, right?

Since then, I've spoken with some customers, partners, and a few GP hosting providers, and I haven't found any that use the multi-tenant feature.

Quite a while ago, I had a call with a large Dynamics hosting provider, and when I asked if they used the GP multi-tenant feature, I thought I heard a scoff as a "No" response came back immediately.  When I asked why, I was told that having a separate SQL Server instance per customer was easier for them to manage, and that the multi-tenant feature would add more complexity than benefit.  Admittedly, I'm not an expert on GP hosting, but I was surprised that the feature that was supposedly designed for GP hosting partners was completely dismissed by an actual hosting partner.

During the reIMAGINE 2014 conference last week, I attended several sessions on the new GP 2015 Service Based Architecture (SBA) functionality.  In a few of the sessions, the speaker, from Microsoft, discussed several design elements and features of SBA that existed to accommodate multi-tenancy.  One statement was along the lines of, "We did this at the request of some of our large hosting providers."  So apparently someone is using multi-tenancy?

After one of the presentations, I visited a vendor table for one of the GP hosting providers.  I asked them if they used the GP multi-tenant feature.  After double checking with an engineer at their office, the salesperson confirmed that they did not use multi-tenancy--they use a separate SQL instance for each customer.  Hmmm.

After the conference was over, I was in the palatial lobby of the Holiday Inn Fargo, and I happened to chat with someone who worked with a different Dynamics hosting provider.  You can guess what my first question was.  Once again, I was told that they also do not use the GP multi-tenant feature.  When I asked why, the explanation I received was that they regularly have partners request 'sa' access to the SQL Server (because GP often requires SQL access to fix issues, they explained), so it is easier to have a separate SQL instance to accommodate the SQL access request.

However, the person told me that they did offer GP hosting in Azure, where they do utilize the GP multi-tenant feature, but that hosting package is more limited than their internally hosted GP solution, and presumably they do not offer 'sa' or direct SQL access to the Azure customers.  Not sure how they fix GP SQL issues on Azure.

Last, during a lunch discussion at the conference, I asked a few partners if they have ever used or implemented GP multi-tenant.  One person said that they had used the feature for a customer that had two instances of GP, for two different legal entities.  The person said that it turned out to be a mistake, as they have had numerous technical issues with the multi-tenant install and will never use it again.

So I guess I'm surprised and puzzled.  GP multi-tenancy was a pretty big deal when it was announced, but if 3 large Dynamics hosting partners don't want to use it, who are these "large hosting providers" that Microsoft is working with as they invest in architecting new features to support the functionality?

Do you know of any Dynamics hosting providers who use the feature?  If so, who are they?  I'd love to chat with them.

If you are a hosting provider or partner that has chosen not to use the multi-tenant feature, why not?  What limitations or issues are causing you to avoid it?


Wednesday, November 12, 2014

reIMAGINE 2014: Pushing the Boundaries of .NET Development

By Steve Endow

I'd like to thank everyone who attended the fantastic reIMAGINE conference in Fargo this week, and the brilliant Dynamics GP consultants and developers who joined Andrew Dean of Envisage Software and me for the .NET Development presentation.


As promised, here are the presentation slides and .NET sample projects that were discussed in the presentation.

Presentation:   http://1drv.ms/1EtJFVC

Sample code:  http://1drv.ms/1EtJM3x


Please note that you will need to edit some of the projects to supply your own GPConnNet license key in the DataAccess.cs class for those projects that perform data access.  And in the Bitcoin integration sample, you will need to supply your own Coinbase account information and keys.

