Monday, December 29, 2014

Dynamics GP DPS: Error initializing Process Server

By Steve Endow

I know that very few customers use the Dexterity Process Server (also called the Distributed Process Server; let's just call it DPS), but I thought I would document this in case some other lucky person runs into the same issue.

When I launch DPS.exe and open a Dynamics.set file, I get the message "Error initializing Process Server".


DPS then crashes with this "BEX" error.


I tried rebooting the server, but no luck.

After scratching my head for several minutes, I was almost ready to give up and try reinstalling GP when I thought of one last thing.

The server I was using has User Account Control (UAC) enabled at its default level.  So I tried launching DPS.exe using Run As Administrator.


Fortunately, that resolved the error!

I have seen so many strange errors and unexplained symptoms that were ultimately resolved by using Run As Administrator or disabling User Account Control.

While I understand the concept of UAC, the implementation and side effects are far from ideal.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Monday, December 22, 2014

Why I replace my computers every 3-4 years, whether they need it or not

By Steve Endow

In August, my trusty old file server started acting flaky.  Given the symptoms (losing BIOS settings after reboot), I assumed the motherboard was dying.  Since the computer was several years old and had served several roles, I wasn't surprised it started to have issues.  Rather than bothering to diagnose the problem, I decided to retire the computer and replace it with a Synology DS1813+.  That turned out to be a very good decision.  The Synology has been fantastic: it is incredibly fast, it has many more features than the old Windows file server had, and maintenance is virtually zero, with automatic software updates and reliable backups to external drives.

Fast forward to Thanksgiving 2014.  While we were having a family dinner at my grandparents' house, my grandfather's ancient Dell OptiPlex (purchased in 2008!) decided to die.  After reviewing the symptoms, my guess was that the power supply had died, with a possible side dish of a bad BIOS battery.  So I got the "honor" of bringing the computer home with me to repair, along with the Thanksgiving leftovers.  Lucky me.

Having been down this road before, I happen to have a power supply tester.  Sure enough, the PSU on the Dell was dead and would only blink on for a fraction of a second before dying.  I cannibalized the PSU from my old file server, which tested out fine, and installed it in the Dell.  That got the Dell to boot up, but it failed to boot into Windows--a blue screen flashed for a millisecond, and then the machine immediately rebooted.  The bad power supply apparently caused an issue with the drive, and no matter what I tried, I couldn't get Windows XP to reinstall, repair, or boot.  So I had to install Windows 7 on a new drive.  Fortunately I was able to read the old drive and recover all of the data, but by that point I had wasted several hours trying to get the old drive and Windows XP to work.



Since I had pulled the replacement power supply out of my old file server, I started to look at the innards of that older machine, which was having problems booting up and losing its BIOS settings.

I happened to glance at the BIOS battery on the motherboard and noticed something strange.


Sure enough, there was corrosion on the BIOS battery!  The battery was completely dead, which would explain why the machine would lose its settings with every reboot.  Apparently the motherboard does not have a feature to detect a bad BIOS battery.

So a $2 battery caused my file server to become unreliable.  The machine was pretty old, so it was due for replacement, and this is a great demonstration of why old computers are often not worth the potential risk and hassle.

My current Hyper-V server is 3 years old now and is running fine, but it has several mechanical drives that are over 6 years old.  I'm now replacing those mechanical drives with Samsung 850 Pro solid state drives, since I don't want to have to deal with a drive failure.

And I suspect that I will be replacing that Hyper-V server next year when it turns 4, just to avoid potential issues.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter








Friday, December 19, 2014

Versioning your Dynamics GP Customizations and Software

By Steve Endow

Earlier this year I worked with a third party product for Dynamics GP.  I believe it is all .NET, so it had a few DLLs in the GP AddIns folder, and a few additional DLLs that were used by an external application independent of GP.

While testing, we identified a few bugs.  While communicating with the ISV, we were asked which version we had installed.  I checked the DLLs and found the following version number.

10.0.0.0

Hmm, that's a little unusual, but okay, I told them we were using version 10.

Their reply: they didn't want THAT version number, they wanted the date and time stamp on the DLL files that we had installed.

Say what?

They asked us to ignore the actual version number on their software and report the date and time that was showing for their DLL files.  For their commercial software product.  That they sell.  As a software development firm.

Let's just say I was surprised.  Any developer who has used Visual Studio should be familiar with the window that lets you set and increment version numbers.  And having developed, installed, and used hundreds and hundreds of software applications, I thought that software versioning was a no-brainer and one of the most basic organizational practices a developer should follow.

To this day, that ISV still refuses to use version numbers on their software.  Dozens of releases, updates, bug fixes, and changes later, their files are steadfastly stamped with version 10.0.0.0.  We never know what "version" we are using, and every time we have a support case with them, they ask us what time stamp we have, and their only solution is to try to send us files with a newer time stamp.  I wish I were kidding, because dealing with updates from them is a confusing mess.

So that's a great lesson in how not to handle versioning for your Dynamics GP customizations, integrations, VS Tools AddIns, and software.

So how should you version your Dynamics GP customizations or software?

First, let's agree that you should definitely version your software.  Any method is better than no method.

As for version numbering, different people and organizations have different practices, but here's the system I've adopted.

I typically use a format of "1.23".  Even though Visual Studio has 4 separate values, I combine values 2 and 3 to make the number more readable, and don't typically use the 4th value.   I like 1.23 much better than 1.2.3, but as you'll see shortly, this has a potential downside.

I start with a version 1.00.  If I have some minor bug fixes, I'll increment to 1.01, 1.02, 1.03, etc.

If I add a new feature or fundamentally change some functionality, I'll increment the second digit, such as 1.10, 1.20.

As the version numbers increase, I personally try to avoid going above 9 for the second and third segments.  I did it once, and it ended up being quite confusing.  Since I combine the 2nd and 3rd version segments, how do I represent version 1.1.0 vs. 1.10.0?  If 1.1.0 is written as 1.10, do I write 1.10.0 as 1.100?  It doesn't work well with my typical version numbering.

When the customer upgrades to a new version of GP, I'll typically increase the major version number, so I'll go from 1.23 to a fresh 2.00.

However, in cases where I am developing software for use with multiple GP versions, like the software that I sell, I now include the GP version in my software version number.  So I'll have 10.1.23, 11.1.23, and 12.1.23.  That way I can know that the customer has the correct version for their version of GP, and during support calls, I can see my version number and know which version of GP the customer is using without having to ask or look it up.
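
To make that concrete, here's a hypothetical illustration of how the scheme maps onto the standard .NET assembly attributes in AssemblyInfo.cs (the numbers below are made up for the example):

using System.Reflection;

//GP version . major . minor . patch -- read as "version 1.23 for GP 2013" (GP 2013 = 12)
[assembly: AssemblyVersion("12.1.2.3")]
[assembly: AssemblyFileVersion("12.1.2.3")]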

In a new twist, last week I released a web service that integrates with GP.  As the customer tested, they wanted rapid bug fixes, so I would fix a bug or two and push a new release multiple times in a single day.  My version numbers were quickly escalating, so I enlisted the fourth version number in Visual Studio.  So for minor updates or bug fixes, my version numbers would increment like 2.23.1, 2.23.2, etc.  It has worked well so far and hasn't been inconvenient or confusing.

But even with these frequent updates, I always increment the version number.  It keeps things organized, avoids confusion, and every customer and consultant I have worked with understands the concept of version numbers, so the practice makes communication about updates and new versions easier.

For customizations or integrations with a visible window, I always display the version number on the form.  Never make your customer search for a version number when they need support or when they need to check that they are on the latest version.  Just always add it to at least one of your forms or to Help -> About--it takes 2 minutes and it will make life easier for everyone.


In this case, the form is displaying 12.1.2.5 because that is the actual version value from the .NET application, but I would refer to it as version 1.25 for GP 2013.
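
As a minimal sketch, the form can read that value from the assembly at run time instead of hard-coding it (the label name here is just a placeholder):

using System;
using System.Reflection;
using System.Windows.Forms;

//Call this from the form's Load event; lblVersion is a Label control on the form
private void ShowVersion(Label lblVersion)
{
    //Reads the 4-segment version compiled into the AddIn assembly, e.g. 12.1.2.5
    Version version = Assembly.GetExecutingAssembly().GetName().Version;
    lblVersion.Text = "Version: " + version.ToString();
}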

For Modifier and VBA customizations, I have sometimes added the version to the GP windows.


So far I've accomplished this with a simple text label in Modifier that I have to manually change with each release. It isn't fancy, but it's simple and easy.

Whatever you do, please, please, please do not rely on file date time stamps to version your software.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


eConnect mystery: Imported PM Manual Payments suddenly have no distributions

By Steve Endow

I have a Dynamics GP customer that uses my AP Payment and Apply Import.  It had been working fine for over two months, but this week the customer noticed a problem.

The PM Manual Payment transactions would import without error, but the transactions didn't have any distributions.  When they viewed the transaction distributions in GP, the distribution scrolling window was completely blank.

Hmmm.

The import does not send distributions as part of the eConnect document--it lets the cash and payables distributions default--so the missing distributions aren't a bug in the import itself.

The GP partner said that the only recent event they could think of was that the company database had been restored.  That alone shouldn't cause any issues, and I couldn't think of how it might relate to the problem, but it's a potential clue.

We reviewed the default GL accounts for the vendor, checkbook, and company, and default accounts were present in GP.  The partner also checked the eConnect event log, but there were no errors or clues related to the missing distributions.

As a test, the partner entered a manual payment into Dynamics GP to mimic the imported transaction for the same vendor and checkbook, and that transaction was fine.  The distributions defaulted properly.

They then tested the import on another company.  The imported payments were fine--they had distributions.

We then turned off the eConnect service and ran the import.  This forces an error to occur and has the import display the resulting error message, which includes the full XML document that was sent to eConnect.

The XML looked just fine, and was identical to the values I got when I tested successfully on my TWO / Fabrikam environment.

So at this point, we believe that the issue is specific to one company and believe that the issue is not related to the import application.  It seems to be an eConnect issue in the company database, but we haven't yet tracked down the cause.

My next recommendation is to query the PM10100 table to see if any distribution records are being created by eConnect.  I'm wondering if eConnect is importing invalid distributions, which are not displaying in GP even though they are present.
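
As a rough sketch, that check could be as simple as counting the distribution rows for one of the imported payments.  PM10100 is the PM distribution work/open table and VCHRNMBR holds the document number, but verify the key columns against your own data before relying on this:

using System.Data.SqlClient;

//Rough sketch: count the PM10100 distribution rows for one imported payment number
public int GetDistributionCount(string connectionString, string paymentNumber)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM PM10100 WHERE VCHRNMBR = @VCHRNMBR", conn))
    {
        cmd.Parameters.AddWithValue("@VCHRNMBR", paymentNumber);
        conn.Open();
        return (int)cmd.ExecuteScalar();
    }
}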

Another step they can take is to recreate the eConnect stored procedures, which Mariano discussed on his blog.  But I don't know if that will replace a modified procedure or resolve whatever issue might exist with eConnect.

My last option is to review the taPMManualCheck and taPMDistribution stored procedures to figure out how default distributions are generated.  We can then query the same default account values in GP to see if there is some bad value in the company database.

In the 8 or 9 years that I've worked with eConnect, I haven't seen anything like this, so I'm puzzled and intrigued.

I'm sure there is an explanation, but it looks like we're going to have to work for the answer to this one.  My guess is that there is some issue with the default account values, but I just can't imagine what scenario would cause this problem.

I'll post an update if I learn more or if we figure out the cause.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Tuesday, December 16, 2014

C# vs. VB for Dynamics GP .NET Development

By Steve Endow

At the reIMAGINE 2014 conference, Andrew Dean and I gave a presentation on VS Tools and .NET development for Dynamics GP.

At the beginning of the session, we asked the crowd several questions to get an idea of who were consultants vs. developers, who used Dex and/or .NET, and to assess experience, etc.

One question I asked out of curiosity was, for the .NET developers, who used C#, and who used VB?

The vast majority of the room raised their hands to indicate that they used C#.  It seemed that almost all of the attendees had previously indicated that they developed with Dex, so it surprised me to see that nearly all of them also developed in C#.

When I asked about VB, I believe only 3 people indicated that they were solely VB developers.  This also surprised me.  I have known several people who were VB-only developers, so I expected more of a mix, but based on our very informal survey, perhaps 95% of the Dynamics GP .NET developers focused on C#.

Interesting...


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Buttons mysteriously resize on VS Tools Dynamics GP Forms -- VST template bug?

By Steve Endow

Sometimes strange things happen when you are developing software that can be quite puzzling.  A window may behave oddly, data may get updated in an unexpected way, or an application may crash.  I tell myself that there is always an explanation, but sometimes these things may seem nearly impossible to figure out.

Last year I developed a VS Tools AddIn in Visual Studio 2010 that worked just fine.  I later added a button to a "Dynamics GP Form" in that AddIn, and it seemed to work fine.  But after some testing, the client noted that the text on one of the buttons was cut off, and asked me to make the button wider.  I thought I had already resized that button, but when I checked it in Visual Studio, sure enough, the button wasn't wide enough.

I made the button wider, confirmed that the button text displayed correctly, and sent the customer a new version of the AddIn DLL.  A few days later, they said the button text was still being cut off.  When I opened Visual Studio, the button size had changed and was again cutting off the text.  What in the world?

After trying various things, I discovered that whenever I recompiled the VST project, two buttons on the form would resize to the default button width.  One button was wider and would shrink.  The second button was narrower, and it would get wider.  No matter what I tried, the buttons would resize after being recompiled.

I finally gave up and added some run-time code to resize and reposition the buttons (something along the lines of the sketch below).  That worked, but clearly there was some underlying issue.
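
A sketch of that kind of workaround (the button names, sizes, and positions are placeholders, not the actual values from that project):

//Runs in the form's Load event, after the designer has (incorrectly) sized the buttons
private void MyGPForm_Load(object sender, EventArgs e)
{
    //Force the widths the designer keeps losing
    btnProcess.Width = 120;
    btnCancel.Width = 75;

    //Reposition the second button relative to the first
    btnCancel.Location = new System.Drawing.Point(btnProcess.Right + 6, btnProcess.Top);
}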

Fast forward to December 2014, and I'm now seeing the same symptoms in a brand new AddIn project using Visual Studio 2013.

This is what the form looked like when I originally created it.


After doing some testing, I noticed that the button had resized and looked like this.


The first time I thought I had just forgotten to resize the button, but the second time I realized that I wasn't imagining things.  The button size was definitely changing.

But with this new project in VS 2013, the button was not resizing when I recompiled, as it had in VS 2010.

After trying a few things, I discovered that when I closed Visual Studio 2013 and reopened the solution, the button would resize.


While this may seem trivial, try designing a complex window with a dozen buttons that all resize every time you reopen your project.  It's a nightmare.

My only guess is that this is some type of obscure bug in the Dynamics GP VS Tools form templates, as I have only seen this issue with the Dynamics GP Form template windows.

Since I have observed that the Dynamics GP Form template has a bug with the window startup position property, I'm assuming that this is another small but highly annoying bug in the form template.

What I need to try next is working with this project on a different server to see if it might be some problem with this development server.

Has anyone else experienced this issue?


UPDATE:  I set up VS 2013 on a separate server and was able to reproduce the issue.  I size the buttons, save the project, then close Visual Studio.  When I reopen the project and view the form, the buttons have resized.

After further testing, I found that if I just close the form designer window and then reopen it, the buttons resize.

And this issue only occurs with the Dynamics GP Form objects.  If I add a standard Windows Form to the AddIn, the buttons do not resize.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Friday, December 12, 2014

Executing SQL DDL scripts from a Dynamics GP VS Tools AddIn

By Steve Endow

I've developed many Dynamics GP Visual Studio Tools AddIns that depended on one or more SQL scripts.

Simple single-line SQL statements are easy to include in your code, but what about more complex queries?  If I have large queries that are more than a few lines, I typically try to push those out to stored procedures.  Stored procs are easier to maintain outside of code, and calling a stored proc from a VS Tools AddIn is cleaner and simpler than maintaining a large query in Visual Studio.

Okay, so that may be fairly obvious, but how do you deploy those stored procedures?  Previously, I've included them as SQL scripts that had to be manually executed as part of the deployment process.  While this was simple and low tech, it is easy to forget a script or two if you have a complex deployment.

And what about Data Definition Language (DDL) scripts?  What if you need a custom table for your customization or integration, or several?  What if you don't want to have the client manually run 10 SQL scripts as part of your AddIn deployment?  And what if your AddIn needs those tables and objects to be set up in every Dynamics GP company database?  And what if the client has 80 company databases?  Not a fun deployment.



After all, the great thing about AddIns is that the ideal deployment can be as simple as copying a single DLL file to the GP AddIns subdirectory.  If you have to manually run SQL scripts, you start to lose that benefit.

This week I am working on a new VS Tools AddIn that will have at least 2 custom tables.  I was tired of dealing with SQL scripts as part of my deployment, so I was determined to fully automate all SQL scripts.  I want my AddIn DLL to automatically detect whether the required stored procedures and tables are present, and if not, automatically create them.  I've done this once in the past, but it was messy and a nightmare to maintain.  I wanted a better, cleaner method that was easy to maintain.  Fortunately, I found one.

I first wrote this simple method to check if a table exists:

public bool PSV10000Exists()
{
    //Check if table exists
    string commandText = "SELECT COUNT(*) AS TablePresent FROM DYNAMICS.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'PSV10000'";
    SqlParameter[] sqlParameters = null;

    string result = DataAccess.ExecuteScalar(CommandType.Text, commandText, sqlParameters);

    if (result == "1")
    {
        return true;
    }
    else
    {
        return false;
    }
}


You could abstract this to be "TableExists" and have the method accept a database name and table name as parameters, but this was my prototype.
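
For example, a generalized version might look something like this (just a sketch, reusing the same DataAccess.ExecuteScalar helper; note that a database name can't be passed as a SQL parameter, so it is concatenated into the query):

public bool TableExists(string databaseName, string tableName)
{
    //Check if the table exists in the specified database
    string commandText = "SELECT COUNT(*) AS TablePresent FROM [" + databaseName + "].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @TableName";

    SqlParameter[] sqlParameters = new SqlParameter[] { new SqlParameter("@TableName", tableName) };

    string result = DataAccess.ExecuteScalar(CommandType.Text, commandText, sqlParameters);

    return result == "1";
}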

Great, so now you know whether the table exists.  What if it doesn't?

Well, then you just execute the DDL script to create the table, right?

Nope.  Not that easy.

ADO.NET is great for straight queries or individual / discrete DDL commands, but if you have a complex statement with several "GO" commands included, it can't parse the command and will throw an error.

My table creation script has several commands, each followed by a GO command.

CREATE TABLE
CREATE NONCLUSTERED INDEX
ALTER TABLE ADD  CONSTRAINT  (several of these)
GRANT


So how do you run a multi-step script with numerous GO commands?

It turns out that there is a different "ExecuteNonQuery" method in .NET, tucked away in the SQL Server Management Objects (SMO) assemblies.  SMO allows you to execute complex multi-step SQL scripts, including DDL scripts with multiple GO commands.

To use SMO, you first need to add four SMO references to your project.  The DLLs are available at:

C:\Program Files\Microsoft SQL Server\120\SDK\Assemblies\

And you'll need to add these references:

Microsoft.SqlServer.ConnectionInfo.dll
Microsoft.SqlServer.Management.Sdk.Sfc.dll
Microsoft.SqlServer.Smo.dll
Microsoft.SqlServer.SqlEnum.dll

Once you have those references set up, you'll need to add these using statements to your Data Access class.

using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;


With that taken care of, you can create a simple method to call the SMO version of ExecuteNonQuery.  Here is mine, which relies on the Connection property of my DataAccess class to get a connection to the GP SQL server via GPConnNet.

public static int ExecuteDDLScript(string commandText) 
{
    SqlConnection gpConn = new SqlConnection();
    try
    {
        gpConn = Connection;  //Get connection from GPConnNet
        Server server = new Server(new ServerConnection(gpConn));
        int result = server.ConnectionContext.ExecuteNonQuery(commandText);
        return result;
    }
    catch (Exception)
    {
        throw;  //rethrow without resetting the stack trace
    }
    finally
    {
        gpConn.Close();
    }
}


So now you have all of the plumbing to execute a DDL script.  Now what?

Let's start by getting our DDL script into Visual Studio so that our code can access it.  But where do you store a 20-, 30-, or 50-line SQL script?

I am a huge fan of the Visual Studio Resources feature, which is a fantastic place for storing and managing large SQL scripts.


You define a resource name, paste in your entire SQL script, and add some comments if desired.  I like to note the version number of the scripts, as I often have to refine them during development.

Once you have your Resources set up, you can then put it all together in a simple method that actually executes the script.

I decided to create a new SQLObjects class, separate from my DataAccess class.

In my SQLObjects class, I created this new method.

public bool CreatePSV10000()
{
    string commandText = Properties.Resources.sqlCreatePSV10000;
    int result = DataAccess.ExecuteDDLScript(commandText);

    if (result == -13)
    {
        return true;
    }
    else
    {
        return false;
    }
}

Look at how simple that is!  It's essentially two lines to create your table if it doesn't exist.


No manual SQL scripts required, and your AddIn can do all of the work automatically, greatly simplifying your deployment.
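
Putting the pieces together, the AddIn startup code only needs something like this (a sketch, assuming the existence check lives on the same SQLObjects class):

//Run once when the AddIn initializes: create the custom table if it isn't already there
SQLObjects sqlObjects = new SQLObjects();

if (!sqlObjects.PSV10000Exists())
{
    if (!sqlObjects.CreatePSV10000())
    {
        //Creation failed--log it or warn the user before the AddIn tries to use the table
    }
}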

You may have noticed the -13 value, which is a bit odd.  In my testing, that appears to be the integer returned by the SMO ExecuteNonQuery method when the script is run successfully.  I haven't yet been able to find a list of the return codes for the SMO ExecuteNonQuery method, so I'm still researching to see if -13 is an unconditional success response, and what other values I need to look out for.

Now that I have this method setup, I look forward to never having to include SQL scripts in my deployment guides again!


UPDATE:  I looked again at the documentation for the SMO ExecuteNonQuery--specifically the ServerConnection.ExecuteNonQuery method.  Regarding the integer return value, it says:

The return value specifies the total number of rows affected by the Transact-SQL command for UPDATE, INSERT, and DELETE statements. For all other types of statements, the return value is -1.

Since I am running a DDL script, the return value should be -1, whereas I was getting -13.  Then I realized that my DDL script consisted of numerous statements separated by GO commands, which made me wonder if I happened to have 13 of them in my script.  I did a test where I removed all of the GO commands--my script still worked fine, and the return value was -1.  When I added 4 GO commands, the return value was -4.

So I learned why I got a return value of -13, and I also happened to learn that the GO commands in a DDL script generated by SQL Server Management Studio are not actually required.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter







Tuesday, December 9, 2014

Fun with eConnect and .NET decimal rounding

By Steve Endow

There are situations where you need to import transactions into Dynamics GP, but the source data contains line item totals that need to be broken down before they can be imported.

Since that probably didn't make a whole lot of sense, let me give an example.

One of my clients needs to import SOP Invoices.  Due to the nature of their business, the invoice data to be imported into Dynamics GP consists of item number, quantity, and extended price.  There is no unit price in the data.

For example:

Item:  A100
Quantity: 231
Amount:  $53,981.87

To import the line item using eConnect, you need to calculate the unit price.  For this example, $53,981.87 / 231 = $233.6877..., which rounds to $233.69.  Pretty simple, right?

Well, if you verify the unit price:  231 x $233.69 = $53,982.39.  Note that this is $0.52 more than the actual line amount.

Fortunately, and surprisingly, eConnect (and the Dynamics GP client) is okay with this scenario.  You can import the calculated unit price and the extended price, even though there is a discrepancy due to rounding.

Easy peasy lemon squeezy, right?

Well, if your integration was developed with .NET, there is a caveat.  When you perform the calculation to get the unit price, you will need to round the resulting decimal value.  And this can cause a problem.

Let's use a different example for this one.

Item:  A200
Quantity: 8
Amount:  $2,089

So to get unit price:  $2,089 / 8 = $261.125

And then you dutifully round that value to two decimals with this code:

sopLine.UNITPRCE = Math.Round(lineAmount / lineQuantity, 2);

And you get a unit price of:

$261.12

Say what?

Every kid knows that .125 should round up to .13, right?

So what in the world is going on?

Well, the .NET Math.Round method has two different "rounding conventions", as this article explains.

The default rounding convention is "To Even", also known as banker's rounding.  This means that a midpoint value like .125 is rounded to the nearest even digit, so it becomes .12.

If you send a quantity of 8, with a unit price of $261.12, and an extended price of $2,089, eConnect will return the following error:

Error Number = 722  Stored Procedure= taSopLineIvcInsert  Error Description = Unit Price calculation does not match out to Extended price

It took me a while to figure this out, but eConnect uses "conventional" rounding, and in this case expects $261.13.

So how do we fix this?  Well, the Math.Round method has a parameter to change the rounding convention.

Math.Round(lineAmount / lineQuantity, 2, MidpointRounding.AwayFromZero);

The MidpointRounding option allows you to tell .NET to use the more conventional rounding method which Microsoft calls "Away From Zero".

This will produce the desired result of a unit price of $261.13 and eliminate the somewhat perplexing eConnect Error Number 722.
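
If you want to see the difference for yourself, here is a tiny standalone snippet (not part of the integration) that compares the two conventions:

using System;

class RoundingDemo
{
    static void Main()
    {
        decimal lineAmount = 2089m;
        decimal lineQuantity = 8m;
        decimal unitPrice = lineAmount / lineQuantity;   //261.125

        //Default "to even" (banker's) rounding: 261.12
        Console.WriteLine(Math.Round(unitPrice, 2));

        //"Away from zero" rounding: 261.13, which is what eConnect expects
        Console.WriteLine(Math.Round(unitPrice, 2, MidpointRounding.AwayFromZero));
    }
}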

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter







Wednesday, December 3, 2014

Diagnosis: Developer's Amnesia

By Steve Endow

Many years ago I was part of a corporate web development team that maintained the internal and external web sites of a Fortune 1000 corporation.  We did classic ASP web development, with projects like product catalog pages on the external web site, registration web sites for customer conferences, and internal document management and knowledge base sites.

One Monday, one of the business analysts I had been working with stopped by my gray corporate cubicle and asked if I had completed the XYZ web page.  (I don't recall exactly what it was.)

"The what?", I replied.

"The XYZ web page.", she repeated.

I furrowed my brow, looking at her like she might be talking to the wrong person.

"XYZ?", I asked, clearly puzzled.

"Ya, what you worked on last Friday", she slowly said, picking up that I had no idea what she was talking about.

"When?", I asked, starting to get concerned that one of us might be crazy.

"Friday.  Remember Friday?", she said, starting to get concerned.

"Um...", I said as I paused and thought.  "Um, Friday...ya, I've got nothing" I admitted.

She then proceeded to explain that she had talked to me on Friday afternoon and asked me to make some change to a web page.  It was a last minute thing, and I was the only one around, so we apparently talked about the change that she needed done ASAP.

After hearing the story, I turned to my computer, looked up the web page, checked the code, and sure enough, there were my changes, just as she had requested.  I had apparently completed the change that she requested.

I stared at the code, which I clearly wrote, with zero recollection of making the change and without recognizing the code.  I made the change, but as far as I was concerned on that Monday, I was staring at someone else's code.

Slowly it came back to me.  Friday afternoon, looking to go home, and I had to quickly make a change to a web page.  I then vaguely recalled the conversation with her just over 2 days ago.  I recalled the general request, but had no memory of any of the details.

At the time, I chalked it up to doing the work on a Friday afternoon and then having a great weekend where I didn't think about work for 2 seconds.  (Oh, how I miss those...)

But I now think that scenario is a symptom of what I'll call "Developer's Amnesia".

This came up because I was working on a Dynamics GP integration yesterday.  It was a modification to an existing integration I had developed between a web site and Dynamics GP, and I needed to import an AR cash receipt.

I emailed my colleague on the project, asking what values should be used for the cash receipt type, credit card ID, and checkbook ID.  She replied, "Why don't you just use the same values that you used on the other cash receipt import that we did for the client?"

"Huh?"  I re-read her email, thought about it for a few seconds, and a few details slowly started to bounce around my brain.  "Um, ya, sure, that makes sense, forgot about that!" I sheepishly replied.

Then I connected to my development server and pulled up the Visual Studio code for that project.  I checked the date on the files, and saw that I had created the last project 7 weeks ago, and last updated it 5 weeks ago.  Somehow, in those 5 weeks, my memory of the project had been wiped.  Totally wiped.

I definitely remembered the project--the name and what it did in general, but somehow I forgot that it involved an AR cash receipt.  I pulled up the code and reviewed it, and I felt like I was having an out of body experience.

I was looking at the code, but I didn't recognize it.  At all.  Did I write this?

I slowly remembered some of the objects and methods, but it was a general vague recollection.  When I reviewed some specific lines and object properties, I had zero recollection of the details.

The funny part was that the code was beautiful!  "Wow, this guy can code!  Who is this C# genius?"  It had clean objects, clean methods, and an elegant design.  But there was enough complexity that in a few areas it wasn't obvious how some things worked.  "Hmm, this guy needs to comment his code better!" I thought ironically.

Apparently the code was all pretty obvious at the time, but since then, I had clearly been stricken with Developer's Amnesia.  After 30 minutes of working with the code, I'm still piecing it all back together.

I emailed a developer friend, explaining what just happened and asking him if he has ever had such an experience.  He quickly replied that he's had the exact same experience, several times, and that he knows exactly how I feel.  Well, at least that is somewhat comforting, knowing it isn't just me.

I then recalled that conversation I had over 10 years ago with that business analyst, which made me wonder how I could forget something that involved thousands of lines of code so completely.  Ten years ago I was that much younger, so I can't blame age.  And there are tons of useless trivial details that I remember from other projects I worked on years ago, so I am pretty sure I'm not forgetting everything.

My only guess is that I experience Developer's Amnesia when I complete a project quickly and have little or no follow up work.  My theory is that if I only spend several intense days working on something and slam it out, the project never transfers from my short term memory to my longer term memory.  Once I release the project and it is deployed smoothly without any follow up work, I move on to the next project, and my short term memory of that project gets wiped.

For projects that I work on part time over the course of a few weeks, more context develops and there is more time for the project to get transferred to my long term memory.  While I obviously can't remember every detail, I can recall most of the project.  There are long term projects I've worked on where I remember a surprising number of details, to the point where the customer emails me to ask how an integration works, and I explain the details that we discussed 3 or 4 years ago.

Whatever the reason, it seems that there are times when developers just completely forget some project or some code that they've written.  So when you ask a developer a question and they look at you funny and act like they think you're making things up, they might just have Developer's Amnesia.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter