Tuesday, January 17, 2017

Benchmarking GL batch posting times in Dynamics GP using DEX_ROW_TS?

By Steve Endow

I just finished a call with a customer who seems to be experiencing relatively slow GL batch posting in Dynamics GP.

We were reviewing records for the GL batch in the GL20000 table, and out of curiosity, I happened to look at the DEX_ROW_TS values.  For a GL batch that had a total of 1,200 lines, the difference between the minimum and maximum DEX_ROW_TS values was just over 60 seconds.  So my interpretation is that it took GP over 60 seconds to perform the posting and copy the records from GL10000 to GL20000, with the DEX_ROW_TS time stamps reflecting that processing time.

There could be many reasons why DEX_ROW_TS isn't the most accurate measure of actual batch posting times, but I was curious if it could be used as a way to roughly and quickly benchmark GL batch posting times.

I didn't know if 60 seconds for a 1,200 line JE was fast or slow, so I performed a few tests on one of my development VMs.  I created two test batches:  One had 150 JEs with 8 lines each, and the other had 300 JEs with 4 lines each.  So each batch had 1,200 lines.  I posted both batches, and then ran this query on them:


SELECT MAX(ORGNTSRC) AS Batch, COUNT(*) AS Rows,
       MIN(DEX_ROW_TS) AS StartTime, MAX(DEX_ROW_TS) AS EndTime,
       DATEDIFF(ss, MIN(DEX_ROW_TS), MAX(DEX_ROW_TS)) AS SecondsElapsed
FROM GL20000
WHERE ORGNTSRC = 'TEST150'
UNION
SELECT MAX(ORGNTSRC) AS Batch, COUNT(*) AS Rows,
       MIN(DEX_ROW_TS) AS StartTime, MAX(DEX_ROW_TS) AS EndTime,
       DATEDIFF(ss, MIN(DEX_ROW_TS), MAX(DEX_ROW_TS)) AS SecondsElapsed
FROM GL20000
WHERE ORGNTSRC = 'TEST300'

(If you use this query, note that if the same batch ID has been used more than once, you will need to filter the query further to ensure you only measure a single posting of that batch ID.)
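One hedged way to isolate a single posting of a reused batch ID is to bound the query by a DEX_ROW_TS window around the posting you care about. The date/time literals below are placeholders; you would substitute a window that brackets the posting in question:

```sql
-- Sketch only: measure one posting of a reused batch ID by restricting
-- the query to an assumed DEX_ROW_TS window. The date/time values are
-- placeholders, not real data.
SELECT MAX(ORGNTSRC) AS Batch, COUNT(*) AS Rows,
       MIN(DEX_ROW_TS) AS StartTime, MAX(DEX_ROW_TS) AS EndTime,
       DATEDIFF(ss, MIN(DEX_ROW_TS), MAX(DEX_ROW_TS)) AS SecondsElapsed
FROM GL20000
WHERE ORGNTSRC = 'TEST150'
  AND DEX_ROW_TS >= '2017-01-17 09:00'
  AND DEX_ROW_TS <  '2017-01-17 10:00'
```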

Here are the results:


As you can see, my test batches showed DEX_ROW_TS elapsed times of 6 and 8 seconds, respectively.  So my test JEs appear to have posted significantly faster, in as little as one-tenth the time of the customer's batch.

It's no surprise that my test in the virtually empty TWO database will show faster times than a large production database, but 6 seconds vs. 60 seconds is a pretty big difference.  And having worked with hundreds of customers to automate their Dynamics GP posting processes using Post Master, I am pretty sure that this customer is seeing less than optimal SQL performance, and that I'll be having a few more support calls with them in the future.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+




Dynamics GP obscurity when voiding AP payments in a high volume environment

By Steve Endow

I seem to frequently work on unusual and obscure tasks and issues in Dynamics GP, and I discovered another one recently.

I have a large Dynamics GP customer that issues thousands of AP payments every month.  The payments are issued to dozens of countries using every payment mechanism imaginable.  Checks, ACH, wires, debit cards, PayPal--you name it.  The payments are issued from both Dynamics GP and through at least one third party international payment processing service.

The company issues so many payments in so many forms that they have a very interesting and challenging problem.  The problem stems from the fact that they regularly encounter situations where the payment is not successfully delivered to the vendor.  Maybe the check was returned as undeliverable.  Perhaps the ACH info wasn't correct.  Maybe the PayPal email address was wrong.  Given the number of different payment methods they use, sometimes they discover this in a few days, while sometimes it takes a few months to be notified that a payment was not successfully delivered.  Given their high payment volume, the challenge this creates is having to void hundreds of payments a month in Dynamics GP so that they can re-issue the payment.

The void process is so time consuming for them that they asked me to develop a solution that could automatically void payments in GP.  I developed that solution, which is a very long story on its own, but in the process of testing, I discovered an unusual scenario that made it difficult to automatically void a payment.

The issue is that it is possible to issue the same check or payment number from multiple checkbooks in Dynamics GP.  This isn't something I had considered before.  So if I pay a vendor with Check 100 from Checkbook 1, and then later happen to pay that same vendor with Check 100 from Checkbook 2, the vendor now has two payments with the same check number.  Given the number of GP checkbooks, the number of payment methods used, and the fact that a third party payment processor is involved, I couldn't rule out this possibility.
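To gauge whether this scenario exists in your own data, a query along these lines would list vendors that have the same payment document number issued from more than one checkbook. This is a sketch, assuming historical payments are stored in PM30200 with document type 6 and that CHEKBKID holds the issuing checkbook:

```sql
-- Sketch: vendors with the same payment document number issued from
-- multiple checkbooks. Assumes PM30200 (PM paid transaction history),
-- DOCTYPE 6 = payment, and CHEKBKID = issuing checkbook.
SELECT VENDORID, DOCNUMBR, COUNT(DISTINCT CHEKBKID) AS Checkbooks
FROM PM30200
WHERE DOCTYPE = 6
GROUP BY VENDORID, DOCNUMBR
HAVING COUNT(DISTINCT CHEKBKID) > 1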

Here's an example of what that scenario looks like in the Void Historical Payables Transactions window.


Even if you filter the vendor and the document number, the window displays multiple payments.  In the screen shot, I used the extreme example of payments with the same date and amount.  In this case, the only way to tell the difference between the two payments is by the internal GP Payment Number value.

A user who is manually performing a void would have to select a row in the scrolling window and click on the Document Number link to drill into the payment and see which checkbook was used to issue it.  But because the Checkbook ID is not shown on the window, an automated solution that only looks at the data in the scrolling window cannot tell which payment should be voided.  So I'm probably going to have to enhance the automated solution to verify the date and amount shown in the grid record, and also look up the payment number to determine which checkbook issued the payment.
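As a sketch of that extra lookup, something like the following could resolve the checkbook behind each of the ambiguous payments. The vendor ID and document number below are hypothetical, and I'm assuming the internal GP payment number is carried in VCHRNMBR for DOCTYPE 6 rows in PM30200:

```sql
-- Sketch: resolve which checkbook issued each payment sharing a document
-- number. Assumptions: PM30200 holds historical payments as DOCTYPE 6,
-- VCHRNMBR carries the internal GP payment number, CHEKBKID the checkbook.
SELECT VENDORID, DOCNUMBR, VCHRNMBR AS PaymentNumber,
       CHEKBKID, DOCDATE, DOCAMNT
FROM PM30200
WHERE DOCTYPE = 6
  AND VENDORID = 'ACME0001'   -- hypothetical vendor ID
  AND DOCNUMBR = '100'        -- hypothetical document number
ORDER BY CHEKBKID
```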

One could reasonably say that it is unlikely that a vendor would be issued two payments from two checkbooks with the same payment number.  I would have previously agreed, but the fact that this issue happened to come up in my limited testing on my development server would seem to indicate that it could be more likely than you might think.  And if you've worked with ERP systems long enough, you know that if an obscure problematic situation can arise, it usually will.

I thought that this was a good example of how flexible functionality in Dynamics GP, combined with unexpected or complex scenarios, can produce situations that require custom solutions, even if those situations seem unlikely.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+

Monday, January 16, 2017

What Are Your Software Resolutions?

Some of my favorite clients, when I walk in their door every few months, ask "What's New Out There?" and "What Are People Doing?"  I will admit, I just love the continual growth mindset.  Although it does take time and energy (and money) to leverage your software to its fullest potential, I find that clients who take this on as part of software ownership are generally happier and more satisfied than those who tend to stagnate, never looking at new approaches or add-ons, or taking care to expand their use of new functionality as appropriate. 


So along these lines, I thought I would put together my top 5 software resolutions.  Although written with Dynamics GP and CRM in mind, these really can apply to a myriad of software solutions and vendor relationships you may have.


  1. Stop expecting software to do more without you contributing more: Whether it is time, expertise, or money (in the form of consulting dollars or add-on software), your software package will only expand and do more for you if you are willing to contribute.  Some of my clients who do the best with this resolution have monthly GP user meetings (internally) to discuss issues and goals and also participate in GPUG and other groups to knowledge-share.  In organizations that don't regularly do this, it's not unusual to hear about them simply implementing another product a few years down the road and starting the cycle again.
  2. Build a partnership with your VAR/Consultant.  No one likes to have a combative relationship (consultants, too).  Understand that your partner is there to help you, and in most cases wants to make sure you are happy with them as well as the software.  So look at how you engage with them: do you do it in a proactive way? Do you ask them what they think of how you are using the software?  Ask for their help in more strategic ways, such as how you might better use support or even cut your support costs through training or other avenues.
  3. Set a budget for ongoing software enhancement.  And I am not just talking about service packs and upgrades, although it can be bundled in with those costs.  With each new release, there is new functionality and we (partners/consultants) want you to be able to take advantage of it.  But in a lot of cases, clients simply budget upgrades like service packs with no consulting services beyond the upgrade.  Even consider inviting your consultant/partner out once a year for the sole purpose of asking "What could we be doing better with our software and processes?".  You might be surprised by their answers.
  4. Reset your approach to training to be an ongoing process, not a one-time event.  I know users who have used GP for 10+ years but still find training classes, webinars, and other events to attend every year and leave excited about how they can improve their use of the software.  Join GPUG.  Go to conferences.  Treat training as something you do every year, not just when you add a new employee or implement a new module.
  5. Recognize that software won't solve all of your issues.  Above I mentioned clients who have monthly internal GP user meetings. These opportunities can also be opened up to include accounting and business processes, even those that fall outside of the system.  What is working?  What isn't?  And can software help? Or do you need to consider internal changes?  Approaching issues with an open mind, and recognizing that sometimes internal/institutional change is needed (with or without software) can help you make positive change in your organization.
What would be on your resolution list? I am interested to hear from you all!


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Sunday, January 8, 2017

Portable DIY Surface Pro Table Top Stand / Lectern: Computer Woodworking

By Steve Endow

Last week MVP Jen Kuntz posted a neat update on Twitter with some photos of a cool sliding door that she built.

Following her lead on the woodworking post, I thought I would write a post about a small woodworking project that I worked on today.  Computer related, no less!

I needed some type of table top stand for my Surface Pro 4.  I have a situation where I need to work on my Surface Pro while standing, but the space where I'll be working only has a small table.

I didn't want, or need, a typical boxy table-top lectern.  I wanted something simple, compact, portable and light, that I could quickly set up for use, and then easily fold up and put away.  Unlike a typical lectern with an angled top, I wanted a flat surface so that my Surface Pro and my mouse would not slide off. (If you've done presentations with a typical angled lectern, you know what I'm talking about.)

I fired up SketchUp and quickly came up with this simple design, which has a flat top and folding support legs.  I wanted to keep it as simple as possible so that I could quickly build it this afternoon with as little wood and as little effort as possible.

The folding legs would be attached to the back piece with hinges so that they could be moved into place to support the top, and the top would be on a hinge as well, allowing it to fold down.


After some initial testing, I realized I needed to add a folding stand in the back to prevent it from tipping backwards.


With the legs folded flat, the hinged top folds down flat, and the top has a convenient carry handle.  I figured this would make it very easy to set up, and then I could fold it up in 2 seconds and easily store it out of the way, taking up minimal room.


With my rough design in hand, I headed out to the wood pile. Um, I mean my garage.  If you are a woodworker, or know any woodworkers, you probably know that we hate to throw away perfectly good scraps of wood.  You never know when that small offcut will come in handy!

Fortunately, I had the perfect scraps for the project.  I had a scrap of maple plywood that was almost exactly the dimensions of the top, a nice piece of poplar for the center back, and I had just enough select pine scraps for the folding legs.

The select pine was slightly narrower than my SketchUp design, so I had to adjust my dimensions a bit on the fly, but it worked out just fine.


I cut the pieces to length on the miter saw, and things were looking good.  To save time, I didn't bother to taper the legs, like what is shown in the design.


To join the folding legs, I used my Festool Domino, but pocket hole screws would probably work fine as well.


The Domino is a bit tedious to setup, but the results are Extra Fancy.


With the legs glued and assembled, I clamped them up and then moved on to work on the top piece.


The scrap of plywood was so close to my design dimensions that I didn't even have to cut it--it was ready to go.  I just needed to cut the handle out.

I sketched out the area for the handle and used a large Forstner bit to start the handle hole.



At this point, most people would use a jigsaw to cut the piece between the two holes, but 1) I absolutely hate the jigsaw, and 2) I got a new compact router recently, so I figured I would take the path less traveled and cut out the handle with a spiral upcut bit.


So the router was an interesting choice.  The cut didn't turn out perfectly, but it was convenient and good enough for this project.

I then got Extra Fancy and chamfered the edges of the handle--again, another excuse to use the router.


Then, every woodworker's least favorite task--sanding--to remove any rough edges.


More chamfering around the edges of the top. Because new router!


With all of the pieces done, I did a dry fit of sorts, just to make sure everything looked right.


Then a quick run to Home Depot to pick up some hinges.  If you want to get Extra Fancy, you could go with a piano hinge for just a few dollars more, but I didn't want to spend time cutting the piano hinge, so I opted for the ugly utilitarian hinges.


And with all of the hinges in place, the stand worked perfectly.


And it folded up nice and flat.


It's very lightweight and the handle makes it really easy to carry.


A quick test on a table confirmed that it worked great with my Surface Pro and mouse.


During my initial testing, I noticed that it could potentially tip backwards, so I grabbed another small scrap of plywood (perfect size!) and with the one remaining hinge, added the extra stand on the back to prevent it from tipping over.


To finish it off with a touch of Extra Fancy, I'm going to countersink a few neodymium magnets into the top of the legs and bottom of the table so that the legs will pop into place and be held by the magnets.  I'll probably also add a magnet to the stand on the back to keep it folded flat when closed.

I hope you enjoyed this computer woodworking fusion project!



You can also find him on Google+ and Twitter




Wednesday, December 28, 2016

What Do You Know About Your Customizations?

When I first started consulting over 15 years ago, it seemed like we didn't come across that much customization.   Maybe it was because we didn't have a developer on staff, maybe it was the nature of the clients we served, or maybe it was just indicative of the time.  But over the past 15 years, I've seen a growth in customization and integration on even the smallest of projects.  I attribute this to a number of things, including the growing sophistication (and therefore expectations) of clients (even on the smaller end), the release of additional development tools that decrease the effort involved, and even a changing mindset that customization can help "unleash" the potential of your software. Whatever the reason, it seems like customization at some level has become a norm of sorts. 


Let me also add that when I say "customization" in this post, I am including integrations and interfaces between systems as well.


With clients we have implemented, and those we have picked up over time, the tendency seems to be to "trust the professionals" with the customizations.  While I agree with this on one level, in terms of who should be doing the actual development, I also would emphasize that every client/power user needs to understand their customizations on several levels.  This due diligence on the client/power user side can help ensure that the customization...


  • works in a practical sense for your everyday business
  • is built on technology that you understand at a high level
  • can grow with your business
  • is understood in a way that can be communicated throughout the user base (and for future users)
I often find that over time, the understanding of a customization can become lost in an organization.  Give it 5 years, and current users will bemoan that they don't understand...


  • why they have customization
  • what the customization does
  • how the customization can be adjusted for new needs
While the IT admins will bemoan a lack of understanding of how to support the customization effectively and/or perpetuate misunderstandings regarding the technology and capabilities.


None of this is anyone's fault necessarily, but it does emphasize the need for due diligence anytime you engage with a consultant or developer for a customization (and makes reviewing your existing customizations a worthwhile endeavor).  What are the key parts of the due diligence I would recommend?  Well, you KNEW I was going to get to that!  So here you go...


  1. Every customization you have should have a specification. It doesn't have to be fancy, but it does need to be a document that contains an explanation of the functionality of the customization as well as the technology to be used to develop it.   Ideally, it should also contain contact information for the developer.  I'll be honest, I run into clients wanting to skip this step more than consultants and developers. I suppose this has to do with not seeing the value of this step, or seeing it as a way for consultants to bill more.  But this step has the greatest impact on a client's ability to understand what they are paying for, and on minimizing miscommunication and missed expectations.  If you don't have them for your existing customizations, ask for them to be written now (either internally or by the consultants/developers). On another side note, somewhere the actual process for using the customization should be documented as well. Sometimes this is added to the spec, sometimes it is added to other process documentation. But make sure it happens, so that the customization is included in training efforts internally.
  2. Understand at a high level what technology is being used to develop the customization.  Why do you need to know this?  Well, you need to understand what is involved in upgrading the customization for new versions.  How about adjusting or adding functionality?  Will it mean code writing, or something simpler?  How about source code?  Does the customization have source code, and who owns it (in most cases not the client; the developer/company retains ownership)?  What does that mean if the developer stops working for the company?  Or if you change companies?  Will there be a detailed technical design document to be used if other developers need to be engaged? And is the technology specialized (e.g., Dexterity) or more common (e.g., VB, .NET)? All are important questions that impact the longevity and flexibility of the customization.
  3. Conduct full testing with a method for collecting feedback internally, so that you can ensure that the customization indeed meets expectations and enhances the user experience.  It is not uncommon for a customization to be developed per the specification but in practice still need adjustments to make it truly useful for the users.  When this happens, clients will sometimes "stall out" out of fear of additional costs.  Even though, in the long run, the additional costs that might be incurred at this stage could save frustration as well as future replacement costs when the customization is abandoned.  Just make sure during this point in the project, that the spec and process documentation are updated with changes.
What else would you add to the due diligence for clients and customizations? Let me know!


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.


Tuesday, December 27, 2016

Source code control is a process, and processes are prone to mistakes

By Steve Endow

I previously thought of source code control as just another piece of software--an application that you use that manages the versions of your code.  SourceSafe, SVN, Git, Team Foundation Server, there are many options for software or services that will take care of source code control / version control for you.  As long as you "use" one of those solutions, you're all set.

But today I learned a pretty significant lesson.  It is probably obvious for many people, but it was a bit of a wake up call for me.

"Source code control" is not Git or Team Foundation Server or VisualSVN.  Those are just tools that are just one piece of a larger process.  And regardless of which tool you use or how great that tool is, if you don't have a solid, resilient process surrounding that tool, you will likely experience breakdowns.

Last week I sent out a new version of a Dynamics GP customization.  The new version only had a few minor changes to enhance error handling--I added several try/catch blocks to try and track down an intermittent error that the user was seeing.

The customer tried the new version today, but they quickly noticed that a feature that was added over 3 months ago was gone. Uh oh.

While making the latest error handling changes, I noticed some oddities.  The last release was version 1.30, but the projects in Visual Studio were still set as version 1.21.  I checked the version 1.30 files that were released, and they were versioned properly, so something happened that caused the Visual Studio projects to revert to version 1.21.  I wouldn't have done that manually.

I then checked my Documentation.cs file that I maintain on all of my projects.  The last note I had was for version 1.21.  No notes on version 1.30.  That's not like me, as that's usually my first step when updating a project.

I then checked the Git branch of the project.  Visual Studio was using branch 1.31, but it was only a local branch and hadn't been published to BitBucket.  1.30 was published, but it didn't have any notes on version 1.30 in my local repository or on BitBucket.

I checked the Commit log on BitBucket, and that left me even more puzzled. I didn't seem to have any commits acknowledging the version 1.30 release.


I see check ins for v1.2 and v1.21, and the new v1.31 release, but nothing for v1.30.

Somehow I had produced a version 1.30, with correct version numbers in Visual Studio, which produced properly versioned DLLs, which got released to the customer, but I have the following problems:

1. I either didn't update my Documentation.cs file, or it somehow got reverted to a prior release, causing my changes to be wiped

2. Somehow my Visual Studio project version numbers got reverted from 1.30 to 1.21

3. I can't find any record in the code for the version 1.30 changes

4. Despite having v1.30 and v1.31 branches in Git, I didn't see any changes when comparing them to each other, or to v1.21.

5. I can't find any evidence of a version 1.30 release in BitBucket


The only evidence I have of a version 1.30 release is the separate release folder I maintain on my workstation, where I did document it in the release notes.


And I see that the DLLs were definitely version 1.30, so I'm not completely imagining things.


So somehow, I managed to make the following mistakes:

1. Potentially reverted my code to a prior release and lost some changes

2. Didn't clearly perform a v1.30 check in, or if I did, my commit comments did not indicate the version number like I usually (almost always) do

3. Created a v1.31 branch for an unknown reason that I didn't publish and didn't document.

4. Somehow made what is likely a series of several small mistakes that resulted in the versioning problem that I'm trying to fix today.


The most frustrating part is that it isn't obvious to me how such a roll back could have happened.

And all of this despite the fact that I'm using an excellent IDE (Visual Studio), an amazing version control system (Git), and a fantastic online source code management service (BitBucket).

My problems today have nothing to do with the tools I'm using.  They clearly stem from one or more breakdowns in my process.  And this was just me working on a small project.  Imagine the complexities, mistakes, and issues that come up when 15 people are working on a complex development project.

So today I learned that I have a process issue.  Maybe I was tired, maybe I was distracted, but clearly I forgot to complete multiple steps in my process, or somehow managed to revert my code and wipe out the work that I did.

I now see that I need to invest in my process.  I need to automate, reduce the number of steps, reduce the number of manual mistakes I can make, and make it easier for me to use the great tools that I have.

I don't know how to do that yet, but I'm pretty sure that much smarter people than I have had this same issue and come up with some good solutions.



You can also find him on Google+ and Twitter




Wednesday, December 21, 2016

Approaching Acquisitions with Dynamics GP

Sometimes in this line of work, you get so used to doing things one way that it takes a bit of a jolt to remind you that there are other ways to approach them.  Sometimes that jolt comes from a coworker's comment, or a client asking a question.  I like those moments, because they encourage innovative, creative thinking.  And innovative, creative thinking challenges me and, honestly, makes this job a whole lot more fun!

One example of this sort of situation is with acquisitions.  Specifically, situations where a company on GP is acquired but has plans to stay on GP through the transition.   Typically, this means the following...


  1. Identify closing date of acquisition
  2. Set up a new company database
  3. Transfer over acquired assets and balances as of the transition date

This approach works well when you are dealing with a single company, or maybe a couple.   It works because it's...

  1. Straightforward
  2. Relatively simple
  3. Clean (the client can keep the history of the former company in the old company while the new company starts fresh)
Where this process doesn't work so well is when we start talking about...

  1. Lots o' companies
  2. Lots o' data/modules/customizations/integrations
In these cases, the idea of setting up multiple brand new companies, copying data, and ensuring that customizations/integrations work can be a bit daunting in the midst of an acquisition.  This is doubly true if the customizations/integrations support critical day-to-day business operations.  Those of you that know me know that I don't believe that just because something is "daunting" we shouldn't do it.  But these "daunting" things do mean we have to approach the project with a higher level of due diligence in advance, as well as project management during the project, to mitigate risks.

So what about another option?  Can we avoid setting up all these new companies?  Yes, we can.  It just requires a bit more creative thinking.  As an alternative, we can approach it like this...

  1. Continue forward with same companies
  2. Backup companies at transition date to create historical companies
  3. Remove history as needed from live companies
  4. Enter manual closing entries as of the transition date (assuming fiscal year is not changing, and transition is not fiscal year end)
  5. Reset assets and any other balances as needed (this can be the tricky step, involving scripts to set original cost basis equal to net cost basis, etc., to move forward)

Now, the process above does require due diligence in advance as well to make sure all transition needs are identified and planned for.  But it can save effort and reduce risk in some cases.  So...a solution to consider.  What other creative/innovative approaches have you seen to handling acquisitions in Dynamics GP?  I'd love to hear from you!


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.