Wednesday, November 18, 2015

Want to become a Dynamics GP Visual Studio developer?

By Steve Endow

Today I received a question on Experts Exchange, asking how a developer who knows C# could learn how to develop with Dynamics GP.  Specifically, the person was looking for a class to learn how to develop Visual Studio solutions for Dynamics GP.

I thought it was a good question, but it's one of those questions where there isn't a simple answer.

Here's how I responded.


Unfortunately, I don't believe there has ever been formal Visual Studio developer training or a curriculum for Dynamics GP.  The closest I've seen is custom training conducted by a GP partner at a customer site (I have done such training for a customer).

I think there are two things you'll need to become a GP developer using Visual Studio:

1. Strong knowledge of Dynamics GP design, architecture, quirks, etc.
2. Knowledge of the Dynamics GP developer tools and APIs

#2 is pretty standard developer self-learning.  There is decent documentation and a handful of samples to get started, but some areas will require digging and time to understand and utilize.

Depending on how much you know about GP, #1 will be the more difficult part.  There is a lot of subtle background knowledge that helps inform how to design and develop Visual Studio customizations and integrations for Dynamics GP, but there is no simple training session that can capture it all.

You could start by researching the following:

1. Leslie Vail's book, Developing Microsoft Dynamics GP Business Applications, might be a good place to start to get oriented on the tools and options for GP development.

2. Read the MS documentation for Visual Studio Tools (VS Tools) for Dynamics GP

3. If you foresee a need to do any integrations, look into eConnect and GP Web Services.  I only use eConnect, and haven't found much value for Web Services, but it would be good to be aware of both.

4. Set up Dynamics GP on a development server, or create an Azure VM using one of the Azure Dynamics GP images, to get access to Dynamics GP and start developing some simple projects.

There are a fair number of blog posts and articles online about Visual Studio development for GP, but most cover a single aspect or small topic, so it will take some searching to get a big picture of the options and processes.

In short, you can do it yourself, but it will require some time. If you have the budget to hire an experienced GP developer to work with you one-on-one, that will probably save you time and make the learning curve easier.


If anyone has any additional tips on how to become a Dynamics GP Visual Studio developer, please let me know.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Tuesday, November 17, 2015

eConnect 2015 Error: The pipe endpoint EntityOperations could not be found

By Steve Endow

Back in the now distant year of 2010, I wrote a post about an eConnect error:  There was no endpoint listening at net.pipe.

That has been one of the most frequently accessed posts on my blog.

Well today I encountered a strange variant at a client.  I had upgraded an existing integration from GP 2010 to GP 2015, and the new GP 2015 integration was installed on the same workstation where the 2010 version had been installed.

When the client ran the integration, she received this error:

eConnect error: System.IO.PipeException: The pipe endpoint 'net.pipe://localhost/Microsoft/Dynamics/GP/eConnect/EntityOperations' could not be found on your local machine.

Since this wasn't the usual "no endpoint listening" message, I wasn't sure what the cause could be.

But since it involved the words net.pipe and endpoint, I checked the eConnect 2015 Windows Service to see if it was running.

Interestingly, the service was not running.  After starting the eConnect 2015 service, the user ran the integration again and it worked fine without error.

I have tested on two of my GP 2015 development machines, but when I stop the eConnect service and attempt an integration, I get the standard "There was no endpoint listening at net.pipe" error message.

I'm not sure why the client saw something different, but since it's the first time I've seen it, and since it happened to occur with eConnect 2015, I thought I would document it in case someone else comes across this variant.

If you do see it, check the eConnect Windows service and see if it is running.


Monday, November 16, 2015

Supporting a Dynamics GP ISV solution involves far more than just product support

By Steve Endow

While reselling and supporting Post Master for Dynamics GP, I've worked with hundreds of customers who have reviewed, installed, tested, and implemented Post Master.

Most customers or partners do everything themselves without requiring any assistance, while a few need some assistance with the configuration and testing.

Then there are some that have an issue and need actual support.  But by support, I don't just mean Post Master support.  That would be easy.

Over the course of the last 6 years supporting Post Master, I've seen some of the strangest issues, many of which had nothing to do with Post Master, yet I've had to help the customer resolve them so that they could use Post Master in their environment.

For example, this is one of my favorites:

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)

I've had a handful of customers with this issue--for some reason there is a TCP/IP network error when Post Master attempts to query SQL Server.  If you Google the error, you'll see that lots of people have this issue under many different circumstances.  It is a low-level network error returned by Windows or SQL Server, and there is very little I can do to fix it directly.  It could be a network driver, a NIC, a switch issue, or maybe even a wiring issue.  But because Post Master happens to be the application that detects and logs the error, the customer expects me to be able to resolve it.

Another very common challenge is setting up SMTP so that Post Master can email posting reports or error notifications.  SMTP poses many challenges, but the most common is that most non-IT folks don't understand SMTP (and even some technical folks don't understand it).  With a name like "SMTP" this isn't surprising.  SMTP Server, Port, and SSL / TLS are somewhat technical values that a typical end user wouldn't know about.  And then you have SMTP relaying, which adds another layer of complexity.  And then there is relaying to internal vs. external email addresses.
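For what it's worth, those settings boil down to just a handful of values.  Here's a minimal sketch using Python's smtplib as a stand-in (the server, port, and addresses are all made-up examples, and Post Master itself is a .NET product, so this is purely illustrative):

```python
import smtplib
from email.message import EmailMessage

# Hypothetical values -- these are exactly the settings users get stuck on
SMTP_SERVER = "mail.example.com"   # the SMTP Server
SMTP_PORT = 587                    # 25 = plain, 465 = implicit SSL, 587 = STARTTLS
USE_TLS = True                     # the SSL / TLS setting

msg = EmailMessage()
msg["From"] = "postmaster@example.com"
msg["To"] = "accounting@example.com"   # internal vs. external matters for relaying
msg["Subject"] = "Batch posting report"
msg.set_content("Batch INV12345-1 posted successfully.")

def send(message):
    # Relaying to an external address fails here with "5.7.1 Unable to relay"
    # if the server is not configured to allow it.
    with smtplib.SMTP(SMTP_SERVER, SMTP_PORT) as smtp:
        if USE_TLS:
            smtp.starttls()
        smtp.send_message(message)
```

Port 25 with no encryption, 465 with implicit SSL, and 587 with STARTTLS are the common combinations; whether that final send succeeds for an external recipient depends entirely on the server's relay configuration.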

A customer just contacted me, saying that emails to internal addresses are delivered fine, but emails to external addresses generate an error.  Looking at the Post Master logs, I found this explanation:

The server response was: 5.7.1 Unable to relay

The customer's SMTP server only allows relaying to internal addresses, so I asked the user to speak with their IT team to determine the best approach for relaying messages to an external address.  Unless you've administered an Exchange or SMTP server (which fortunately I have), these concepts aren't obvious.

Then there is SQL Server security.  Post Master has some custom SQL tables and stored procedures, and a custom database role to manage access to those objects, independent of DYNGRP.  We have had a few customers with very strict SQL Server security auditing that automatically deletes all SQL objects that are not on an approved list.  No joke.  Or manual audits that resulted in security roles and permissions being deleted.  So we've had to figure out why Post Master's SQL security or objects disappeared, determine that a security audit caused the problem, and request reviews and exemptions from the audit team.

We had one government client that was having a FIPS compliance audit who needed us to modify the internal Post Master encryption to allow them to pass their audit.  Fortunately, I know what FIPS is, I understand encryption algorithms, and knew exactly how to modify Post Master to meet their requirements.  And I just happened to know the history of and the difference between Rijndael and AES, so I was able to figure out why the rarely used and negligently undocumented Microsoft Windows FIPS compliance feature was complaining about Post Master's encryption.  That's an entire world of technology that represents about 1% of Post Master functionality, but unless you fully understand it, you can't properly support it.

Then there is the very broad category of compatibility.  First, there is Windows.  I have found that User Account Control (UAC) produces such strange and inconsistent behaviors that I never know when it is causing the issue.  When something odd happens, I ask the customer to try using Run As Administrator to see if that helps.

And don't forget good old binary sort order in SQL Server.  During one Post Master release, we accidentally shipped a SQL script that referenced a lowercase table name, and a customer with a binary sort order received an error.

We have come across other Dynamics GP ISV solutions that result in a mix of .NET 2 and .NET 4.x, requiring us to handle .NET compatibility issues in configuration files and at run time.  And then there are direct product incompatibilities, where an ISV solution will cause errors with Post Master, or prevent a GP batch from posting properly.
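For what it's worth, the usual configuration-file fix for a .NET 2 / .NET 4.x mix is a startup entry like the one below in the host process's .config file (shown generically; whether it's appropriate depends on the ISV products involved):

```xml
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <!-- Allow .NET 2.0 era assemblies to load into the 4.x runtime -->
    <supportedRuntime version="v4.0" />
  </startup>
</configuration>
```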

Then there are the overzealous consultants.  I maintain the Post Master user manual, which is now 41 pages.  It has grown over time as I have documented additional troubleshooting tools and techniques, compatibility issues--you name it.  I have even created an installation video and a configuration video, both of which provide step-by-step instructions.  But occasionally there is an eager user, often a consultant, who glosses over the manual and attempts to use their own brand of kung fu to resolve errors.  They will find a configuration file and edit it, despite the fact that nowhere in the user manual does it mention editing configuration files.  They will start modifying SQL Server security.  They will do things that I would never have anticipated, turning a 30-second resolution into a 15-minute support call.

So when it comes to supporting a software product, you need to know a whole lot more than just your product, and in many cases, you need to know those surrounding technologies intricately--as in, you need to be an expert.

This simple list doesn't begin to convey the depth of knowledge required, but I think these categories serve as a starting point for what I need to know to support Post Master:

Windows (UAC, services)
Networking and TCP/IP
SMTP and email relaying
SQL Server (security, sort orders)
.NET runtimes
Encryption and FIPS compliance
Dynamics GP
Other ISV solutions

Other products will likely require other ancillary areas of expertise, such as data sources, peripheral hardware like bar code scanners, external web services--you name it.

I suppose the same thing applies to Dynamics GP consulting.  Customers expect you to solve any problem even remotely related to Dynamics GP, and those issues can have virtually nothing to do with Dynamics GP, other than the fact that they are causing a GP error.  But one difference with supporting your own product is that you don't typically bill by the hour for that support--so you really need to have the knowledge to resolve issues quickly, even those that aren't directly related to your product.

Break out your Wireshark network packet capture tool and get to work!


Wednesday, November 11, 2015

A baffling bug in a modified SOP Transaction Entry window

By Steve Endow

A customer called me today asking for help with a GP customization.

They have a modified SOP Transaction Entry window and a modified SOP User Defined Fields Entry window.  The SOP Trx Entry window has two custom buttons and a little bit of VBA script, while the SOP UDF window has some fields hidden to simplify it, along with a custom button that opens a custom VBA window.

Pretty basic modifications--nothing special.

The client demonstrated that when they open the SOP UDF window, enter some values in the fields, click OK, and then save the parent SOP Invoice, the SOP UDF values are not saved.

If they open the same invoice and then open the SOP UDF window, the fields are all blank.

Hmm, that's interesting.

But wait! There's more!

They tested different scenarios, and figured out that if they do not use the modified SOP Trx Entry window, the modified SOP UDF window does save properly.

Yes. You read that right.  If they revert to the standard GP SOP Trx Entry window, their modified SOP UDF window saves values and retrieves them properly.

So, it seems that for some reason, the modified SOP Trx Entry window breaks the SOP UDF window.

I ran a SQL Profiler trace on the SOP UDF window while using the modified SOP Trx Entry window, and here is what I saw.

VALUES ( 0, '', '1900.01.01' ....

Notice that SOPTYPE = 0 and SOPNUMBE = ''

And that statement produces an invalid record in SOP10106.

When I disable the modified SOP Trx Entry window, but keep the modified SOP UDF window, things return to normal.


VALUES ( 3, 'STDINV2263' ....

Notice that the values for SOPTYPE and SOPNUMBE are populated.

Very, very strange.  Somehow, the modified SOP Trx Entry window is preventing the SOP UDF window from receiving the SOP Type and SOP Number values.
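If you run into this, it's worth checking the table for orphaned rows.  The check is just a query against SOP10106 for rows with a blank document key; the sketch below demonstrates it against a throwaway SQLite stand-in with only the two relevant columns (on a real system, you would run the equivalent SELECT against the company database):

```python
import sqlite3

# Minimal stand-in for SOP10106 with just the two key columns from the trace
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SOP10106 (SOPTYPE INTEGER, SOPNUMBE TEXT)")
conn.executemany(
    "INSERT INTO SOP10106 VALUES (?, ?)",
    [(3, "STDINV2263"), (0, ""), (3, "STDINV2264")],
)

# Invalid rows are the ones saved with no document type or number
invalid_count = conn.execute(
    "SELECT COUNT(*) FROM SOP10106 WHERE SOPTYPE = 0 OR SOPNUMBE = ''"
).fetchone()[0]
```

Here the second row is the orphan; any rows matching that WHERE clause can never be tied back to a SOP document.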

This set of modified SOP windows worked fine in their old GP 2010 environment, but exhibits this problem in their current GP 2015 environment.  The modified forms did have some errors and problems when upgraded to GP 2015, so that may be a clue; they may also have made a change that caused this issue, but I don't know.

But regardless, I found it a rather strange problem to have.

My recommendation is to recreate the modified SOP Trx Entry window from scratch and see if that solves the problem.  If not, also recreate the modified SOP UDF window from scratch.  The modifications shouldn't take long to reproduce, and I suspect that will be much easier and cleaner than trying to figure out the specific cause of the problem.

I'm always amazed at the strange things I continue to see.


Monday, November 9, 2015

GP Consulting Developers: How do you manage your dev VMs?

By Steve Endow

If you're lucky, there's GP 2010, 2013, and 2015.  Visual Studio 2010, 2013, and 2015.  If you aren't lucky, you've still got GP 10, or earlier.

Plus there's Analytical Accounting, Project Accounting, Manufacturing, and other lesser-used modules, which are hassles unto themselves.

Then there are those clients with rather involved environments or third party modules or web services that are dependencies for your development.

How do you manage the many environments required to be a Dynamics GP developer for many customers?

When I first started using Hyper-V, I believe I had one general purpose GP 10 development VM.  I would then build a separate dedicated VM for some customers.  Building customer-specific VMs gets old pretty quickly.  They take up quite a bit of disk space, and if they aren't frequently used, or sit dormant for a year or more at a time, they become an administrative hassle.

I've now streamlined my VMs a bit and have what seems like a more manageable VM setup, but it isn't perfect.  I now have a total of 19 VMs set up on my server, but fortunately I typically use only 9 or 10 of them.

My primary VMs are:

GP2010 - Primarily VS 2010
GP2013 - Primarily VS 2013
GP2015 - Currently VS 2013, but hope to start using VS 2015 soon

GPPA - For Project Accounting, with GP 2010, 2013, and 2015 all installed in separate SQL instances

GPAA - For Analytical Accounting, with GP 2010, 2013, and 2015 all installed in separate SQL instances

I still have a few client-specific VMs.  One client used Project Accounting, but I just transferred that code to my GPPA server.  It's very difficult to test customizations and integrations without the client's database, but so far it has been manageable.

The other client-specific VM has a pretty elaborate HR + Payroll setup in GP.  I could potentially migrate that one as well, but will probably wait until they upgrade to GP 2015.

While my "consolidated" VMs, one for each version of GP, generally work, there are some drawbacks.

1. It's often difficult to remember which server any given client project is on. Was it GP 2010? Or did they upgrade? Did they upgrade to 2013 or 2015?  I often have to dig up my documentation on each project to see what version the customer is on.

2. Third party products are a hassle on a shared VM.  For a GP 2013 customer, I have a third party product that has a dictionary, AddIns, and a web service.  Sometimes I have to disable the third party products to work on other client projects, as they change or disable standard GP functionality.

3. Multiple projects on one GP instance can be a hassle.  If I have 5 customizations using Modifier, but need to DAG just one, I have to make a backup of everything, strip out four of those Modifier customizations, and then DAG the single modified form.  Then I eventually have to go back and reimport the other modifications.  Or if I need to test something on an unmodified form, or make a new customization of the same form, I have to swap between them.  It can get messy and is difficult to keep such an environment clean and organized.

4. I've been trying to dig into SBA, and given its setup requirements, I've put it on a dedicated VM so that Active Directory, IIS, and the pile of other prerequisites it requires can be present without mangling my other VMs or creating dependencies on them.

5. Given the size of the consolidated VMs, the 16GB in my current server limits the number of VMs I can have running at once.  So I regularly have to save one VM before launching another.  I'm planning on upgrading my server to 32GB, which should resolve this limitation.

There are other small annoyances, like managing different types of test data, but these are five that happened to come to mind at the moment.

I have never really looked into differencing disks, or whatever they are called now in Hyper-V.  If I were a Hyper-V guru, I suppose I could have a base VM and then work from multiple diffs, but I don't know whether managing multiple diffs would be much easier, or whether performance would become an issue.

Anyway, I'm curious if anyone has any tips or techniques that have made managing multiple Dynamics GP VMs for multiple customer projects easier.

UPDATE:  I was just reminded of another situation that my VMs don't currently handle:  Deployment testing.  A client tried to install a customization I developed, but the customization had a dependency on some SQL Server SDK DLLs.  The GP workstation doesn't have SQL Server or the SDK DLLs installed, so naturally my customization doesn't work.  So I now need to configure YET ANOTHER machine that has the full Dynamics GP 2015 client, but does NOT have SQL Server on it.  Seems like the possibilities never end.


High Volume GP Integration Tip: Increment your batch ID every X thousand records

By Steve Endow

I've developed several high volume Dynamics GP integrations for customers that needed to import tens of thousands of transactions at a time, and I also work with customers who use Post Master Enterprise to automatically post their batches--many of whom have very high volume integrations.

While the process of importing 20,000 or even 100,000 transactions into Dynamics GP is not terribly different from importing 100, there is one thing that can be challenging with such high volume GP integrations:  Batch Posting.

eConnect imports can shove a lot of data into Dynamics GP very quickly, but once the import is done, you are stuck with a massive pile of transactions to post in Dynamics GP.  And those batches can take a long time to post.

One thing I always recommend to customers who have to import thousands of transactions at a time is to have their import break up the data into multiple Dynamics GP batches.  So instead of having a single batch with 40,000 transactions, have eight batches with 5,000 transactions, or even 20 batches with 2,000 transactions.  There is no magic number for batch size, but I would personally keep them under 5,000 per batch.

I recommend this because if there is a batch posting failure or an error that causes the posting to stop, you won't be holding up 39,000 transactions.  If the batch size is 2,000 transactions, a posting failure will only delay posting for those 2,000 transactions, while 38,000 transactions in the other batches can continue to post.

So how do you split your import of 40,000 transactions from a single data file into multiple batches?

I'm glad you asked!  It's quite easy:  You just change your batch ID.

In your import, you will maintain a transaction counter that gets incremented every time a transaction is imported.  After that counter hits X thousand records, you change your batch ID and reset your counter.

I prefer to make the batch size a configuration option in my integrations.  So the customer can change it to any value they want at any time.  Then, when I assign the batch ID to the transaction, I check the counter.  If the counter hits 5,000 (or whatever value is configured), I increment the batch ID suffix, like INV12345-1, INV12345-2, INV12345-3.

If you have a .NET eConnect integration, this is just a few simple lines of code.  If you are using SmartConnect or Scribe, you would probably need a few global variables to store your batch size and transaction count, and a few lines of script in your Batch ID field.
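The counter logic itself is tiny.  Here's a sketch of it in Python, just to illustrate the technique (the batch ID format and the 5,000 default are arbitrary examples; a real integration would do this in its .NET or SmartConnect code):

```python
def make_batch_assigner(base_batch_id, batch_size=5000):
    """Return a function that yields a batch ID for each transaction,
    rolling to a new suffix (-1, -2, ...) every batch_size transactions."""
    state = {"count": 0, "suffix": 1}

    def next_batch_id():
        # Roll to the next batch once the current one is full
        if state["count"] >= batch_size:
            state["suffix"] += 1
            state["count"] = 0
        state["count"] += 1
        return f"{base_batch_id}-{state['suffix']}"

    return next_batch_id

# 12,000 transactions with a batch size of 5,000 -> three batches
assign = make_batch_assigner("INV12345", batch_size=5000)
batch_ids = [assign() for _ in range(12000)]
```

With a batch size of 5,000, the 12,000 transactions land in INV12345-1, INV12345-2, and INV12345-3, so a posting failure in one batch never holds up the others.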

To me this technique of managing batch size seems very obvious, but I have had dozens of phone calls with customers who are importing tens of thousands of transactions into a single batch, and then stressing over the batch posting process at 1am, because a failed batch posting would take them hours to re-post.

Skip the stress--split your batches.

Happy importing, and merry posting.


Sunday, November 8, 2015

SmartConnect Calculated Field VB Script to generate unique cash receipt check numbers

By Steve Endow

A Dynamics GP client has a situation where they may receive multiple payments from a customer with the same "check" number on different days.  So on Monday November 2, they might receive check "12669", and on Wednesday November 4, they might receive another payment marked with the same check number "12669".

I don't know the background of the source data, but it's coming in electronically (such as from a lock box, bank web site, or other cash management system), and it is being imported using SmartConnect.

If you attempt to import a customer check number that already exists in GP, SmartConnect will spew this eConnect error at you:

"Duplicate check number for this customer already exists in RM00401"

So, in the Dynamics GP import tool that doesn't require any programming, you'll need to do some programming if you want to work past this error.

First, open the Mapping window and click on Additional Columns.  Then add a new Calculated column.

I called mine the overly long but self-explanatory "UNIQUECUSTOMERCHECKNUMBER".

In the calculated column, paste in this script.  I would have written it in C#, but the client's other SmartConnect integrations have some VB, so I went with VB.

This is my inaugural SmartConnect script, so excuse me if I'm doing anything wrong or sloppy.  And I'm sure there are at least 3 different ways to do this in SmartConnect, so if you have a better way, post a comment and let me know.

As a side note, one reason why I hate "scripting" in any tool is that it is typically a euphemism for "Here's an empty dumb window with no Intellisense or feedback about the code you are writing".  It's a glorified version of Notepad.  It makes coding horribly tedious compared to Visual Studio, as modern IDEs no longer require you to memorize every excruciating detail about the language syntax.  So if you also find such scripting windows painful, just write the script in Visual Studio first, and then paste it into SmartConnect.

        Dim connString As String
        Dim commandString As String
        Dim recordCount As Integer
        Dim dateString As String

        connString = _SQLDEF_GP2013

        Dim conn As System.Data.SqlClient.SqlConnection = New System.Data.SqlClient.SqlConnection(connString)

        'Count existing cash receipts (RMDTYPAL = 9) for this customer and check number
        '(Note: concatenating values into SQL is vulnerable to injection; a parameterized query would be safer)
        commandString = "SELECT COUNT(*) AS RecordCount FROM RM00401 WHERE CUSTNMBR = '" & _CUSTOMERID & "' AND RMDTYPAL = 9 AND CHEKNMBR = '" & _CHECK_CARDNO & "'"

        Dim cmd As System.Data.SqlClient.SqlCommand = New System.Data.SqlClient.SqlCommand(commandString, conn)
        cmd.CommandType = System.Data.CommandType.Text

        'The connection must be opened before ExecuteScalar is called
        conn.Open()
        recordCount = Convert.ToInt32(cmd.ExecuteScalar())
        conn.Close()

        'Build an MMDDYY date string, e.g. 110815
        dateString = System.DateTime.Now.ToString("MMddyy")

        'If the check number already exists for this customer, append the date to make it unique
        If (recordCount > 0) Then
            Return _CHECK_CARDNO & dateString
        Else
            Return _CHECK_CARDNO
        End If

This script queries the RM00401 table to count any cash receipt records for the customer and given check number.  If it finds any, it appends the date (MMDDYY) to the end of the check number value and returns it back to the Check Number field in the Mapping.

Please note two variables in the script above.  The first one, _SQLDEF_GP2013, is the SmartConnect default connection string for my server.  You will want to change that variable to point to your server.

In the Edit Calculation window, expand Default Connections -> Generic -> MSSQL Default to find your connection string variable.

The second variable, "_CHECK_CARDNO", is the name of the check number field in my particular source data file.  Yours will likely be different, so you'll need to change that to match your check number field.

I suspect this situation of having duplicate customer check numbers is unusual, so you may not need the script for this specific scenario, but the script design would be useful for other situations where duplicate document numbers need to be imported.  You would just need to change the SQL query to work for your particular document type.
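Stripped of the SmartConnect and SQL plumbing, the logic boils down to a single decision.  Here's a Python sketch for illustration (the function name and parameters are hypothetical; `exists` stands in for the result of the RM00401 COUNT(*) lookup):

```python
from datetime import date

def unique_check_number(check_number, exists, today=None):
    """Append MMDDYY to the check number only when it already
    exists in GP for this customer."""
    if not exists:
        return check_number
    today = today or date.today()
    return check_number + today.strftime("%m%d%y")
```

So a second "12669" received on November 4, 2015 would import as "12669110415", while a first-time check number passes through untouched.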

Happy coding.
