Thursday, February 12, 2015

eConnect will import data with leading spaces: Why should you care?

By Steve Endow 

One fine day, you joyfully work on a simple Dynamics GP customer query.  You just want to look up the record for a customer.

SELECT * FROM RM00101 WHERE CUSTNMBR = 'WEB001'

Couldn't get much simpler.

But when you run the query, you get zero results.


Hmm, that's odd.  You open GP to verify the customer exists.


Yup, there it is in the correct company database.  You double check your query and make sure you are querying the correct database in SQL Server Management Studio--and everything looks okay.

So, what is going on?  How can the customer exist in GP, but not show up in a simple SQL query?

Look a little closer at the customer record in GP.


Well, there's the problem right there.  Do you see it?  

When entering data directly into Dynamics GP, if you try to type a space as the first character in any field, nothing happens.  You just can't enter a field value that starts with a space in the GP user interface.

But I have discovered that eConnect doesn't have such a restriction, and it will dutifully import data values that begin with a space.  It doesn't seem like a big deal, until you try to query that data.

In SQL, these two queries are different:

SELECT * FROM RM00101 WHERE CUSTNMBR = 'WEB001'

SELECT * FROM RM00101 WHERE CUSTNMBR = ' WEB001'

The leading space on the customer ID makes all the difference.  With the first query, I get no results.  With the second query I find the record that has a leading space on the customer ID.

Honestly, I don't know that I actually realized this distinction previously--it just isn't something I have had to try, use, or deal with.  Trimming white space is such second nature for a developer that I can't remember ever thinking about it.  It seems obvious in retrospect, but I think it's just one of those incredibly elementary assumptions that becomes invisible to you after so many years of development. 

When you open the Customer lookup window, you can see that a leading space also affects sorting.


I made up this specific example to demonstrate the issue with eConnect and GP.  In all of my integrations and code, I habitually use a Trim() command on all strings to trim leading and trailing white space, so I was actually quite surprised that eConnect doesn't trim leading spaces.
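That trimming habit amounts to something like the following.  This is just a minimal sketch of the idea using the taUpdateCreateCustomerRcd customer node from the eConnect serialization assembly--the method and parameter names here are purely illustrative:

using Microsoft.Dynamics.GP.eConnect.Serialization;

public static class CustomerMapper
{
    // Trim incoming values before assigning them to the eConnect customer node,
    // since eConnect will happily import leading and trailing spaces.
    // The method and parameter names are illustrative, not from a real integration.
    public static taUpdateCreateCustomerRcd BuildCustomer(string sourceCustomerID, string sourceName)
    {
        taUpdateCreateCustomerRcd customer = new taUpdateCreateCustomerRcd();
        customer.CUSTNMBR = (sourceCustomerID ?? string.Empty).Trim();
        customer.CUSTNAME = (sourceName ?? string.Empty).Trim();
        return customer;
    }
}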

But this topic came up because a similar leading space issue showed up on a customer integration this week.

I was surprised because having done software development for about 20 years now, I can't recall encountering this issue before.  While I may have encountered data that had a leading space, my code always trimmed leading and trailing spaces, so a leading space never resulted in a problem.

But in the case of my customer, they had populated a custom table with a few rows that had a leading space.  That leading space kept a SQL WHERE clause from matching, so the related records were never retrieved or imported.

My integration retrieved invoice " ABC123" (with leading space) from the invoice header table in SQL.  It then trimmed the invoice number and queried the invoice line table for all lines related to invoice "ABC123".  As you can guess, the line query retrieved 0 records.
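To make the failure concrete, here is a rough sketch of what the integration was effectively doing.  The table and column names are hypothetical:

using System.Data.SqlClient;

public static class InvoiceLineReader
{
    // The header row returned " ABC123" (leading space); the code trims it, and the
    // exact-match line query then finds nothing, because the line rows were also
    // keyed on " ABC123".  Table and column names are hypothetical.
    public static int CountInvoiceLines(string connectionString, string headerInvoiceNumber)
    {
        string invoiceNumber = headerInvoiceNumber.Trim();   // " ABC123" becomes "ABC123"

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM InvoiceLines WHERE InvoiceNumber = @InvoiceNumber", conn))
        {
            cmd.Parameters.AddWithValue("@InvoiceNumber", invoiceNumber);
            conn.Open();
            return (int)cmd.ExecuteScalar();   // returns 0 for this invoice
        }
    }
}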

The client and I spent several minutes reviewing the integration and the data trying to figure out why one invoice wouldn't import.  I eventually noticed that the data looked slightly shifted in the SQL query results grid.

It's subtle, but if you have other records above and below, it is much easier to spot.


Once we discovered the leading space in the invoice data, the client removed the spaces and the integration worked fine.

Paradoxically, the issue arose because my integration code was trimming the invoice number.  Instead of using the actual " ABC123" value from the invoice header table, the trimmed "ABC123" version caused the problem.  But that turned out to be fortunate, since I now know eConnect would have imported the value with a leading space, which would have certainly caused headaches in GP later on.

So, the lessons are:

1. Leading spaces in data can be problematic in GP (or just about any database or application)
2. The GP application doesn't allow leading spaces, but eConnect will import them
3. Always trim leading and trailing white space for all of your imported data

Keep learning!


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Who deleted my Dynamics GP database table and stored procedure?

By Steve Endow

An unusual situation has come up at a client.  While reviewing some integration logs, I saw that a custom database table had been deleted from the production Dynamics GP company database.  The integration log recorded an error from a stored procedure--the proc was trying to insert a record into the table, but the table no longer existed.

An unexpected error occurred in SaveUpdateCCHistory: Invalid object name 'cstb_Update_CC_History'.

Since the procedure was still present but the table wasn't, and the proc had been running against that table for weeks, we were pretty confident the table had been deleted.

Very strange.  We recreated the table, and moved on.

Then a few days later, while reviewing the integration logs in the Test company database, we saw an error indicating that a call to a custom stored procedure was failing.

An unexpected error occurred in InsertCustomerEmailOptions: Could not find stored procedure 'cssp_New_Customer_Email'

Sure enough, that procedure was missing from the Test database.

These objects had been in both databases for weeks, if not several months; we hadn't touched them, and we certainly didn't have any processes or scripts that would delete them.

The client said they couldn't think of any way the objects would be deleted.

These mysteries are always difficult to research "after the fact".  The ideal solution is to have third party SQL Server auditing software that records such activity and lets you review it later when a problem occurs.  But since such SQL administration is relatively rare with GP clients, we usually have limited tools to research such issues.

After some Googling, I found a few articles on how you can query a database transaction log to determine when a database object was dropped and who dropped it.

But there are two big caveats:

1.  The fn_dblog and fn_dump_dblog functions are not documented or supported, have been shown to have bugs, and can result in some unexpected consequences.  So you should use them very cautiously.

2.  The fn_dblog and fn_dump_dblog functions read database transaction log activity.  So if the database has had one or more full backups since the drop, you are likely out of luck and will not find any information about the dropped objects.

Technically it is possible to read from a database backup file, but such files typically do not have much log data to work with, so the odds of finding the drop data in a backup file are slim.

Also, technically it is possible to use the functions to read from transaction log backups, but I don't think I've ever seen a GP client that intentionally or systematically backs up their SQL transaction logs, so that is typically a long shot as well.  And the records in the active transaction log don't stick around for long--they are cleared at checkpoints in simple recovery mode, or whenever the log is backed up in full recovery mode--so the evidence usually disappears quickly.

But, aside from those rather significant limitations, I was still pretty impressed that it is possible to determine when an object was dropped, and who dropped it.  I'm guessing it will not be very helpful in a real environment where you may not know an object was dropped for a few days, but if you discover the problem quickly, you can try it and see if it works.


Below is the script that I used to test with the TWO database.  I create a test table and insert some rows, back up the database and restore it to a different database name, and then drop the table.

After the table is dropped, I can query the transaction log to see that an object was dropped and who dropped it, but I can't tell which object was dropped (since it no longer exists).



To get the name of the dropped object, you have to restore a backup of the database, then use the dropped object's ID to query the restored copy (where the object still exists) for the object name.




USE [TWO]
GO
--Create table
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[TestTable]') AND type IN (N'U'))
BEGIN
    CREATE TABLE [dbo].[TestTable](
        [FirstName] [varchar](30) NOT NULL,
        [LastName] [varchar](30) NOT NULL,
        [RowID] [int] IDENTITY(1,1) NOT NULL
    ) ON [PRIMARY]
END
GO
--Populate table
INSERT INTO TestTable (FirstName, LastName) VALUES  ('Chuck', 'Norris')
INSERT INTO TestTable (FirstName, LastName) VALUES  ('George', 'Washington')
INSERT INTO TestTable (FirstName, LastName) VALUES  ('Britney', 'Spears')
--Verify table
SELECT * FROM TestTable

--****************************
--BACKUP DATABASE NOW AND RESTORE TO NEW DB
--****************************
--Drop the table in TWO
DROP TABLE TestTable

--Get info on drop from transaction log
SELECT
    [Begin Time],
    [Transaction Name],
    SUSER_SNAME([Transaction SID]) AS UserName,
    [Transaction Id],
    [Transaction SID],
    [SPID],
    (SELECT TOP (1) [Lock Information] FROM fn_dblog (NULL, NULL)
     WHERE [Transaction Id] = fndb.[Transaction Id]
     AND [Lock Information] LIKE '%SCH_M OBJECT%') AS ObjectID
FROM fn_dblog (NULL, NULL) AS fndb
WHERE [Transaction Name] = 'DROPOBJ'
GO

--In the prior query, review the ObjectID field values.  The object ID is the numeric value at the end, in between colons
--HoBt 0:ACQUIRE_LOCK_SCH_M OBJECT: 6:1554182124:0
--In this example, the object ID is:  1554182124

--Insert numeric object ID and run on restored copy of DB
USE TWO01
SELECT OBJECT_NAME(1554182124) AS ObjectName



Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Monday, February 9, 2015

How do you back up your Hyper-V virtual machines?

By Steve Endow

So, how do you back up your Hyper-V virtual machines?  You DO back them up, don't you?

Today I spent a rare day working on my "data center infrastructure".

Specifically, I was working on my Hyper-V backups.  For the last several years, I've used batch files and VBS scripts to save my VMs and then copy the VHDs to a file server.  While not fancy, this actually worked okay for me for several years.

But when I retired my Windows file server and replaced it with a Synology NAS, I stopped compressing my VHD files and started implementing offsite backups as well.  So my VHD files get copied to the Synology, and the Synology then has a backup job that copies the VHDs to external drives that are rotated each week.  Since I was no longer compressing the VHD files, my 2TB external drives were filling up, so I finally had to come up with a better system.

While copying VHD files seems adequate for basic situations, there are some challenges.  The obvious downside is the lack of compression.  Even if you have plenty of storage, having terabytes of backups sitting around isn't ideal, and in my case the limitation was not my main storage, but the external drives that I rotate each week.  In addition to capacity, managing multiple backup versions on the external drives was a hassle.

I looked into using Windows Backup on each VM, but if you attempt to back up to a network share (a NAS, in my case), Windows Backup only supports a single full backup.  It will not do differential backups, and it will not retain multiple backups on a share.  So that option was quickly ruled out.

I then searched for Hyper-V backup software.  After perusing the search results and a few forum posts, I probably found a dozen different names mentioned.  There are a lot of options and I couldn't possibly evaluate them all, or even a half dozen.  But two names seemed to have a focus on Hyper-V, and seemed to have positive feedback.

The two I chose to evaluate are:

  • Veeam Backup & Replication (apparently pronounced like "veem")
  • Altaro Hyper-V Backup


I've done some initial testing of both products, and both vendors offer a free limited version, as well as a full trial version.  Both appear to work well, but they seem to be pretty different products in terms of their features and presumed target market.

Veeam seems to be focused on "higher end" features and enterprise requirements with a full set of configuration options for nearly everything, while Altaro has a very good set of features and a simpler design that can get you running backups in a few minutes with less configuration.

Veeam seems to install a ton of "stuff"--its installer is over 800 MB, with a 139 MB service pack.  The Veeam solution takes quite a while to install, requiring .NET 4 and SQL Server Express, among other things.  Altaro is a relatively svelte 144 MB, with a very fast, simple installation.  It seems obvious that the Veeam installation is larger and more involved because it has some pretty advanced and powerful features--a few of which I don't understand.  The question I have to answer is whether I want those features and whether the potential overhead is worthwhile.

I don't believe that my comparison of the two products will be focused on whether one product is "better" than the other.  It will be a question of whether they can completely fulfill my current requirements, fulfill any possible future requirements I can think of, how easy they are to use and manage, and how much they cost.

From the list prices I've seen, Veeam is a little bit more expensive than Altaro, but offers some additional features and more flexibility, such as the configuration of multiple backups and destinations.  The difference is only a few hundred dollars, so price alone isn't a primary driver for me between the two.

My initial impressions are that, objectively, Altaro is probably the more appropriate solution for me.  While I may like a few of the more advanced features and configurability of Veeam, those items are probably not essential for my business.  Nice to have?  Sure.  Critical?  No.  Overhead that I probably don't need to be thinking about given my simple requirements?  Probably.

But would I like just a little more configurability in Altaro?  Yes.  And while it is probably not important technically, I find Veeam's backup files and folder structure to be fantastically simple, while Altaro produces a messy pile of directories and files that look baffling.

My guess is that this difference is due to how the two manage their backup logs and histories.  Veeam appears to use SQL Server Express, so it can maintain metadata in SQL and have a very clean backup file and folder structure, whereas my guess is that Altaro stores the backup metadata on the backup drive, so that's why you see dat files and GUIDs on the backup drive.  Once I do some restores I suspect I'll have a better feel for the merits of each approach.

I just started evaluating the two products today, so I'm going to compare them over the next two weeks to see which one might be a better fit.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Thursday, February 5, 2015

The value of logging and diagnostics in Dynamics GP integrations

By Steve Endow
 
I developed a custom web service for a customer that allows one of their internal web sites to create customers in GP and store customer credit card data in a third party credit card module for GP.

During the testing process, we had to troubleshoot various issues with code, APIs, data, etc.  As part of that process I used my standard logging library, which writes messages and errors to a simple daily text log file on the server.  I used the log to troubleshoot issues, then I resolved them, and eventually the project went live and I pretty much forgot about the detailed logging that I had set up.
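The "logging library" is nothing fancy--conceptually it just appends a timestamped line to a text file named for the current date.  A minimal sketch of the idea (the folder path and class name are just illustrative):

using System;
using System.IO;

public static class SimpleLog
{
    // Folder for the daily log files (illustrative path)
    private static readonly string LogFolder = @"C:\Logs\GPWebService";

    // Append a timestamped message to today's log file, e.g. 2015-02-05.log
    public static void Write(string message)
    {
        Directory.CreateDirectory(LogFolder);
        string fileName = Path.Combine(LogFolder, DateTime.Today.ToString("yyyy-MM-dd") + ".log");
        File.AppendAllText(fileName, DateTime.Now + ": " + message + Environment.NewLine);
    }
}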

Today, the customer did some additional testing on their Test web server and reported a problem.  They sent a screen shot from their web site, but it didn't provide any details or error text.

I then reviewed the GP web service log file and was surprised to see all of the detailed logging, which is automatically enabled in the test environment.

2/5/2015 10:56:18 AM: UpdateCC Instantiating UpdateCCResponse
2/5/2015 10:56:18 AM: UpdateCC Calling ValidateUpdateCCRequestHMAC
2/5/2015 10:56:18 AM: UpdateCC Calling ProcessUpdateCCRequest
2/5/2015 10:56:18 AM: UpdateCC Calling UpdateCustomer
2/5/2015 10:56:23 AM: UpdateCC Calling GetCustomerOpenBalance
2/5/2015 10:56:23 AM: UpdateCC UpdateCC Setting regRequest values
2/5/2015 10:56:23 AM: UpdateCC Getting company ID
2/5/2015 10:56:23 AM: UpdateCC Getting customer profile ID
2/5/2015 10:56:24 AM: UpdateCC Getting CC expiration
2/5/2015 10:56:24 AM: UpdateCC Getting customer payment profile ID
2/5/2015 10:56:24 AM: UpdateCC Getting customer payment profile info
2/5/2015 10:56:24 AM: UpdateCC Updating Authnet payment profile
2/5/2015 10:56:29 AM: UpdateCC Calling ImportCC
2/5/2015 10:56:29 AM: UpdateCC Saving update CC history
2/5/2015 10:56:29 AM: UpdateCC Checking amount paid and approval code
2/5/2015 10:56:29 AM: UpdateCC completed ProcessUpdateCCRequest
2/5/2015 10:56:29 AM: UpdateCC Process UpdateCCRequest returned True
2/5/2015 10:56:29 AM: WARNING: UpdateCC elapsed time: 10.57


Reviewing the entries, I saw no errors, and it looks like everything worked properly.  But the last line contained a warning indicating that the process took over 10 seconds, and that likely explained the problem the customer experienced.  I had added a timer to each method in the web service that writes a warning to the log any time a request takes longer than 10 seconds to process.
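The timing itself is trivial--a Stopwatch around the body of each method, with a warning logged if the elapsed time crosses the threshold.  A rough sketch of the pattern (SimpleLog is the hypothetical logger sketched above, and the real work is omitted):

using System.Diagnostics;

public class UpdateCCService
{
    // Log a warning if a request takes longer than this many seconds
    private const double WarningThresholdSeconds = 10.0;

    public bool UpdateCC()
    {
        Stopwatch timer = Stopwatch.StartNew();

        // ... the actual work: validate the request, update the customer,
        // update the payment profile, import the credit card info, etc. ...
        bool success = true;

        timer.Stop();
        if (timer.Elapsed.TotalSeconds > WarningThresholdSeconds)
        {
            SimpleLog.Write("WARNING: UpdateCC elapsed time: " + timer.Elapsed.TotalSeconds.ToString("0.00"));
        }

        return success;
    }
}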

If you have ever worked with eConnect, you likely know that the first time you call an eConnect method, it can take 5-10 seconds for it to respond.  Once the eConnect service is 'alive', it responds very quickly, but eventually it shuts down again, and there will be a delay when it processes the next request.

Since the log file told me that this was the first request of the day, I suspect that eConnect took several seconds to respond when I updated the customer information, and after 10 seconds, the customer's web site timed out, thinking that the GP request had failed.  Looking at the time stamp in the log, you can see that the eConnect request started at 10:56:18, and was completed 5 seconds later by 10:56:23.  We also see that the call to Authorize.net took another 5 seconds, pushing the process over the 10 second threshold.

The eConnect delay occurred in the test environment because there isn't much regular activity.  In the production environment, we have a process that checks the status of eConnect, SQL, Authorize.net, and the third party credit card module every minute.  Because that process talks with eConnect every minute, the eConnect service always stays 'alive' and we avoid the startup delay.
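Conceptually, that keep-alive process is just a timer that exercises each dependency every 60 seconds.  A very rough sketch (the Check... methods are hypothetical placeholders for whatever lightweight call verifies each service):

using System.Timers;

public class ServiceMonitor
{
    private readonly Timer _timer = new Timer(60000);   // fire every 60 seconds

    public void Start()
    {
        _timer.Elapsed += (sender, e) =>
        {
            // Each Check* method is a hypothetical placeholder for a trivial call
            // that verifies the dependency is responding.  The side benefit is that
            // eConnect never goes idle, so the first "real" request of the day
            // doesn't pay the startup delay.
            CheckSql();
            CheckEConnect();
            CheckAuthorizeNet();
        };
        _timer.Start();
    }

    private void CheckSql() { /* e.g. run SELECT 1 against the company database */ }
    private void CheckEConnect() { /* e.g. submit a trivial eConnect request */ }
    private void CheckAuthorizeNet() { /* e.g. make a lightweight API call */ }
}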

Just a quick example of the value of logging and some basic diagnostics in a Dynamics GP integration.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Friday, January 23, 2015

Automatically close Dynamics GP report destination windows in VS Tools

By Steve Endow
 
I am working on a project that involves automating a process in Dynamics GP.  Since there is no API for the particular steps I need to automate, I'm having to use the low-tech method of opening a GP window, populating some field values, and clicking a few buttons.

This is fairly easy to do with Visual Studio Tools for Dynamics GP...in concept.

But, unsurprisingly, Dynamics GP seems to always have to have the last word, and such seemingly simple projects are rarely easy.

In my particular case, after I performed the last button click on the GP window, Report Destination windows would appear to print posting reports.


The process I am automating generates not one, not two, not even four, but FIVE report destination windows.  Yes, it is possible to turn off the posting reports so that the windows don't appear, but that change would affect many other processes throughout GP where the client might actually want to print some of the posting reports.  It is also possible to temporarily update the posting report settings in SQL to disable them and then quickly re-enable them, but that method runs the risk of disabling the reports at the same moment another user needs them.

Unfortunately, the Report Destination dialog boxes are not standard modal dialogs that would be detected by the BeforeModalDialog event handler in VS Tools.  So, some "off roading" is required.

When I ran into this problem I was lucky enough to find this excellent post by Paul Maxan on the Code Project web site.

http://www.codeproject.com/Articles/801319/Closing-Microsoft-Dynamics-GP-Report-Destination-w

Paul wrote a great article and did a very nice job crafting code that uses the Windows API to detect the Report Destination dialog box and then uses window handles and messages to click through the dialogs.

He posted some very nice code that I packaged up into a separate class that can be easily invoked just about anywhere in your VS Tools project.

During my testing on GP 2013, his combination of TAB + ESC didn't work for some reason.  After some fiddling, I found that ESC by itself did work, so that is what I used in my GP 2013 project.
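For reference, here is a stripped-down sketch of the general approach--this is not Paul's code (his article is far more robust), and the window title and key handling here are my assumptions:

using System;
using System.Runtime.InteropServices;
using System.Threading;

public static class ReportDestinationCloser
{
    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    private static extern IntPtr FindWindow(string lpClassName, string lpWindowName);

    [DllImport("user32.dll")]
    private static extern bool PostMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

    private const uint WM_KEYDOWN = 0x0100;
    private const uint WM_KEYUP = 0x0101;
    private const int VK_ESCAPE = 0x1B;

    // Poll for a window titled "Report Destination" (title assumed) and post it an
    // ESC keystroke--ESC is what worked in GP 2013, per the discussion above.
    public static void CloseReportDestinationWindows(int attempts)
    {
        for (int i = 0; i < attempts; i++)
        {
            IntPtr hWnd = FindWindow(null, "Report Destination");
            if (hWnd != IntPtr.Zero)
            {
                PostMessage(hWnd, WM_KEYDOWN, (IntPtr)VK_ESCAPE, IntPtr.Zero);
                PostMessage(hWnd, WM_KEYUP, (IntPtr)VK_ESCAPE, IntPtr.Zero);
            }
            Thread.Sleep(200);   // give the next dialog a moment to appear
        }
    }
}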

The Report Destination dialog boxes still flash on screen as they are dispatched by Paul's "Closer" code, but it seems to work well and definitely gets the job done.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Thursday, January 15, 2015

Bug in Dynamics GP VS Tools Event Handlers for DexUIForm Windows

  By Steve Endow

A few days ago I wrote about an odd behavior in VS Tools where an event handler wouldn't fire if a VS Tools window initiated an event on a GP window.

Specifically, on the GP 2013 Void Historical Payables Transactions window, when you click on the Mark All button, two different dialog boxes appear (the dialogs are not present in GP 2010).


I needed to suppress those two dialogs.  I wrote an AddIn with an event handler for the BeforeModalDialog event, which worked perfectly when I clicked the Mark All button in GP.

pmVoidPaymentsWindow.BeforeModalDialog += new EventHandler<BeforeModalDialogEventArgs>(PmVoidPayments_BeforeModalDialog);
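The handler itself just inspects the dialog text and answers the dialog automatically--something along these lines (the BeforeModalDialogEventArgs property names are written from memory, so treat them as approximate):

// Sketch of the AddIn event handler: inspect the dialog text and answer it
// automatically.  The property names (Message, Response) are from memory and
// may need adjusting against the VS Tools documentation.
void PmVoidPayments_BeforeModalDialog(object sender, BeforeModalDialogEventArgs e)
{
    if (e.Message.StartsWith("Your selection may include payments that have been reconciled")
        || e.Message.StartsWith("Your selection may include credit card payments with related invoices"))
    {
        // Button 2 is the "continue" answer on these dialogs, the same choice
        // as dcButton2 in the VBA script shown later
        e.Response = DialogResponse.Button2;
    }
}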

But if I had a VS Tools window "click" that button, the event handler did not fire.  At all.

I was stumped.  After hours of testing and trying various scenarios, I figured it was a bug, and I would have to develop a workaround.  I tried 2 or 3 different workarounds--all of which worked fine in GP, but none of which worked when a VS Tools window clicked Mark All.

For instance, I added VBA script to the window to see if VBA could take care of the dialogs.

Private Sub Window_BeforeModalDialog(ByVal DlgType As DialogType, PromptString As String, Control1String As String, Control2String As String, Control3String As String, Answer As DialogCtrl)

    If Left(PromptString, 61) = "Your selection may include payments that have been reconciled" Then
        Answer = dcButton2
    ElseIf Left(PromptString, 69) = "Your selection may include credit card payments with related invoices" Then
        Answer = dcButton2
    End If
    
End Sub

Once again, this worked fine if I manually clicked the button in GP, but when my VS Tools window clicked the button, the event handler would not work and the dialogs would appear.

I briefly considered trying Continuum, but I couldn't easily figure out what script to use to click through the dialogs.  I momentarily thought of using a macro, but didn't want to go there.

While desperately searching for a solution, I begged for help from Andrew Dean and Tim Wappat, both of whom are expert .NET and GP developers.  They both confirmed the issue and were puzzled.  They came up with a few workarounds to try, but nothing seemed to work.

I then finally resorted to using the Windows API to detect the dialog boxes and click through them, completely outside of the GP dev tools and VS Tools.  After a day of research and testing, I finally got that to work.  It was a kludge, but it did work, literally clicking the Continue button of the dialog boxes as they flashed up on screen.

And then today, Andrew created a VS Tools test app that was able to get the BeforeModalDialog event to fire properly.  Before I had a chance to review it, Tim Wappat had reviewed the code and found the reason:  Andrew had used a standard Windows Form rather than the VS Tools DexUIForm for his test window.

Sure enough, when I modified my test window to inherit from the Form class rather than the DexUIForm class, the BeforeModalDialog event handler worked properly when my window clicked the Mark All button.

public partial class FrmVoidTest : Form  //DexUIForm

So there it was--an apparent bug in the DexUIForm class caused the event handler for the Void Historical Payables window to not fire.  And strangely, it only seems to occur with some windows--for instance, the problem does not occur with the SOP Transaction Entry window.  So there is something about the Void Historical Payables Transactions window that triggers the bug.

Unbelievable.  It took me 3 full days to try several different workarounds, and it turns out that changing one word on one line of code was the solution.  It's problems like this that make me frustrated with Dynamics GP.  An incredible waste of time for such a seemingly small problem, yet that problem was critical and prevented me from completing my project.

The only good thing is that I learned quite a bit about the plumbing of VS Tools and Dynamics GP windows, and I am now also familiar with the Win API code to detect, analyze and automate actions on Windows applications.  But it was a painful journey.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter







Installing missing Professional Services Tools Library stored procedures

By Steve Endow

One of my clients was trying to run the Professional Services Tools Library (PSTL) Vendor Modifier tool on GP 2010.  When they tested it in their test database, they received this message:


"Could not find stored procedure smVendorChange1"

We checked the database, and the stored procedure did not exist.  As a test, I scripted that procedure from my server and we ran the script against their database--we received errors indicating that other dependent objects were also missing, so it seems that none of the PSTL procedures were present in this test database.

After doing some reading, I found that PSTL should automatically create its stored procedures.  But it clearly hadn't done so with this database.

We then tried using the SQL Maintenance window to recreate the procedures.


Unfortunately, that didn't appear to do anything.  None of the missing procedures were created.

We then logged into a different company database and opened the PSTL window.  PSTL immediately launched its status window and created the stored procedures.  Hmmm.

Puzzled, we logged back into our test database and launched the PSTL window.  It didn't create the procedures automatically.  Out of curiosity, we clicked on the Register button.  The default registration code was displayed, so that didn't seem to be the issue.


But when we clicked on the OK button for the Registration code window, a status window displayed and PSTL started to install its stored procedures!


I don't know why it didn't automatically install them when it launched, but if you run into a situation where the stored procedures are missing or need to be reinstalled, try clicking on Register, then OK, and it should trigger the reinstall process.

With GP 2013 (or 2013 R2?), the PSTL tools are included and do not require a separate install, so I'm assuming this issue is only relevant for GP 2010.


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter