Friday, November 15, 2013

Unable To Open Select Human Resources Windows?

I ran across an odd case when I was on site with a client a couple of weeks ago.  The users could navigate easily to all of the Human Resources windows.  But when they navigated to the Employee Maintenance window, clicked Human Resources, and then tried to click Contact... nothing happened.  Nothing.


After doing some troubleshooting, we found that this issue only occurred when users had access to the Project Accounting alternate form for Employee Maintenance.  If we granted access to the standard Microsoft Dynamics GP Employee Maintenance window instead, we did not have the problem (Administration Page-Setup-System-Alternate/Modified Forms and Reports ID).

The good news in all of this is that the bug is apparently resolved in GP 2010 Service Pack 4, which was recently released.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Thursday, October 24, 2013

When good backups just aren't enough

By Steve Endow

Today I was hoping to start a new development project.  I blocked out my day to focus on the project and was ready to dig into some code.

This morning I fired up one of my Hyper-V virtual servers that was set up for this particular type of development and opened Google Chrome to log in to a web site for the project.  Chrome was set up to load Google.com by default, but when the browser opened, the page didn't display--it said something about being "unable to establish an SSL connection".  Hmmm.  I then tried a few other web sites.  Some would partially load, some would display text but no images, some would display scrambled images, and some wouldn't load at all.

I thought maybe it was a network issue--since I moved recently, all of my servers detected my new network and required me to select the network type before the network started working properly.  But even after checking the network settings, pages still wouldn't load.  I then tried Internet Explorer and Firefox, but both of those browsers exhibited the same strange symptoms.  So it seemed the problem was with the machine, not with any particular browser.

I then tried to install an application on the machine, but the setup.exe would immediately crash.  Hmmm.

I thought there was a small chance that some type of virus or malware had gotten onto the server, although that seemed highly unlikely for this particular VM.  So I tried to install my anti-virus software--but I couldn't install that either--the setup just crashed immediately no matter what I tried.  I tried several other things, but none of them worked.  This virtual machine that had worked fine just a few months ago was now unusable.

So I then pulled up a backup.  Oh, did I mention I'm a big fan of backups?


In addition to the above backups for my workstation files, I have scripts that back up all of my Hyper-V virtual servers every week.  A copy of each VHD is saved to my file server, then the VHDs are compressed using 7-Zip.  A copy of each 7-Zip archive is then saved to an external drive.  And I have two external drives that I rotate each week to a fire safe.

So on my file server I grabbed the VHD backup from 10/11/2013.  I restored the VHD and restarted the VM.  But the same problems existed.

I then went back two weeks, and restored an older VHD, but it had the same issues.  I was then able to resurrect a backup from 9/14, nearly SIX WEEKS old, but alas, even that had the issue.

So this was an interesting case where, despite a very comprehensive backup system that worked great and allowed me to easily restore a copy of a 64GB VHD from six weeks ago, my backups simply didn't go back far enough.

You might think that this is a rare exception, but based on my experience, it's actually fairly common.  I've had clients discover that they have a problem that is several months old.  Or they discover that something was deleted, changed, or overwritten months ago.  In those cases, their four-week backup rotation doesn't help them.  They would have needed monthly archives saved for six or more months.  In my experience, these types of "quiet" problems have been much more common than any dramatic disaster, drive failure, or server meltdown.  Many backup strategies focus on restoring a recent copy of a file, but fail to consider the possible need to restore a much older copy.  That kind of strategy is more difficult, requires more management, and requires more storage space.

But it can be valuable in situations like what I experienced today.


Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Tuesday, October 15, 2013

Year Over Year Comparison in SQL

Either we are slacking, or we have both just been that busy lately! We had a great time at the Technical Airlift, and I am gearing up for the Partner Event (GPCC) and GPUG Summit next week in lovely Tampa, FL.  In the meantime, here is a little script I put together, based on a Victoria Yudin script to pull an account balance, as the back end to a dashboard that displays year-over-year balances by GL account.  So this crosses both open and history years...

--CTE to pull current year account balances by year
WITH CTECurrentYear (Account, AcctDesc, Balance, GLYear) AS
(SELECT a.ACTNUMST Account, m.ACTDESCR AcctDesc,
    ISNULL(SUM(g.DEBITAMT - g.CRDTAMNT), 0) Balance, g.OPENYEAR GLYear
 FROM GL20000 g
 INNER JOIN GL00105 a ON a.ACTINDX = g.ACTINDX
 INNER JOIN GL00100 m ON g.ACTINDX = m.ACTINDX
 WHERE a.ACTNUMST IN ('list out the GL accounts here')
 GROUP BY a.ACTNUMST, m.ACTDESCR, g.OPENYEAR),

--CTE to pull historical year account balances by year
CTEHistoryYear (Account, AcctDesc, Balance, GLYear) AS
(SELECT a.ACTNUMST Account, m.ACTDESCR AcctDesc,
    ISNULL(SUM(g.DEBITAMT - g.CRDTAMNT), 0) Balance, g.HSTYEAR GLYear
 FROM GL30000 g
 INNER JOIN GL00105 a ON a.ACTINDX = g.ACTINDX
 INNER JOIN GL00100 m ON g.ACTINDX = m.ACTINDX
 WHERE a.ACTNUMST IN ('list out the GL accounts here')
 GROUP BY a.ACTNUMST, m.ACTDESCR, g.HSTYEAR)

--Select statement to union both together
SELECT Account, AcctDesc, Balance, GLYear FROM CTECurrentYear
UNION ALL
SELECT Account, AcctDesc, Balance, GLYear FROM CTEHistoryYear
ORDER BY Account, GLYear

Hope to see you in Tampa!

Thursday, September 26, 2013

Dynamics GP SQL Server Composite Primary Key Trivia

By Steve Endow

Coincidentally, after working on a really interesting Dynamics GP SQL Server performance issue earlier this week, I will be having a call tomorrow with another GP partner and GP customer to discuss a different Dynamics GP SQL performance issue.

This time, it involves an eCommerce web site that is trying to stay synchronized with Dynamics GP.  I don't have much information on the integration at the moment, but I do have a list of requests from the web developers.  They are asking that several indexes be added to Dynamics GP.

When I reviewed their request for new indexes, I looked at the GP tables to see what indexes already existed on the tables.

One example is that they requested an index for CUSTNMBR on the SOP10107 table.  The SOP10107 table has a composite primary key (a key consisting of multiple fields), but does not have any additional indexes (at least not on my GP 2010 TWO database).

So the developers are apparently asking for an index on CUSTNMBR, the first field in SOP10107.

But CUSTNMBR is one of the fields in the primary key.  Doesn't SQL Server use the primary key as an index?  I didn't actually know and had to do some research.

The answer is one of my favorites:  It depends.

Apparently, if you query on the first field in a composite primary key, or on all of the fields in the key, the primary key's index can be used for a seek.  But if you query only on the second or a subsequent field in the key, SQL Server cannot seek the primary key's index and will typically fall back to a scan--so the key effectively does not help as an index for those queries.
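A quick sketch of the idea, using a simplified hypothetical table and an assumed key order rather than the actual SOP10107 schema:

```sql
-- Hypothetical table: the composite primary key creates a single index
-- whose leading column is SOPTYPE.  CUSTNMBR is in the key, but not first.
CREATE TABLE dbo.SampleSOP
(
    SOPTYPE  smallint NOT NULL,
    SOPNUMBE char(21) NOT NULL,
    CUSTNMBR char(15) NOT NULL,
    CONSTRAINT PK_SampleSOP PRIMARY KEY (SOPTYPE, SOPNUMBE, CUSTNMBR)
);

-- Can seek the primary key index (filters on the leading column):
SELECT * FROM dbo.SampleSOP WHERE SOPTYPE = 3;

-- Cannot seek the primary key index (CUSTNMBR is not the leading column),
-- which is why the developers would request a separate index:
SELECT * FROM dbo.SampleSOP WHERE CUSTNMBR = 'AARONFIT0001';

CREATE NONCLUSTERED INDEX IX_SampleSOP_CUSTNMBR
    ON dbo.SampleSOP (CUSTNMBR);
```

Comparing the estimated execution plans of the two SELECT statements before and after adding the index makes the difference easy to see.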

I need to track down the MSDN or TechNet documentation to back this up, but I initially found it explained in two threads on Stack Overflow:

http://stackoverflow.com/questions/4381921/composite-primary-key-and-additional-indexes

http://stackoverflow.com/questions/3613103/indexing-individual-fields-of-sql-server-composite-keys


I haven't researched the specific mechanics of the primary key in SQL Server to understand exactly why this limitation exists, but I have speculated about the reason and can understand why it might be the case.

So there you go.  Now you can talk fancy about SQL Server composite primary keys at your next social gathering!  You will be the life of the party.



Wednesday, September 25, 2013

A most EPIC tale of troubleshooting a Dynamics GP SQL performance issue

By Steve Endow

Today was a very good day.  After hours of effort by a team of three consultants, we solved a vexing Dynamics GP SQL performance issue for a client.

I'll start with the conclusion and explain why this was such a great achievement.

1. We actually identified the cause of the performance problem.  For those that have tried to identify the cause of GP running slowly, locking up, or not responding, you know how difficult this can be.  Working together, our team of 3 was able to conclusively track down the specific SQL stored procedure that was causing the problem.  This was quite an accomplishment.

2. It took the combined effort of three consultants with three different areas of expertise and a lot of combined Dynamics GP, networking, and SQL Server experience to identify the cause.  In my experience, it is very, very, rare to have the opportunity to assemble such a team to work on this type of issue, and it was a great experience for all of us.

3. The client was willing to invest in consultants and made the commitment to identify and resolve the issue.  So often I've had situations where a client was plagued with performance issues, but there wasn't a strong commitment to resolve them.  The client will likely spend several thousand dollars resolving this particular issue, but I believe it was a worthwhile investment--or I will once we finally resolve the issue.

And with that, some background...

The Dynamics GP 2010 client contacted their partner in the San Francisco Bay Area a few months ago complaining about performance problems with Dynamics GP.  GP is sporadically slow, and sometimes hangs or stops responding completely.  The "lock up" can last from 5 minutes to 20 minutes and affects multiple users at different times.

The GP partner looks into it, but because the issue is so unpredictable and can't be reproduced on demand, no cause is identified in Dynamics GP or SQL.  Apparently the problem continues and gets even worse.  At this point, the client requests a network expert to try and determine if a network issue is causing the problem.  The GP partner identifies a networking data communications consultant in Denver and flies him out to San Francisco this week.

After two days of network traffic monitoring and traces, the network consultant, working with the GP consultant, is able to capture the actual network packets that are occurring in the seconds leading up to the performance issues in Dynamics GP.  Like a countdown, he is able to list the users, the actions they were performing in different applications, and the data in the network packets that were being issued by those users at the exact time that the problem occurred.

This data indicated that a revenue recognition application used by the client was issuing queries against the Dynamics GP database.  He assembled the contents of the network packets, and saw that the "rev rec" application was querying the Extender "EXT" tables in Dynamics GP.  When implementing Dynamics GP and the revenue recognition system, the client chose to store additional sales order invoice field values in Extender windows that could be used to perform rev rec calculations.  So this all makes sense.

Since the network consultant had uncovered a pile of SQL, the Dynamics GP consultant called me to decipher the SQL activity and determine how it might relate to the performance issue.  Coincidentally, my blog-mate Christina Phillips and I just presented last week at the Tech Airlift 2013 conference in Fargo, and the topic of our presentation was "SQL Skills for Consultants".  While I am by no means a super-SQL-guru, I regularly get calls like this to solve some tricky problems, so I have had a fair amount of experience in this area.

So today I had a few GoToMeeting sessions with the GP consultant and network consultant where we pored over the SQL activity and dug into the Dynamics GP company database to understand what was going on.

We saw that the rev rec application issued a select statement against the EXT00100 table, and then called a custom stored procedure.  Immediately after that stored procedure is called, Dynamics GP locks up for one user.  A few seconds later, GP locks up for another user, and then another.  So the network consultant had identified our culprit:  a custom stored procedure.

After reviewing the network trace data, the GP consultant knew the specific actions that were being performed in the rev rec system and in Dynamics GP, so she was able to get the users to reproduce the issue.  Suspecting that we had a locking issue on our hands, I sent her a slightly modified version of the "dynamic management view" query from this Stack Overflow thread that identifies blocked and blocking processes in SQL Server.


SELECT
t1.resource_type,
t1.resource_database_id,
t1.resource_associated_entity_id,
t1.request_mode,
t1.request_session_id,
s1.login_name, 
s1.program_name, 
s1.host_name, 
s1.nt_user_name, 
t2.blocking_session_id,
o1.name 'object name',
o1.type_desc 'object descr',
p1.partition_id 'partition id',
p1.rows 'partition/page rows',
a1.type_desc 'index descr',
a1.container_id 'index/page container_id'
FROM sys.dm_tran_locks as t1
LEFT OUTER JOIN sys.dm_os_waiting_tasks as t2
      ON t1.lock_owner_address = t2.resource_address
LEFT OUTER JOIN sys.objects o1 on o1.object_id = t1.resource_associated_entity_id
LEFT OUTER JOIN sys.partitions p1 on p1.hobt_id = t1.resource_associated_entity_id
LEFT OUTER JOIN sys.allocation_units a1 on a1.allocation_unit_id = t1.resource_associated_entity_id
LEFT OUTER JOIN sys.dm_exec_sessions s1 ON s1.session_id = t1.request_session_id
WHERE o1.name IS NOT NULL



When she sent me the results of the query, it painted a great picture of the problem.  Here is a screen shot of the query results--I know it looks busy, but stay with me, it's pretty simple to interpret.


Notice that the highlighted row is the only one with a value in the "blocking" column.  That means that an application is trying to perform an operation on the EXT00100 table, but is being blocked by another process.  Notice that the "request_mode" column on that row says "IX".  This means the row in yellow is waiting for an "intent exclusive" lock on the table, which it needs because it intends to take exclusive locks in order to update the EXT00100 table.

When a SQL process is blocked, it typically causes the calling application to wait--it will often just sit there and be completely unresponsive as it waits for the SQL command to finish.  The SQL operation usually finishes in a fraction of a second, but if the process is blocked, it may wait for a minute, or perhaps indefinitely, until the operation completes.

The Dynamics GP consultant knew that particular yellow row was a Dynamics GP user--the user reported that GP was locked up.  So this told us that something was locking the EXT00100 table, and that is why Dynamics GP was locking up--it was waiting indefinitely for the table to become available for an update.

Which leads us to the records for process 178.  178 is the process that is blocking the update to EXT00100.  But why is it blocking?  Well, if you look at the "request_mode" values, most have a value of "S", which means "shared lock".  Shared locks allow other processes to read the locked data, but the data cannot be changed or updated while the shared lock is present.  So process 178 appears to have issued SELECT queries against the EXT tables, with the default shared lock, that are preventing Dynamics GP from updating those tables.

Fundamentally, this isn't a problem--a query can retrieve data from the EXT tables, complete in a fraction of a second, and then the shared lock is released.  But for some reason, the shared lock of process 178 persisted for up to 20 minutes at a time.  During that time, Dynamics GP would become unresponsive if a user tried to update any Extender records.
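You can reproduce this blocking pattern yourself with two Management Studio sessions.  This is only a sketch--it uses HOLDLOCK inside a transaction to keep the shared locks open the way the long-running query effectively did, and the UPDATE values are hypothetical:

```sql
-- Session 1: take shared locks and hold them, simulating the
-- long-running query (shared locks normally release as soon as
-- the statement completes).
BEGIN TRANSACTION;
SELECT * FROM EXT00100 WITH (HOLDLOCK);

-- Session 2: this update now waits on Session 1's shared locks,
-- just as Dynamics GP waited on process 178 (values hypothetical).
UPDATE EXT00100
SET Key_Strings_1 = Key_Strings_1
WHERE PT_Window_ID = 'EXT_REVREC';

-- Session 1: committing releases the shared locks and
-- unblocks Session 2.
COMMIT TRANSACTION;
```

While Session 2 is waiting, running the dynamic management view query above shows the same blocked/blocking picture we saw at the client.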

So, now that we know why GP is locking up, the next question is: Why is a mysterious query against the EXT tables locking them for 20 minutes?  For that, we go back to the network packet trace, and find packet 33823 documented by the network consultant:

Packet 33823 – hq-app1 issues a SQL command to run a Stored Procedure on the SQL Server 

Exec CORP..spREVREC_EXTENDER 'CORP\username', '2013-09-24 03:35:30:35', 'EXT_REVREC'


Aha, now we are getting to the root of the problem.  There is a custom stored procedure called "spREVREC_EXTENDER" that is being called--this procedure is what is locking the tables under process 178.  We deduce that this custom stored procedure was developed by the revenue recognition software vendor to extract data from Dynamics GP for analysis.

We then jump back to SQL Server Management Studio and pull up that custom stored procedure so that we can view its contents.  Except...it is encrypted.  D'oh!

Never fear, as I have a tool that can decrypt stored procedures with a single click.  After some file transfer acrobatics, we finally get a copy of the customer database onto my development machine so that I can use the decryption tool.  And voilà, we can now see the contents of the stored procedure.  And it is a doozie.  It is 160 lines with 2 derived tables and 12 joins.  (Attendees of our SQL Skills for Consultants presentation at the Tech Airlift would know that a good substitute for a derived table is a CTE, and that CTEs can be helpful in improving query performance!)

So now it is crystal clear what is happening.  The stored procedure, with its 2 derived tables and 12 joins, is taking forever to run.  It was developed against a brand new Dynamics GP company database that only had a few records--of course it ran fine back then.  Now that the customer has 100,000 records in the SOP tables and each of the Extender tables, it takes anywhere from 5 to 20 minutes to run.  It's a classic example of how it is often impossible to predict future performance of a complex query, and why SQL query optimization is important.

But that's not all.  It isn't just that it has 12 joins--it's the particular tables it is joining and the fields it is using in the joins.  This is just a small snippet of the query--to anyone familiar with Extender tables, it should be fairly obvious that this isn't going to end well.


from EXT00100 e
INNER JOIN EXT00102 d1 ON d1.PT_Window_ID = e.PT_Window_ID and d1.PT_UD_Key = e.PT_UD_Key and d1.PT_UD_Number = 1 
INNER JOIN EXT00102 d2 ON d2.PT_Window_ID = e.PT_Window_ID and d2.PT_UD_Key = e.PT_UD_Key and d2.PT_UD_Number = 2
INNER JOIN EXT00103 t1 ON t1.PT_Window_ID = e.PT_Window_ID and t1.PT_UD_Key = e.PT_UD_Key and t1.PT_UD_Number = 3
INNER JOIN EXT00101 s1 ON s1.PT_Window_ID = e.PT_Window_ID and s1.PT_UD_Key = e.PT_UD_Key and s1.PT_UD_Number = 6
INNER JOIN EXT00101 s2 ON s2.PT_Window_ID = e.PT_Window_ID and s2.PT_UD_Key = e.PT_UD_Key and s2.PT_UD_Number = 7
INNER JOIN EXT00103 t2 ON t2.PT_Window_ID = e.PT_Window_ID and t2.PT_UD_Key = e.PT_UD_Key and t2.PT_UD_Number = 4
INNER JOIN EXT40102 m 
ON m.PT_Window_ID = t2.PT_Window_ID 
and m.Field_Number = t2.PT_UD_Number 
and m.LNITMSEQ = t2.TOTAL
INNER JOIN EXT00103 t3 ON t1.PT_Window_ID = e.PT_Window_ID and t3.PT_UD_Key = e.PT_UD_Key and t3.PT_UD_Number = 5
INNER JOIN SOP30200 h 
ON h.SOPNUMBE = rtrim(e.Key_Strings_2) 
-- and h.SOPTYPE = convert(INT,e.Key_Strings_3)-1
INNER JOIN REV_REC_MASTER a
ON a.SOPNUMBE = h.SOPNUMBE 
AND a.ORIG_SOPTYPE = h.SOPTYPE
AND a.LNITMSEQ = convert(INT,e.Key_Strings_1)
AND a.VOID_FL = 0
INNER JOIN REV_REC_RULES b
ON b.SOPNUMBE = h.SOPNUMBE 
AND b.ORIG_SOPTYPE = h.SOPTYPE
and b.LNITMSEQ = convert(INT,e.Key_Strings_1)
and b.DELETE_FL = 0
and b.VOID_FL = 0
LEFT OUTER JOIN REV_REC_TRX_RULES r
ON r.SOPNUMBE = a.SOPNUMBE and r.LNITMSEQ = a.LNITMSEQ


I haven't yet dug into the query to determine the specific causes of the performance issues, but one leading candidate is the join with "rtrim(e.Key_Strings_2)".  That is a join using a function on a non-indexed field.  I sense trouble ahead.
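One possible way to make that join sargable--an assumption on my part, not the vendor's actual fix, and note that modifying GP tables carries its own risks--is to persist the trimmed value and index it:

```sql
-- Hypothetical approach: materialize RTRIM(Key_Strings_2) as a
-- persisted computed column so the join can seek an index instead
-- of applying a function to every row at query time.
ALTER TABLE EXT00100
    ADD Key_Strings_2_Trim AS RTRIM(Key_Strings_2) PERSISTED;

CREATE NONCLUSTERED INDEX IX_EXT00100_KeyStrings2Trim
    ON EXT00100 (Key_Strings_2_Trim);

-- The stored procedure's join would then become:
-- INNER JOIN SOP30200 h ON h.SOPNUMBE = e.Key_Strings_2_Trim
```

Whether this helps in practice would depend on the execution plan, and the vendor would still need to change the stored procedure to use the new column.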

At this point, we have identified the cause of the GP lockups and know that we need to have the rev rec vendor modify the custom stored procedure.

I'm guessing that a quick fix would be to simply add the "WITH (NOLOCK)" table hint to the query.  In theory, this should eliminate the shared lock and allow GP to update the EXT tables without issue.  But that would just be covering up the underlying problem--the stored procedure would still take 5 to 20 minutes to run.  And using NOLOCK on a query that runs that long is probably a bad idea.
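For reference, the hint is applied per table, so in the vendor's query it would look something like this (a sketch on a fragment of the joins shown above):

```sql
-- NOLOCK reads without taking shared locks, so GP's updates would
-- no longer be blocked--but the query can return dirty (uncommitted)
-- data, which is risky on a read that runs for 5 to 20 minutes.
SELECT e.PT_Window_ID, e.PT_UD_Key, e.Key_Strings_2
FROM EXT00100 e WITH (NOLOCK)
INNER JOIN EXT00102 d1 WITH (NOLOCK)
    ON d1.PT_Window_ID = e.PT_Window_ID
   AND d1.PT_UD_Key = e.PT_UD_Key
   AND d1.PT_UD_Number = 1;
```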

So we will likely need to work with the rev rec vendor to completely restructure the query so that it runs much faster.  Preferably in just a few seconds.

Despite the work ahead, we felt pretty good by 5pm today, as we were confident that we had found the cause of the client's performance issue, and knew that it could be fixed.

It was a good day.


Postscript:  Some of you may know that with GP 2013, eOne completely redesigned the table structures for Extender.  The tables have different names, and work differently than the prior Extender tables.  So, when this client upgrades to GP 2013, the revenue recognition stored procedure will have to be redesigned yet again to query data from the new 2013 Extender tables.


For more information on different types of SQL locks, here is a good TechNet article on SQL locking:

http://technet.microsoft.com/en-us/library/aa213039(v=sql.80).aspx




How NOT to develop an eConnect integration: Lesson #29

By Steve Endow

I've come across some really bad Dynamics GP integrations.  I had one where the developer made a really pretty user interface with their own company logo and fancy controls and dynamic layout, but the import simply didn't work and the developer didn't know the difference between a debit and credit.

Then I saw another one where the developer wrote their own massive stored procedures to import GL and AP transactions into GP tables (I'm hoping that was done before eConnect existed).  Scary stuff.

I got a call last week about an old integration that stopped working when a customer tried to set it up on a new Windows 7 x64 machine.  The integration fails with two errors.  The first appears to indicate that it encountered an error when it tried to read the source data file.  And the second error says that the Crystal Reports runtime is not installed correctly.

Say what?

I got a copy of the .NET executable and decompiled it to try and find some clues.  I found the section of code that is failing when reading the data file, but it isn't obvious why it is failing--the developer didn't bother to output any error messages in their error handling code, just a useless "There were errors loading data" message.

So then I move on and look for the Crystal Reports reference.  Sure enough, the application relies on Crystal Reports version 10.2.3600.  For those of you with long memories, Crystal Reports 10 is from around 2004.  I have a copy of it, but my version isn't 10.2.3600, so I'm guessing it won't help.  So now we're stuck digging around an old machine to try and find the right Crystal DLLs and see if we can get the import working again, and worrying that there may be some compatibility issue with Windows.

So why does a GP integration use Crystal Reports?  Apparently the developer thought it would be super cool if he used Crystal Reports to display errors.  No joke.  So if the import has an error, the application launches a Crystal Reports runtime window to display the errors...in a Crystal Report.

So in a nice paradox, the application is encountering an error, and it then tries to display the Crystal Report to present the error, but Crystal Reports isn't working, so we get another error, and we are unable to see what the first error is.

They could have gone with a dialog box. They could have used a text box.  They could have written to a text log file.  They could have sent an email.  All of those options would have been much simpler, easier, and worked just fine.  But no, they had to get all fancy and bill the client a few more hours to build a Crystal Report just to display errors.  Face palm.  And now that 2004 version of Crystal Reports is ancient and is preventing the app from working on a newer version of Windows.

If you are developing an eConnect integration, or any simple business application for that matter, please just keep it simple.  There is no need to Bedazzle a basic application like a data import.  And when the application lasts for years and years and needs to be upgraded as the client upgrades GP, the simple application will be easier to upgrade, maintain, and support.

The developer who wrote this integration with Crystal Reports did not impress anyone, yet left a legacy of hassle and frustration due to a poor design choice.




Sunday, September 22, 2013

Want to use cloud computing? Make sure to have an exit strategy!

By Steve Endow

I had lunch with a friend today who manages internal IT systems for a large film and television media company.  He said that he was swamped at work because of an unexpected business crisis:  one of his company's cloud storage service providers is going out of business and he has less than 30 days to try and transfer all of the critical data stored with the provider.  He was not involved in the selection or setup with this provider, but is now having to figure out how to save all of the data.

The fun part:  His company has over 160 TERABYTES of data stored online with this provider.  (It is believed that other customers have many PETABYTES of data stored with this provider, so relatively, 160 TB isn't nearly as bad.)

The provider uses "object storage", so there is no simple way to request all of the files and have them transferred to physical drives.  And the data is encrypted, so even if he is able to move all of the data somewhere else, he will likely have to purchase the same file encryption appliance to ensure that the file encryption and decryption processes work the same.

Despite having a gigabit network connection to the cloud storage service, he's finding that it is taking too long to download the data.  And his company doesn't exactly have an empty 160 TB enterprise grade storage device lying around.  Given the crisis, he's having to simultaneously explore three options to try and save the data.  First, they are trying to download the data to their corporate network, but it is too slow, so that doesn't look too promising.  Second, they are building a new storage colocation setup in the same data center as the cloud storage service to try and utilize the 10 Gbps internal connectivity in the building.  But to do that, they are having to borrow a storage device and encryption appliance from EMC, since they can't purchase one quickly enough.  And the third option is to get set up with an alternate storage provider, such as Amazon S3.  If you have a ton of data on a storage device, apparently you can physically ship the storage devices to an Amazon data center and they can import the data directly to their systems rather than trying to upload it via the Internet.

Given his limited time window, he may have to use all three approaches simultaneously.  Oh, and don't forget that his business still needs to function--they still need a place to store the gigabytes of new data that are constantly being generated while this crisis unfolds.

After learning more about the "cloud" industry and attending several conferences, he shared a key lesson that he picked up from a Gartner Research study:  If you plan on adopting cloud computing, make sure to have an exit strategy.

If you back up or store data to the cloud, how can you retrieve ALL of that data if necessary?  If you have hosted ERP or CRM, how would you handle a complete shutdown of that service provider?  Do you have a plan?  Do you have multiple options?  How long will it take to execute the plan?

If you use hosted Dynamics GP, it could be as simple as getting a copy of your SQL Server database backups and a backup of your hosted Dynamics GP application directory.  Assuming the hosting provider allows you to access or receive copies of such backups regularly.  And assuming you know the process and are ready to turn those database backups into a fully functioning Dynamics GP environment using either internal servers or a new hosting provider.  How long would it take to implement such a plan?
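As a rough sketch of the first step of such a plan--restoring the hosted databases onto an internal SQL Server--with all file names and paths hypothetical:

```sql
-- Restore the Dynamics GP system database from the hosting
-- provider's backup; repeat for each company database.
-- (Logical file names here are hypothetical; check the actual
-- names in the backup with RESTORE FILELISTONLY first.)
RESTORE DATABASE DYNAMICS
FROM DISK = 'D:\GPBackups\DYNAMICS.bak'
WITH MOVE 'GPSDYNAMICSDat.mdf' TO 'D:\SQLData\GPSDYNAMICSDat.mdf',
     MOVE 'GPSDYNAMICSLog.ldf' TO 'D:\SQLLogs\GPSDYNAMICSLog.ldf',
     RECOVERY, STATS = 10;
```

Knowing roughly how long these restores take, and rehearsing them before a crisis, is the difference between a plan and a hope.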

What about hosted CRM?  What about hosted email?  How about your web site?

"Cloud computing" has come a long way in the last 10 years, but it still has some challenges that require prudence and diligence.


Friday, September 20, 2013

Another Great Dynamics GP Technical Airlift, in Scenic Fargo!

By Steve Endow

This week Christina and I attended the Technical Airlift 2013 in lovely Fargo.  The weather mostly behaved, but varied from 80 degrees and foggy to 45 degrees with some rain.

The conference was great, with reviews of the features and enhancements added with GP 2013 SP2, some overviews of upcoming functionality, and even a few sneak peeks of technology improvements that are potentially a few years away.

Christina and I had a very successful presentation, "SQL Skills for Consultants".  We had a full room, a great audience, lots of nifty T-SQL tips and tricks, and some very good questions throughout the presentation.  Several Starbucks gift cards were awarded to the audience for their tips and questions, and there were even a few hecklers who found a few typos in our SQL examples.  ;-)   The room full of partners and consultants kept us on our toes!

I'll try to write a few follow-up posts to answer some of the questions that we had during the presentation.

It was great to meet several new people, catch up with old friends, and learn about the impressive new features and enhancements that are being added to Dynamics GP.



Friday, September 6, 2013

Technical Airlift Here We Come! SQL Tips and Tricks

Hopefully we will see some of our ardent followers at the upcoming Microsoft Technical Airlift for Dynamics GP in Fargo, ND later this month. Steve and I are presenting a session on SQL tips and tricks for consultants. It is definitely not a developer-oriented session; we hope to help front-line consultants and support folks build out their SQL toolboxes and up their game in terms of queries, reporting, and data fixing.

It has been fun working on the presentation, and I thought I would share some of the fun things we have come across in a few posts over the next couple of weeks. Here are a couple of things from the Cool SQL Keywords section of the presentation...

Much like a middle child, BETWEEN is often forgotten, although it saves us precious keystrokes and is so simple to understand. It can be used in place of ">=" and "<=" in a WHERE clause, and it is inclusive of both endpoints.

For example,
 "select * from RM00103 where CUSTBLNC >=10000 and CUSTBLNC <=100000"
Becomes...
"select * from RM00103 where CUSTBLNC BETWEEN 10000 and 100000"

So clear cut, right?

Now, the second one we chatted about a bit was UNION vs. UNION ALL. The little-known fact is that UNION actually eliminates duplicate rows from the combined result. In reality, we use UNION a lot to marry up open, history, and maybe work tables, so we wouldn't expect to have any duplicates, right? But it is worthwhile to know this, as it may cause an issue if you are fixing data and resolving duplicate issues...or don't realize you have some.

So in that case, you may want to err on the conservative side and use UNION ALL which does not eliminate duplicates. Simple enough...

SELECT * FROM PM20000 UNION ALL SELECT * from PM30200
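A quick, contrived illustration of the difference, using throwaway VALUES lists instead of GP tables:

```sql
-- UNION eliminates duplicates across the combined result
SELECT x FROM (VALUES (1), (2)) AS a(x)
UNION
SELECT x FROM (VALUES (2), (3)) AS b(x)
-- three rows: 1, 2, 3 (the duplicate 2 is eliminated)

-- UNION ALL keeps every row from both sides
SELECT x FROM (VALUES (1), (2)) AS a(x)
UNION ALL
SELECT x FROM (VALUES (2), (3)) AS b(x)
-- four rows: 1, 2, 2, 3
```

Note that the duplicate elimination also means UNION does extra work behind the scenes, so UNION ALL is generally faster when you know the sides don't overlap.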

We will share more as it gets closer, and if you are coming to Fargo stop by our session and say Hi!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, August 30, 2013

Checking SQL permissions on many SQL Server objects

By Steve Endow

I am working with a Dynamics GP client that is having an odd issue with an ISV solution related to SQL Server permissions.  The ISV solution is designed to allow only one Dynamics GP login to access its tables and stored procedures--so that only one instance of the application is running at a time.

Normally this works fine and I've worked with dozens of clients without issue.  But one client is having an issue where the application is reporting that a second user has activated and de-activated the application.

We've checked the settings and checked the SQL Server database role used by the application, and everything appears to be set up correctly.  But before I contacted the developer, I wanted to check one last thing--the permissions on the individual SQL objects.  Even though the application uses a database role to manage its permissions, we have seen at least one third-party GP product perform permission updates on every object in the GP databases, causing problems with this application.

So how do you check the permissions on a SQL object?  If you only need to check one or two objects, you can use SQL Server Management Studio.  Just right click on an object, such as a table, and select Properties.  Then click on the Permissions page.

Here is an example of a table used by Post Master for Dynamics GP.  It does not use the DYNGRP role, and you can see that only the rPostMaster database role is assigned to the table.


This approach of checking permissions works, but it is very tedious to do this for multiple objects.  I needed the GP client to check permissions on dozens of tables and stored procedures, and I didn't want to have to send instructions and a list of every object.

So after some Googling, I found some SQL queries that allow you to query the permissions on ranges of objects based on name.

So for tables, you can run this query:

sp_table_privileges 'ESS%'

This displays all of the permissions for tables that begin with ESS, which are the tables used by Post Master.

For stored procedures, it is a little bit more involved, but fortunately someone on Stack Overflow had posted the script:

SELECT OBJECT_NAME(major_id) AS ObjectName, USER_NAME(grantee_principal_id) AS GranteeName, permission_name
FROM sys.database_permissions
WHERE OBJECT_NAME(major_id) LIKE 'zDP_ESS%'


You can modify the queries to look for objects starting with RM, PM, SOP, GL, etc., or all objects.
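For a database-wide sweep, a variant of the permissions query above can drop the name filter entirely (a sketch; class = 1 limits the results to object and column permissions):

```sql
-- All object-level permissions in the current database
SELECT OBJECT_NAME(major_id) AS ObjectName,
       USER_NAME(grantee_principal_id) AS GranteeName,
       permission_name
FROM sys.database_permissions
WHERE class = 1  -- object or column permissions only
ORDER BY ObjectName, GranteeName
```

Exporting the results to Excel makes it easy to sort and scan for anomalies across thousands of objects.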

When you run these queries you get a nice list of all permissions that you can quickly scan to identify anomalies.

I was able to send both queries to the client, and a few minutes later, I had the results and was able to verify the permissions on all Post Master database objects.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Saturday, August 24, 2013

NetSuite vs. Dynamics GP: Navigation: Browser Tabs vs. Application Windows

By Steve Endow

For background, please see my introduction to this series, NetSuite vs. Dynamics GP: A Series

If you are familiar with the Dynamics GP client application, you know that you can navigate throughout it much like any other Windows application.  You have the main application window, and you use menus or shortcuts or other "navigation" methods within the application to open new windows and sub-windows or child windows.

Here I have four different Dynamics GP client application windows open.


In Dynamics GP, it is possible to open a bunch of different windows as you perform different tasks, but in my experience, most users tend to have a handful of windows that they work with regularly.  If you have more than 2 or 3 windows open at a time, it usually gets confusing and switching between them becomes tedious.  Some users keep a few of those key windows open all the time, while others, myself included, like to use one window at a time and close extra windows as soon as they are done with them, always keeping the windows to a minimum.

In NetSuite, things are a little different.  First, NetSuite is a native web application, so you only access it with a web browser.  You can use a single web browser tab and always navigate within that one window, or you can right click on menu items or application "links" and select Open in New Tab.  Just as this feature is very handy during normal web browsing, Open in New Tab is very convenient within NetSuite.  If you are working on a transaction and want to open another window to look up a value or perform some other action, it's simple to do in a new browser tab, not too different from opening another window in Dynamics GP.

Here I have six tabs open, each with a different NetSuite "window".


What is different about the NetSuite browser tabs is that if you start clicking on links in those tabs, you travel down a path and can go just about anywhere within the NetSuite application.  So the tab that was a customer list now displays a customer invoice.  Or the tab that displayed a financial report now shows a journal entry.  When I have multiple tabs open and start clicking links in each of them, I quickly get lost in the different tabs.  I often click a link rather than right-clicking, and before I know it, my tabs have completely different content than they did a minute ago.  Eventually, I have to close several tabs and start over, resetting the tabs to specific windows.

After using NetSuite for a while, I found that the Dynamics GP client navigation model and the NetSuite browser-based navigation each had pros and cons.  In Dynamics GP, when you open windows, you eventually reach a dead end.  For instance, you might open a Sales Transaction window, then open the Customer window, then open the Customer Accounts window.  All of those windows stay in place until you close them, and eventually you reach an end point.  And you can only open one Customer window at a time--if you try to open it again, it pops to the foreground.  This simplifies the user experience somewhat, since each window has a single purpose and a unique name, so you can have multiple windows open and pretty easily know which is which.

With NetSuite, you have the nice feature of being able to right click on a menu item or link and open it in a new browser tab.  You can have multiple customers open in multiple tabs, or multiple invoices, or multiple reports or inquiries.  The navigation is very flexible and fluid, allowing you to go wherever you want from anywhere.  This is pretty appealing, but as I noted, you can quickly get lost in a pile of browser tabs.

One downside of the NetSuite browser-based approach: if you encounter an error when submitting information, it tends to mess up the application "state" in that window.  If you attempt to click the Back button on the error message web page, or use your browser's back button, the page may not be current or valid, and may not work properly until you refresh or navigate to a new page.  When I was testing and debugging a custom SuiteScript solution and encountering errors, such "back" problems became very frustrating, since they required me to navigate back a few pages and restart my testing earlier in the process than I would have liked.  But, despite this, there is the plus that I can log in to NetSuite from any computer with a web browser.

GP 2013 now offers a web-based client, so for now I have only compared the original GP client application to the NetSuite browser-based solution--obviously not a perfect comparison.  I haven't yet set it up on my test servers, so I'll have to revisit this navigation comparison once I get the GP web client set up.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Tuesday, August 20, 2013

Importing Payroll Posting Accounts

Payroll posting accounts can be fairly straightforward, but for some users the combination of code/department/position can create a seemingly never-ending matrix of possibilities--particularly when factoring in the use of ALL for some of the components. Although the Payroll Posting Accounts Setup window is not terribly difficult to use, it is a little unwieldy when trying to enter and proof a long list of variations. Table Import can help with this by allowing users to enter and review the information in Excel.

Now, I am not going to go into detail about using Table Import (Tools-Import-Table Import), except to say that you should always test in a test database first. In the case of importing payroll posting accounts, I would go so far as to build (which should flush out any true errors), calculate, print, and post a payroll (which will flush out anything entered incorrectly).

So, to import payroll posting accounts, you will be importing to the Payroll Accounts Setup table (UPR40500). And you will need the following fields in your upload file (which can be in Excel, and then saved as a text tab-delimited or CSV file).

1. Department (Must match department code exactly in GP)
2. Job Title (Must match position code exactly in GP)

And then we have the three more complicated ones...

3. Payroll Code. In a lot of cases, this would be the Pay Code, Deduction Code, or Benefit Code (based on the UPR transaction type, which is explained next).

But in some cases, this would also be...
Tax Code (State)- When used with the transaction types State Tax Withholding, SUTA Payable, or FUTA Payable.
EFIC/M, EFIC/S, FED, FICA/M, FICA/S - When used with Federal Tax Withholding (the codes that start with E are the employer side)
FIC/ME, FIC/SE, FUTA, SUTA - When used with Employer Tax Expense.

So that brings us to...
4. UPR Transaction Type. This field takes numeric values that correspond to the dropdown list in the Payroll Posting Accounts Setup window. Use the number noted below in your file:

Gross Pay (DR)- 1
Federal Tax Withholding (CR)- 2
State Tax Withholding (CR)- 3
Local Tax Withholding (CR)- 4
Deduction Withholding (CR)- 5
Employer's Tax Expense (DR)- 6
Benefits Expense (DR)- 7
Benefits Payable (CR)- 8
Taxable Benefits Expense (DR)- 9
Taxable Benefits Payable (CR)- 10
SUTA Payable (CR)- 11
FUTA Payable (DR)- 12
Workers Comp Tax Expense (DR)- 13
Workers Comp Tax Payable (CR)- 14

The last field in the file would be...

5. Account Index. This is easily retrieved from the GL00100 or GL00105 table, or you can just add Account Index to the Accounts Smartlist.

It is important to note this is NOT the account number, but the account index--an incremental, unique number assigned to each account in the database. When in doubt, set a few combinations up manually and then run a SELECT * FROM UPR40500 to see how the values populate. This can go a long way toward making your table import successful.

One additional side note: make sure you include all of the required payroll posting account combinations in your upload (refer to the help for the Payroll Posting Accounts Setup window for a list of them).
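As a sketch, here is how you might pull the account indexes for your upload file and review the result afterwards. The account numbers below are placeholders; ACTNUMST in GL00105 (the Account Index Master) holds the formatted account number string, so adjust it to your account format:

```sql
-- Look up the account index for each posting account in your upload
SELECT ACTINDX, ACTNUMST
FROM GL00105
WHERE ACTNUMST IN ('000-6100-00', '000-2100-00')  -- placeholder accounts

-- After setting up a few combinations manually (or after a test
-- import), review how the rows populate
SELECT * FROM UPR40500
```

Pasting the ACTINDX values back into Excel with a VLOOKUP against ACTNUMST keeps the upload file readable while still supplying the index the table expects.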

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, August 16, 2013

Two potential Dynamics GP design limitations for larger or growing companies

By Steve Endow

I recently spoke with a Lawson consulting firm.  One of their Lawson customers acquired a company that uses Microsoft Dynamics GP.  The Lawson client manages 3 companies in their system, while the GP client manages 9 companies.

In Lawson, the 3 companies are managed in a single database.  But in Dynamics GP, the 9 companies are managed in 9 separate company databases.  The client was debating whether to continue to use Lawson, or to transition to Dynamics GP.  The Lawson partner's concern was whether the client would be willing to transition to a system that kept the companies in separate databases and required them to switch companies all the time.

Obviously, there are thousands of Dynamics GP customers that have multiple company databases and manage them without issue.  And I know of several companies that have over 200 active company databases in Dynamics GP.  I am not sure how they handle that many companies and retain their sanity, but they apparently manage.

Anyway, the Dynamics GP design where each company uses a separate SQL Server database is often considered a limitation or weakness for multi-company entities that are evaluating ERP systems.  Some companies don't mind multiple databases, whereas others consider it a significant weakness and hindrance, perhaps due to administration, reporting, data entry, etc.

Sheldon Gitzel recently discussed this issue and a few possible solutions in a blog post on the Etelligent ERP Blog.

So if any GP competitor claims that GP can't effectively manage multiple companies, such a claim is clearly not true--the solution is just different than some larger ERP systems.  Systems that manage multiple companies in the same database may have some advantages, but such a design also has potential disadvantages, such as having to deal with a single very large database.

So that is one potential concern for a large company.

A second Dynamics GP design limitation is lack of global support for "effective dates".  If an employee's pay or benefits or deductions are going to change next month, there is no option to enter those changes in Dynamics GP with a future effective date.  Many "higher end" systems have native global support for effective dating.  With effective dating, an inventory item can be discontinued on a future date, future pay raises can be entered and put into effect automatically, or special prices and discounts can be date-based.  I have only worked with one GP customer that really needed effective dating--they had hundreds of employees and effective dating was critical for their HR and Payroll processes--but as companies grow, such a feature often becomes more important.  If anyone knows of an add-on that allows Dynamics GP to have effective dating, please post a comment and let me know.

Related to the lack of effective dating is that record changes are not tracked in Dynamics GP.  So if an employee or customer or vendor address record is updated, there is no record of the prior value--only the current value is saved.  The customer could purchase an add-on such as Rockton Auditor to track the changes made in GP, but that is an additional $250 per concurrent user and still doesn't provide effective dating.

This isn't intended to be a criticism of Dynamics GP, as these features are typically fundamental design choices and add complexity.  I just wanted to explore two general areas where larger companies might indicate that they need a "larger" or "tier 1" ERP system and claim that Dynamics GP doesn't meet their needs.

Have you observed any situations where a customer felt they couldn't use Dynamics GP because it didn't have a feature of a "larger" ERP system?


Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter




Thursday, August 15, 2013

eConnectException: The stored procedure 'ta______' doesn't exist

By Steve Endow

Tonight a customer e-mailed me saying he was receiving this eConnect error message:

eConnectException: The stored procedure 'taPMManualCheck' doesn't exist

He was trying to use my AP Payment and Apply Import utility to bring in historical AP payments and applications, but the eConnect error occurred as soon as the import attempted to submit the transaction to eConnect.

We verified that the taPM stored procedures existed, and then we verified that the import was sending the data to the correct database name.

We then used SQL Profiler to trace the database activity, and saw that other queries were being successfully executed, but the taPMManualCheck procedure never got called.

I then checked the eConnect service, and it was assigned to a domain user.  We then opened SQL Server Management Studio and checked the permissions for that domain user.

It was a member of DYNGRP in the DYNAMICS database and TWO database, but lo and behold, it was not a member of DYNGRP in the company database for some reason.

Once we checked the box for the DYNGRP role in the company database for the eConnect domain user, the import worked fine.
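For reference, the membership check and fix can also be scripted in T-SQL. This is a sketch--the company database name and the service account below are placeholders:

```sql
USE TWO  -- substitute your company database
GO
-- List the current members of DYNGRP to see whether the eConnect
-- service account is present
SELECT dp.name AS MemberName
FROM sys.database_role_members drm
JOIN sys.database_principals r  ON drm.role_principal_id  = r.principal_id
JOIN sys.database_principals dp ON drm.member_principal_id = dp.principal_id
WHERE r.name = 'DYNGRP'

-- If the account is missing, add it (placeholder user name)
EXEC sp_addrolemember 'DYNGRP', 'DOMAIN\eConnectService'
```

Remember to run the check against every company database the integration touches, not just DYNAMICS.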

A similar issue can occur if the eConnect user is using a different schema, as Patrick Roth discusses in this post on the Developing for Dynamics GP blog.


Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Monday, August 5, 2013

Undocumented eConnect 2010 and 2013 Serialization Flags

By Steve Endow

I just fielded an eConnect question on the incredibly busy Microsoft Dynamics GP Community Forum about an issue with an eConnect vendor import.

The user was trying to set the Vendor Status field on imported vendors, but was unable to set the Status to Temporary.

I opened a .NET eConnect project and checked the properties of taUpdateCreateVendor object and I quickly found a likely suspect for the problem.

In eConnect 2010, the GP documentation and help team did a great job of restructuring the eConnect Help file to make it much easier to navigate.  However, for some reason, they also eliminated one small part of the help file:  the lists of the "serialization flags".

If you open the eConnect 10 help file and pull up the taUpdateCreateVendor record, you will see that some fields have little cross symbols next to them.  Those marks indicate that those fields have separate serialization flags.


At the bottom of the eConnect help page, there was a list of all of the serialization flags.


Once you knew about the serialization flags and knew to be on the lookout for them, this information wasn't terribly important.  Which might be why the serialization flag list was dropped in the eConnect 2010 help file.  But for those poor souls who were creating their first eConnect project in GP 2010 or 2013, this detail would not be at all obvious.

So what is a serialization flag, and how do you use it?


I don't know the back story for why serialization flags were used in eConnect, but my guess is that it was to help reduce the complexity of the eConnect serialization code.  By having flags to indicate which less-common fields were being sent in, the code didn't have to evaluate certain values and could skip certain chunks of serialization by default.

In short, it's just a boolean property that you must include if you are setting the value of a corresponding eConnect field.

In this example, if you send in a value for Vendor Status, you must also send in the corresponding vendor status serialization flag.

With the serialization flag set to true, eConnect serialization will include the VNDSTTS field value in the XML that it generates.

Whenever you develop eConnect integrations, you need to pay attention to IntelliSense to see if a property has a corresponding "Specified" property.  Since those properties are no longer indicated or listed in the eConnect help file, IntelliSense is the only way that I know of to be aware of such properties in eConnect 2010 and 2013.


Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter


Wednesday, July 31, 2013

NetSuite vs. Dynamics GP: Save Transactions vs. Post Batches

For background, please see my introduction to this series, NetSuite vs. Dynamics GP: A Series

One distinct difference between NetSuite and Dynamics GP is that NetSuite allows you to simply save transactions to commit them, while Dynamics GP allows you to save uncommitted transactions to a batch, and then post the batch to commit them.

In GP, if you are entering 5 transactions, you can assign them to a batch, which allows you to save them.  The transactions are uncommitted at this point, and can be edited or deleted, depending on your configuration.  The batch is a temporary bucket that allows you to group the transactions and verify the number of transactions entered, as well as the total dollar amount of the transactions.


The batches can require that users enter control totals, indicating that they have independently verified the number of transactions and the total batch amount.  Dynamics GP can also be configured to require batch approvals based on transaction type, which is also handled by the batch window.

While I can understand the benefits of the batch control totals, it has been ages since I've seen anyone actually use that feature--as in use a 10 key to independently total their transactions.  I'm sure there are a few companies that use the feature, but in my experience, it has been very rare in the last 10 years.  I don't know if this is a shift in attitudes towards data entry controls, or if it is due to a higher comfort level with accounting system reliability, or some other reasons.  But it seems as if the batch control totals may be an anachronism and are not as important to businesses as they once were.

The batch approval seems a little more practical.  If an AP clerk enters 10 invoices, a supervisor can have the opportunity to quickly review the transactions, verify them, and then approve the batch prior to posting.

So that is a quick overview of GP batches.  NetSuite has a very different approach:  no batches at all.

When you enter a transaction in NetSuite, you have the option to Save the transaction.


Once you save the transaction it is committed.  No batch, no separate posting step, no control totals.

Is the lack of batches a benefit?  Is the lack of a posting process a good thing--just one less step to deal with?  NetSuite users would probably furrow their brow if you asked them about batches--why would they want or need them?

Since there are no batches in NetSuite, approvals are at the transaction level.  I'm not yet familiar with the approval functionality, but it appears that you can setup a workflow to obtain approval, which is a nice touch.

What is perhaps more significant about NetSuite is that after a transaction is saved, it can be opened, edited, changed, and even deleted.  This is very different from the more 'conservative' approach in Dynamics GP, where a posted transaction cannot be modified or deleted.  Dynamics GP allows some transactions to be voided, and allows GL journal entries to be reversed, but each of those operations is explicitly tracked, and you have a strong audit trail for those changes.

In NetSuite, if someone opens an invoice, modifies it, and then saves it, or perhaps deletes an invoice, I don't yet know what controls or audit mechanisms exist to track or control such situations.  Given the flexibility and customizability of NetSuite, I'm assuming there are several options for adding controls if desired.  And I'm told that one specific control available in NetSuite is to enable fiscal periods, which can prevent transactions from being modified if they are in a prior or closed period.

This unposted-transaction design appears to be very similar to QuickBooks, which is obviously widely used by many companies, so it certainly isn't unprecedented.  But admittedly, I'm so used to the stricter controls in Dynamics GP that it's difficult for me to imagine users having the ability to edit, modify, and delete transactions.

Written By Steve Endow

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

http://www.precipioservices.com

GL Balance Migration - Net Change -vs- Debit/Credit

Yesterday I was discussing prior month GL balances with a client who went live in May. They were looking back at a specific account's activity for January, February, and March, and they were confused by what they saw. For the sake of argument, let's say that they saw a $100 credit for January, a $50 debit for February, and a $200 credit for March. So they were concerned as to why each month would only have credit or debit activity and not both. Of course, it was the end of a long day and I wasn't very quick on the uptake :)

Normally, when I migrate GL period activity for the purposes of GL history migration, I try to bring in both debit and credit activity for the period. This way, all three columns of the trial balance match the prior system- Debits, Credits, and Net Change for the period. But in this case, someone else had done the migration, and because the prior system didn't easily produce period activity in a downloadable format, only the net change for each period was loaded.

Looking at it from the client's point of view, as someone new to the system, it reinforced for me that migrating both debits and credits may make things clearer when looking back months or years from now. And generally, this takes no additional time in the migration, assuming the data is available. What do you all think? Curious to know how you all handle it?
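To make the distinction concrete, here is a hypothetical query against the GP account summary tables. Table and column names are from the standard GP schema (GL10110 for the open year, GL10111 for history), and the account index is a made-up sample value, so verify against your install:

```sql
-- Period-by-period comparison of debit/credit activity vs. net
-- change for one account (GL10110 = open-year summary; substitute
-- GL10111 for historical years)
SELECT PERIODID, DEBITAMT, CRDTAMT, PERDBLNC AS NetChange
FROM GL10110
WHERE ACTINDX = 23  -- sample account index
ORDER BY PERIODID
```

If only net change was migrated, one of DEBITAMT or CRDTAMT will be zero for every period, which is exactly the pattern that confused the client above.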
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.


Friday, July 26, 2013

Dynamics GP Run-time Error 1007: Unsafe Operation

Several years ago I developed a customization to display the order weight on the SOP Transaction Entry window.

Earlier this week, the client said that they have been getting an error with the customization in a very specific circumstance.

If they create a Quote with a serialized item, and then transfer that Quote to an Order, they will get the following error:


Run-time error 1007:  Unsafe Operation.  An attempt was made to set a value which violates the application's business logic.

If they use a non-serialized item on the quote, the error does not occur.  And they don't get the error in any other process--only when transferring the Quote with a serialized item to an Order.

This is odd for many reasons, but rather than try and figure out why it occurs, we are lucky enough to have KB article 856199 that describes the problem and offers a solution.

In my original code, I was assigning the order weight value to the custom field on the SOP Transaction Entry form:

TotalOrderWeight.Value = FormatNumber(rst.Fields("ORDERWEIGHT").Value, 2, vbTrue, vbFalse)

Well, it seems that this can trigger the error for some reason.  The workaround is to use the Focus method to assign the value, something I had never used before.

TotalOrderWeight.Focus (FormatNumber(rst.Fields("ORDERWEIGHT").Value, 2, vbTrue, vbFalse))

Sure enough, when I used the Focus method, the error went away.  Go figure.

Written By Steve Endow

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter



Windows 7 Remote Desktop Connection error: Remote computer requires Network Level Authentication

By Steve Endow

In the last week, I suddenly had problems connecting to one of my Server 2008 R2 HyperV virtual machines.  I have been using the virtual machine for many months and have not made any changes, but when I attempted to connect using Remote Desktop Connection, I received this error:


"The remote computer requires Network Level Authentication, which your computer does not support."

I knew that the error message related to the Remote Desktop settings on the Server 2008 R2 machine, specifically the "Allow connections only from computers running Remote Desktop with Network Level Authentication (more secure)".  I have my virtual machines set to use the NLA option, but that hasn't been a problem in the past--I've been connecting to the servers fine from my desktop and laptop.

After further testing, I realized that I only had the problem when I attempted to connect using my laptop.  My desktop still connected fine.  Both systems run Windows 7, so both should be able to connect with NLA.

After poking around on my laptop, I found this clue in the About menu of the Remote Desktop Connection app on my laptop.


Note the message "Network Level Authentication not supported".  On my desktop, it says NLA is supported.  So it seems that something changed or broke on my laptop, disabling NLA.

After a few more rounds of searching, I finally came across this TechNet forum thread where user "Millerus" graciously posted the solution that happened to work for me.

The resolution is based on instructions for enabling NLA on Windows XP, which were listed here:

http://www.powercram.com/2009/07/enabling-network-level-authentication.html#

UPDATE: Apparently the URL for the blog post has changed to:

http://blog.powercram.com/2009/07/enabling-network-level-authentication.html


I had previously ignored all Windows XP related information, since I didn't think it would be relevant to Windows 7.  But it appears that something removed or modified a registry entry, disabling NLA on my laptop.

In my case, when I checked the HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders key in my registry, credssp.dll was missing from the SecurityProviders value.  Once I added that DLL to the comma-separated list of security providers, the Remote Desktop Connection app suddenly showed that NLA was supported.  I did not have to reboot my laptop--I just had to close and restart the Remote Desktop Connection app.
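For illustration only, here is a minimal Python sketch of the string edit that this fix performs on the SecurityProviders value, which is a single comma-separated list of DLL names.  The example provider names are assumptions, not an authoritative list, and on a real machine you would make the change with regedit (or Python's winreg module) rather than this standalone function:

```python
def ensure_credssp(providers: str) -> str:
    """Return the SecurityProviders registry value with credssp.dll included.

    The value is one comma-separated string of DLL names, for example
    "msapsspc.dll, schannel.dll" (illustrative names only).
    """
    # Split the value into individual entries, dropping empty pieces.
    entries = [p.strip() for p in providers.split(",") if p.strip()]
    # Append credssp.dll only if it is not already present (case-insensitive).
    if not any(e.lower() == "credssp.dll" for e in entries):
        entries.append("credssp.dll")
    return ", ".join(entries)
```

Running the function twice is harmless, since it only appends the entry when it is missing.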


Finally.  Another hour wasted on some trivial issue.  Now back to work...

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter






AP Payment Application Import for Dynamics GP

One limitation of both Integration Manager and eConnect is that neither tool allows you to import an AP Payment Application into Dynamics GP.  You can import an AP Manual Payment, but you cannot apply that payment to an open AP invoice.

A few years ago a client needed to import a large batch of AP manual payments and apply them to open AP invoices, so I developed a custom import that would apply the AP payments.  Since then, I've been in touch with several more GP users who need to import AP payment applications, so I now offer the import as a product.

The AP Payment and Apply Import can import AP Manual Payments and AP Payment Applications.  The application imports payments from one data file, and applications from a second data file.  Payments and applications can be imported together at the same time, or separately.
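To make the two-file model concrete, here is a hedged Python sketch.  The column names (PaymentNumber, ApplyToVoucher, ApplyAmount, and so on) are hypothetical, not the product's actual file layout; the point is only that each application row references a payment, and the applied total should not exceed the payment amount:

```python
import csv
import io

# Hypothetical file layouts -- the real column names and formats are
# defined by the import product and are not shown in this post.
payments_csv = """PaymentNumber,VendorID,Amount
PMT001,ACME0001,500.00
"""

applies_csv = """PaymentNumber,ApplyToVoucher,ApplyAmount
PMT001,VCH100,300.00
PMT001,VCH101,200.00
"""

payments = list(csv.DictReader(io.StringIO(payments_csv)))
applies = list(csv.DictReader(io.StringIO(applies_csv)))

# Group application rows under their payment so each manual payment is
# imported together with the invoices it pays.
applies_by_payment = {}
for row in applies:
    applies_by_payment.setdefault(row["PaymentNumber"], []).append(row)

# Sanity check before importing: the applied total for a payment must
# not exceed the payment amount.
for pmt in payments:
    applied = sum(float(a["ApplyAmount"])
                  for a in applies_by_payment.get(pmt["PaymentNumber"], []))
    assert applied <= float(pmt["Amount"]), "over-applied payment"
```

This mirrors the "payments in one file, applications in a second file" design: the two files can also be processed separately, since a payment with no application rows simply imports as an unapplied manual payment.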

Here is a brief demo video of the import application:




The only caveat is that the AP Manual Payments must be unposted for the application to apply them to an open AP Invoice--the application cannot currently apply a posted payment to an open invoice.

If you have additional questions about the AP Payment and Apply Import or are interested in a trial, please feel free to contact me at:

http://precipioservices.com/contact-us/



Written By Steve Endow

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter