Thursday, December 22, 2011

Management Reporter Size Limitation

Over the past couple of weeks, I have been working with a client on a series of odd Management Reporter issues.  The first issue they reported happened when they were working with trees.  The user would add a branch to a tree in MR, and then enter a series of dimension codes for that branch.  Randomly, it seemed, the dimension codes would not save on some branches of the tree.  Odd.  The tree branch/unit would be there, but the dimension field would be blank.

So, we spent some time one afternoon trying to figure out some sort of common thread with the disappearing dimensions.  It seemed like different combinations of dimensions would cause the issue.  After talking with Microsoft, it was determined that the issue is a 512-character limitation on the dimension field in the tree.  So, when you add multiple dimensions to a tree branch/unit, the string may add up to 512 characters pretty quickly, since MR includes the dimension name for each dimension you include.

I know that this may be working as designed, but this seems like something that many users would encounter over time, particularly those with large/long account strings.

As a workaround, I suggested that the client use dimension sets on the branches of the tree instead.  And this worked great, except that there appears to be a size limitation on the dimension set as well.  So if the user tries to add too many dimension combinations (apparently also variable based on the dimension name and other characters included in the string), it also causes a series of errors.  In this client's case, they would get one of the following errors:

  • Object is currently in use elsewhere
  • Column ‘ParentAccountSetID, LinkIndex’ is constrained to be unique.  Value ‘8e1f4448-f5c1-461a-b898-f7c7cc572bf2,23’ is already present
In both cases, the error would force Management Reporter to close, and the building block would then be checked out to the user.  The second error message above is possibly a quality report (per Microsoft) and is being analyzed; it does appear to be caused by some sort of limitation on the size of the dimension set.

I know that both of these issues may seem too minor for a lot of clients.  But as I have recently been working on MR reports for a Dynamics AX client with a large account string and a long chart of accounts, I do worry that we might start to hit up against these sorts of issues more frequently.  The workarounds that seem to be available for now include:
  • Keeping dimension sets small
  • Using dimension sets instead of including large lists of dimensions on a tree
  • Using ranges and/or wildcards wherever possible to decrease the number of characters used
  • Using shorter names for your segments, so that they take up less space when repeated in a dimension list
Will keep you posted as I hear back on the status of both of these items.  And, yes, I still love Management Reporter :)

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, December 16, 2011

Helpful Microsoft Forecaster Scripts

I know I have been quite absent from the blogosphere lately, but I hope that Steve has been keeping everyone entertained (or at least keeping you in deep thought over complex technical issues).  I wish I could say that I was on a boat in the middle of the ocean with not a care in the world, or sitting on a beach somewhere with a frosty beverage in hand.  But, alas, that is not the case.  Sadly, my father passed away last week after an 8-month battle with pancreatic cancer.  Cancer sucks.

But, on to something helpful regarding Forecaster.  When working with a large chart of accounts, it can be helpful to use SQL scripts to compare your line sets with your segment setup, or check to see if every department has a line set assigned as an override.  Here are a few scripts I have used to help with these sorts of reviews.

--Identify the assignable segment values (Setup-Segments) that do not have an override line set defined (Setup-Budgets-Input Set-Overrides tab); this does not include assignable segments defined as summary departments.  Note that depending on the position of your assignable segment in your segment definition, you may need to modify this to use a different M_SEG table (e.g., M_SEG2).

select M_SEG1.ID, M_SEG1.Label from M_SEG1 where M_SEG1.ID not in (Select Z_INP_SEG2.SEG1 from Z_INP_SEG2) and M_SEG1.LN=-1

--Identify accounts in line sets (Build-Lines) that are not set up in the master (Setup-Segments).  This is helpful when you have created line sets by copying and pasting, rather than using the wizard and/or inserting from segments.  Note that depending on the position of your main/reversible segment in your segment definition, you may need to modify this to use a different M_SEG table.

select Z_LIN.LINE, Z_LIN.CODE from Z_LIN where Z_LIN.CODE not in (select M_SEG0.ID from M_SEG0)

--Here is the opposite of the script above: this one identifies accounts that are in the account master (Setup-Segments) but do not exist in a line set (Build-Lines), to ensure you caught everything.  The same note about the M_SEG table applies here as well.

select M_SEG0.ID, M_SEG0.LABEL from M_SEG0 where M_SEG0.ID not in (select Z_LIN.CODE from Z_LIN)

I am looking forward to getting back in to the swing of things here on the blog, and I hope everyone is enjoying their holiday season!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Monday, December 12, 2011

Decisions Fall 2011 On Demand Presentations Now Available

In case you missed the excellent MS Dynamics World Decisions Fall 2011 virtual conference, all of the presentations are now available for replay on demand.

There are 12 Dynamics GP presentations available, plus presentations for AX, NAV, and CRM!

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Thursday, December 1, 2011

GP Developer Tip: List All Parameters for a SQL Server Stored Procedure

Occasionally I work on a Dynamics GP integration where I have to call a Dynamics GP stored procedure from a .NET application.  As always, there are several ways to do this, but I typically use one of two approaches.

In situations where the procedure returns one or more values in OUTPUT parameters, I usually like to call the stored procedure directly in the method (rather than relying on a separate call to a data access class).  In this case, I use the very, very cool DeriveParameters method:

SqlCommand gpCommand = sqlConn.CreateCommand();
gpCommand.CommandType = CommandType.StoredProcedure;
gpCommand.CommandText = "zDP_PM10200SI";

//Get all of the parameters from the stored procedure
System.Data.SqlClient.SqlCommandBuilder.DeriveParameters(gpCommand);

When you call DeriveParameters, .NET interrogates the stored procedure and builds a list of all of the parameter objects for you.  This saves a lot of typing, and ensures that all of the parameters are properly declared.
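As a rough sketch of the full pattern, here is how the pieces fit together end to end.  The connection string is a placeholder, and the parameter assignment is commented out because the actual zDP_PM10200SI parameter list isn't shown here; DeriveParameters requires an open connection when it is called.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class DeriveParametersSketch
{
    static void Main()
    {
        // Placeholder connection string--replace with your own server and database
        using (SqlConnection sqlConn = new SqlConnection(
            "Data Source=GPSQLSERVER;Initial Catalog=TWO;Integrated Security=SSPI"))
        {
            sqlConn.Open();

            SqlCommand gpCommand = sqlConn.CreateCommand();
            gpCommand.CommandType = CommandType.StoredProcedure;
            gpCommand.CommandText = "zDP_PM10200SI";

            // Interrogate the procedure and build the full parameter collection
            SqlCommandBuilder.DeriveParameters(gpCommand);

            // Assign values by name (this parameter name is illustrative only,
            // not an actual zDP_PM10200SI parameter):
            // gpCommand.Parameters["@SOMECOLUMN"].Value = someValue;

            gpCommand.ExecuteNonQuery();

            // Any OUTPUT parameters are populated after execution and can be
            // read back from gpCommand.Parameters the same way.
        }
    }
}
```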

On the other hand, when I don't need to get return values from a stored procedure, I use my data access class, which requires an array of SqlParameter objects that includes the parameter name and data type.  In this case, I have to assemble the array and assign the SqlParameters.

SqlParameter[] sqlParameters = new SqlParameter[29];
sqlParameters[0] = new SqlParameter("@EMPLOYID", System.Data.SqlDbType.VarChar, 15);
sqlParameters[1] = new SqlParameter("@ID_Accrual_Code", System.Data.SqlDbType.VarChar, 11);
sqlParameters[2] = new SqlParameter("@ID_Accrual_Code_2", System.Data.SqlDbType.VarChar, 11);

In this case I need the parameter name and data type.  Some stored procedures are simple and only have a few parameters.  But there are a few that have dozens of parameters, which makes this type of coding a tedious chore.

So to make this process easier, I wanted to get a list of all of the stored procedure parameters.

It turns out that this is pretty easy--just use sp_help:

sp_help zDP_PM10200SI

This provides a very nice parameter listing, with data types, that can be copied and pasted into Excel or a text editor, and then easily converted into code.
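Another option, sketched below, is to combine the two tips and let DeriveParameters generate the declarations for you: derive the parameters once against a live connection, then print ready-to-paste SqlParameter lines in the array style shown above.  This assumes an already-open SqlConnection named sqlConn, and note that Size will be 0 for types that don't carry a length.

```csharp
SqlCommand cmd = sqlConn.CreateCommand();
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = "zDP_PM10200SI";
SqlCommandBuilder.DeriveParameters(cmd);

int i = 0;
foreach (SqlParameter p in cmd.Parameters)
{
    // Skip the @RETURN_VALUE parameter that DeriveParameters always adds
    if (p.Direction == ParameterDirection.ReturnValue) continue;

    // Emit one line per parameter in the same style as the hand-coded array
    Console.WriteLine(
        "sqlParameters[{0}] = new SqlParameter(\"{1}\", System.Data.SqlDbType.{2}, {3});",
        i++, p.ParameterName, p.SqlDbType, p.Size);
}
```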

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Monday, November 21, 2011

Importing 10 Million Transactions Into Dynamics GP: At the Decisions Fall 2011 Virtual Conference

The Decisions Fall 2011 Conference is in two weeks, with the Dynamics GP day on Tuesday, December 6!

Registration is free and only takes a minute.  This is a virtual conference, so no travel, no hotel, no convention centers--the entire conference can be accessed using your web browser.

I'll be giving a presentation at the conference titled "Importing 10 Million Transactions Into Dynamics GP: Lessons Learned".  It is based on a large project that I worked on over the last few years where I've gained a new perspective on what matters most for an environment with very high volume, complex integrations. 

In addition to great presentations, there will be several vendors in the virtual conference hall presenting their enhancements for Microsoft Dynamics GP.  I will be at the Envisage Software Solutions booth showing customers the very popular Post Master add-on for GP, which fully automates the batch posting process, as well as the very powerful Search Master product, which allows you to instantly find any data in any of your Dynamics GP databases.

Please take a minute to register and devote a few hours of that Tuesday to learn more about Dynamics GP!

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Friday, November 18, 2011

My Favorite Payroll Support Articles

It seems like the vast majority of payroll issues are related to taxability, which in turn is related to the setup of the payroll module.  So I thought I would share a couple of my favorite payroll support articles related to taxability. 

How to correct overwithholding of payroll taxes

I have used this article time and time again.  It can be broken down into three key steps:
  1. Refund the overwithheld taxes
  2. Correct the tax summary information for total taxes and taxable wages
  3. Adjust the pay code used to pay back the taxes
Correcting differences in federal wages on the 941 -vs- Payroll Summary report

This article explains how both reports calculate, and the subtle differences between them.  The key is that the 941 report uses the current tax status of deductions to recalculate taxable wages while the payroll summary relies on the taxable wages calculated at the time the payroll was posted.

To minimize tax issues in payroll in GP, you should (in my humble opinion)...
  • Ensure that the payroll setup tax flags for pay codes, deductions, and benefits are set up correctly from the start (Setup-Payroll-Deduction, Benefit, Pay Code)
  • Resist the urge to change tax flags at the individual employee level (Cards-Payroll-Pay Code, Deduction, Benefit), simply because it becomes more complicated in terms of the variations in taxability that could exist for a single code
  • Ensure that all pay codes that are marked as SUTA taxable also have a SUTA state specified (this is a required field when manually setting up a pay code, but the behavior can vary when rolling down pay code assignments)
  • Ensure that a default state tax code is specified for the employee; this is DIFFERENT from setting the employee up for a state tax (Cards-Payroll-State Tax).  You need to go to Cards-Payroll-Tax and make sure a default state tax code is specified for transaction entry.
  • Print your 941 after every payroll when you first go live, and validate the results.
Have a great Thanksgiving!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Tuesday, November 15, 2011

Consultant Tools Series: Travel Power Outlets

Nearly every hotel I've stayed at the last several years has had one limitation:  Not enough power outlets.  There may be one outlet available on the desk, or two if I'm lucky, but sadly, that isn't enough to handle the laptop, cell phone, bluetooth headset, iPad, and hotspot.  I have often had to charge my cell phone using a bathroom power outlet and leave my laptop powered on all night so that I can charge something else via a USB port.

I was poking around the electronics section of one of those massive discount retail stores a few months ago and I came across a gadget that looked pretty interesting.  It is a mini power-strip of sorts that provides three full outlets, but what is unique is that it also includes two USB charging ports.

I brought it home, and to break it in, I tried it on my overburdened kitchen counter power outlet, where we have a pile of iPhones, an iPod, an iPod Touch, and a few other devices.  It worked perfectly, allowing us to use two USB cables to charge phones, freeing up two outlets.

This week I had a chance to try it on the road.  The desk in my hotel room has two outlets in a recessed compartment that would have made it difficult to use two of my chargers, so the power strip worked like a charm.  I was able to charge my iPhone and hotspot via USB, and then plugged my laptop and bluetooth headset into the power outlets.

Since I have a MacBook Air with only 2 USB ports, one of which is always used by my Logitech wireless mouse receiver, my one remaining USB port is pretty valuable, and can now remain available.

The only minor downside is that the power strip is a bit bulky, but given its value, I don't mind carrying it in my luggage.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Thursday, November 10, 2011

One Massive Flaw in SugarSync: It is not a backup solution...

I've been a huge fan of SugarSync since I started using it several years ago.  It will automatically back up files on my desktop and synchronize them with my laptop, and vice versa.  It even works with my mobile phone so that I can easily access frequently used files when I'm away from a computer.  It also lets me share files or entire folders with other people.  It's not perfect, but it's been nearly perfect for me.  But today I discovered an interesting flaw.  Admittedly, this is probably an unusual situation, and I'm waiting to hear back from SugarSync Support on whether this is a bug or a highly unusual fluke, but it definitely got my attention.

So with that out of the way...

My desktop machine has been misbehaving lately, primarily in the form of Blue Screens.  After further digging, it seems that there is some type of issue with one or more hard drives or the motherboard.  To diagnose the issue, I unplugged all drives except the C: drive.  I tried booting with just the C: drive, and although Windows will load and work for a while, I am still getting the blue screens.

Well, a funny thing happened while I was doing those tests.  Well, maybe not so funny.

On my desktop, I have a dedicated C: drive with only Windows and Program files.  I store all of my user data and files on a D: drive.  That way, if I ever need to reinstall Windows, I can just wipe the C: drive and not worry about losing any data.  And naturally, I have SugarSync back up all of my files on the D: drive and synchronize them to my laptop.

It seems that when I disconnected the D: drive on my desktop, SugarSync decided that I had deleted all of my files.  And I mean ALL OF MY FILES.  Apparently since it could no longer see the D: drive, it sent messages up to the SugarSync servers telling the Mother Ship that ALL OF MY FILES were deleted.

And then what happened?  Well, when I fired up my laptop, SugarSync on my laptop dutifully downloaded all of the synchronization commands to delete ALL OF MY FILES from my laptop.

At the time, I was looking for some files on my laptop and noticed that one directory was missing.  I checked my backups on my local server to confirm the files existed, and by the time I switched back to my laptop directory listing, everything was gone.  Tens of thousands of files and thousands of directories were wiped from my laptop.

I just stared at the screen in disbelief, thinking that Windows Explorer wasn't refreshing or that I was going blind.  But nope, everything had been deleted.  In a panic, it took me about 30 seconds to realize what had happened.

Initially, the first level SugarSync support rep quickly claimed that when SugarSync detects that a drive is missing, it will automatically "disconnect" the synchronized folders, whatever that means.  But clearly that didn't occur in my case, and the SugarSync web site shows all of my files have been deleted.  I am now waiting for them to review my log files and see if they can figure out what happened.  The support rep foolishly closed the chat session with "...please remember that you should not disconnect a drive on which the SugarSync folders are present", which is a preposterous statement, especially since he initially claimed that SugarSync would "disconnect" the backed up folders if a drive was not found.

And thinking about the obvious real world situations, what if you want to back up data on an external USB drive?  Does SugarSync suddenly delete files everywhere when you disconnect that drive?  What if my D: drive had died?  Will my laptop be wiped out when that happens too?  It doesn't make any sense that the files should be deleted when a drive is no longer detected.

Since I rely heavily on SugarSync (or at least I used to before this happened), I apparently need to test all of these scenarios to assess the damage that may occur.

Fortunately, I am only highly annoyed by all of this, primarily because of the time it has wasted and will continue to waste until I get a resolution.  I'm not going insane for a few reasons:

1. It appears that all of my files are still present on the SugarSync servers, but are marked as deleted.  So I'm guessing they should be able to revive them, assuming their second level support is more competent than their first level reps.  I could double click on the deleted files to restore them myself, but I am waiting for them to figure out what happened before I touch anything.

2. My D: drive on my desktop did not die, I just had to unplug it, so my files are still intact on that drive.  But I am now very wary of plugging the drive back in, should SugarSync decide that it needs to wipe that drive as well.

3. I also use Carbonite to back up all of the files handled by SugarSync, plus many, many more, so I have another copy on Carbonite's servers.  Apparently Carbonite does not have the same flaw as SugarSync and does not immediately delete my files when the drive is not connected.  But Carbonite sometimes gets backlogged with my photos and other large files I back up, so files that I change regularly may be several days old on the Carbonite server.

4. Every evening a scheduled task runs on my desktop that uses RoboCopy to back up my files from my desktop to my file server.  Unfortunately, because of the issues with my desktop, it looks like it has been a few days since that ran successfully.  So I do have another copy of everything, but several files will be a few days old.

So this has been a good lesson about an ironic downside to a seemingly fantastic backup solution.  And it's been a good, albeit unwanted, test of my neurotic multi-layered backup strategy.  It seems to work, but like most things, it isn't perfect.  I now have some clear validation that you can't have too many backups...

UPDATE 1:  I have since had several rounds of discussions with the very mediocre support at SugarSync, and they have essentially confirmed that SugarSync is designed to delete all of your files on all of your synchronized computers if a hard drive is no longer detected or a drive fails.  They have effectively said that SugarSync isn't a backup solution--it's a synchronization solution, and if a non-system drive fails on any of your synchronized computers, then SugarSync is supposed to consider all of the files on that drive deleted, and that all of those files should therefore be deleted on every other computer that is synchronized.  Of course this is preposterous, as I think anyone would agree there is an obvious difference between a hard drive disappearing on a system and a file being deleted.  Heaven forbid if you change the drive letters on a computer--I assume that would cause SugarSync to delete everything as well.

All of the deleted files are technically available to recover via their web site, but you have to recover each sub-directory individually.  There is no way to recover a directory and all of its sub-directories.  For serious users with thousands or even just hundreds of directories, this is a nightmare.

UPDATE 2:  Since I now know that I can't rely on SugarSync to safeguard my data, I now use Acronis TrueImage to make a full image backup of my D: drive, along with the existing full image backup of my C: drive.  Because you can't have too many backups!

UPDATE 3:  A reader, Loren M., saw my post and appears to have experienced the same hassle I did, so it seems that this is not a random issue.  And as he points out, the irony is that a casual / non-critical SugarSync user would probably never experience this issue--only the serious power user with multiple hard drives and multiple synchronized computers would encounter it.  Here is his message:


It is interesting you would note this.  Unfortunately I can conclusively confirm that SugarSync has not fixed the problem since you reported it, since I fell victim to the same fate with terrifyingly similar results.  I also discovered a few additional tidbits of information… To go through the reasons you mentioned point by point and why I think people should still be worried:
1. First, take no comfort in the idea that SugarSync allows you to recover your deleted items.  I also had thousands of directories and subdirectories on my system.  The initial problem in trying to recover these files is that each and every one of these subdirectories must be recovered individually – there is no way to select one directory and all the subdirectories below it.  Each subdirectory (and underlying subdirectory) must be opened, all the files selected, and a restore started.  This alone may take the better part of one’s life to work through.  However, even those willing to suffer through this will be disappointed with the results.  I found major sections of my file structure were no longer recoverable at all – the subdirectories simply did not exist on the SugarSync server.  They were just gone.

2. In my situation, the removed drive was only disconnected for an hour or so.  It automatically remounted itself, but by the time it came back online, SugarSync had already decided to delete all the synchronized files from my three other computers.  When the offline drive came back, it then “synchronized” with the deleted computers and everything on it was deleted as well.  If the reason you or anyone is using SugarSync is to have a “backup” copy, you should stop using SugarSync right now, even if you have paid for it.  Once you have SugarSync up and running, it is the opposite of a backup – loss of your document drive on any of your synchronized computers will ensure it will be deleted from each and every computer you are syncing to.  This is the worst possible “fail deadly” configuration imaginable for a “backup” system, yet it is exactly how SugarSync operates, and this can be proven by repeatable testing.

3. The only possible way to save yourself from the horrors of SugarSync is to back up all of your data somewhere else.  I happened to be using CrashPlan, which is the only way I got my data back.  If anyone out there is using SugarSync – and I can’t stress this enough – a separate backup tool must also be used which does not rely on SugarSync to protect data.  Keep in mind, however, that if the backup tool is running in the background or on some schedule, there may well come a time when SugarSync has deleted all of the data from the drive being backed up – recovery of deleted files from the backup service is the only possible route at this point, but this too can be problematic, since the backup service has no way of discerning what was intentionally deleted vs. what SugarSync deleted when it went crazy.

4. In a scenario eerily similar to Steve’s, I was using Allwaysync to synchronize all my files to my file server.  Unfortunately, I had it set up to “synchronize” instead of copy, and since deletions were synchronized, this copy was promptly deleted as well.  I have since switched to a “copy” mode; however, it is less than ideal, since it means nothing I legitimately delete will ever be deleted from the server, and I’ll have to manage this additional copy manually just to make sure I don’t suffer the “SugarSync Suicide” again.
My takeaway from this whole experience was that SugarSync is not a backup solution – it is a synchronization tool with a nasty penchant for destruction of file systems.  To anyone using SugarSync - you need a separate backup solution (and a very good one) to protect yourself from what SugarSync will eventually do to you if your synchronized drive goes bad.

Another insidious aspect of this problem is that “casual” users of SugarSync, or those who are simply using the free few GB as a trial before purchasing more storage, are highly unlikely to ever experience a problem.  However, paying customers, who are more often than not synchronizing their entire document folders (on a separate drive), will almost certainly be hit by this problem at some point, and it will be when they are most vulnerable -- after a drive failure.

Good luck, and let’s hope SugarSync addresses these problems soon!

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Tuesday, November 8, 2011

Dynamics GP 12 Web Client Coolness

There was a lot of talk at GPPC about the new web client for Dynamics GP 12.  It really got me to thinking about how we might incorporate it into our deployment strategies, and which customers (and prospects) might benefit from it.  Of course, there are those who ask for it directly (I see it on requirements lists constantly), but it still leaves me wondering, practically, how the web client will be used.

I will not pretend to know how to explain the architecture and technology that goes into delivering the web client, but I can appreciate the effort and complexity of what they have accomplished with it.  And here are the practical tidbits I have taken away from the keynote and sessions I have attended here in Vegas:
  • The web client represents a move to a 3-tier architecture-- a presentation layer, an application/logic layer, and the data layer.  From a non-technical perspective, I think the key concept is that the user interface (UI) has been separated from the application/logic layer.  So we can have these windows in the web client that use the same application/logic as the hard client, without the need to replicate/duplicate all of the logic contained in Dynamics GP today.  Not the most technical answer, I know, but I have to distill these things down to the basics for me :)
  • The web client windows will include ribbons across the top to initiate actions, and hopefully will also contain sub-windows on separate tabs.  And, just like the hard client, you will be able to have multiple windows open at the same time in the web client (woohoo!).
  • Windows will be dynamically generated using Silverlight, which means all of your Dexterity customizations and third party products will be available, including macro capability.  The only hitch in this right now is VBA, which will not transfer to the web client.  Microsoft is working on some ideas on how to address this, due to the technical complexity.
  • I asked specifically about Word Templates and emailing of documents, as these are two items that sometimes complicate terminal server deployments (since Word and Outlook have to be on the terminal server, in the same environment as GP).  These are also issues that are still being worked out, and there may be some assumptions about Outlook and/or Word on client machines that are using the web client (which seems reasonable to me).
As far as technology goes, if you plan on supporting and deploying the web client, these are the technologies that Microsoft recommends that you familiarize yourself with:
  • IIS
  • Dexterity
  • Silverlight
  • Web Services
  • Internet Explorer (including security)
  • XML
  • Visual Studio (potentially a tool for modifying the templates used to generate the web client windows)
And, of course, you will need to think about how the web client will be used (for your clients, your prospects, or for yourself)  and what processes in GP will be initiated via the web client.

Well, that is my totally non-technical post on the web client.  I personally am very excited by the potential, and can't wait to see more.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

GP12 Update From The Partner Connections Conference

The past three days have reminded me of just how old I am.  Vegas wears me OUT!  And I am not very good at gambling, either.  I have been learning and mingling at the GPPC Partner Connections conference in Las Vegas since Saturday.  I presented a couple of sessions on Sure Step, and have sat in on sessions covering GP12, the new web client, demo tools, and the fabulous SQL report builder.  A very informative and fun few days.

I thought I would share a few of the most notable enhancements I heard about in GP12.  Exciting stuff is coming!
  • Business Analyzer can appear on the home page (Cool!)
  • Customizable area pages (Also cool!)
  • Select printer at the time of printing (oh, my, so super cool that it garnered applause...long time coming)
  • Reprint PM check remittance (much-requested)
  • Print SQL reports from a Dynamics GP form (I can't help but wonder if this is a precursor for SQL reports to be able to replace a Dynamics GP report similar to Word templates)
  • Subledger reconciliation for Bank Rec to GL (also much-requested)
  • Prepayments on purchase orders (yay!)
  • Bank rec void indicator, so if you try to void a check that is reconciled you will get a message (will help so many customers avoid this issue!)
  • BAI format for EFT
  • Document attachment (not quite document management, but it is a cool start...with the documents stored in the SQL server)
  • PO tolerance handling (I get so many questions about this, so I am very excited!)
Also, big changes are coming in Fixed Assets; it sounds like a real overhaul, including:
  • Historical depreciation reporting
  • Mass depreciation reversal
So, when will we have a chance to see these enhancements in action?
  • Feb 2012- TAP program
  • Mar 2012- Convergence
  • July 2012- Beta program
  • Sept 2012- GP Technical Airlift
  • Oct/Nov 2012- Launch activities
Also, I should mention one of the biggest enhancements--the new GP web client!!  Will post more on that later.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Saturday, November 5, 2011

Simplifying Your Passwords

A few months ago, I came across three great items regarding passwords.

The first is an excellent comic from XKCD.  It helps to debunk a common misconception about passwords:  that passwords must be "complex" in order to be effective.  Or perhaps more accurately, it reframes the concept of "complexity" with regard to passwords.

It makes the great distinction that "hard to remember" (for humans) and "hard to guess" (for computers) are two very different things, and demonstrates that it is possible to have a password that is easy for you to remember, yet very secure against brute force attacks performed by a computer.

It's an excellent explanation in comic form, and great lesson about how to think differently about passwords.

The second resource is an informative article and tool by Steve Gibson.

Steve's "password haystack" concept is insightful, and is very similar to the XKCD lesson.  Steve's calculations with "search space" are different than XKCD's calculation with entropy, but Steve explains that in terms of brute force password guessing (versus attacking the underlying encryption algorithm or keys), it's the search space that matters, not entropy. 

And the key lesson is that increasing the search space is MUCH easier than increasing the entropy. 

He provides a nice demonstration comparing two sample passwords:



Which one do you think is more "secure"?

Which one is easy to remember and type?

The first password, "D0g", followed by a bunch of periods, theoretically wins on both counts.  As he points out, the first one may have much less entropy, but when it comes to brute force password crackers, the only aspect of entropy that matters is making sure that you are using at least one character from each "type": uppercase letters, lowercase letters, numbers, and symbols.  Once you have at least one of each of those (preferably more than just 4 characters), you can then start using padding to dramatically increase your search space.
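To make the math concrete, here is a quick sketch (my own illustration, not code from the article) comparing brute-force search spaces. The sample passwords are hypothetical stand-ins for the two styles being compared:

```python
# Search-space comparison in the spirit of the "haystack" argument.
# With all four character classes in play, an attacker must assume a
# ~95-character printable-ASCII alphabet for every position.
ALPHABET = 95

def search_space(length: int, alphabet: int = ALPHABET) -> int:
    """Guesses needed to exhaust every password up to this length."""
    return sum(alphabet ** n for n in range(1, length + 1))

short_pw  = "PrXyc.N54"        # 9 chars, high entropy, hard to remember
padded_pw = "D0g" + "." * 21   # 24 chars, low entropy, easy to remember

short_space  = search_space(len(short_pw))
padded_space = search_space(len(padded_pw))
print(f"{len(short_pw)}-char complex: {short_space:.2e}")
print(f"{len(padded_pw)}-char padded : {padded_space:.2e}")
```

The padded password's search space is larger by many orders of magnitude, even though it is far easier to remember and type.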

The third item is a comment that a friend made when I discussed this topic with him.  He works a lot with IT security, and he pointed out that "password" is semantically flawed.  We should refer to them as "pass-phrases".   If we can transition away from the idea of using a single "word", to phrases that can contain multiple words, it should increase the search space, and also increase the ease of memorization.

Together, I think these provide a great basis for how we should start thinking about passwords, and how we should educate users about passwords.

Users hate passwords like "dqkGx^D,c=41S5a", but something like "Fargo Is #1!!!" can be memorized very quickly, and can be recalled very easily.

So, having learned all of this, how do I use it?

I have been using RoboForm for securely managing all of my passwords, so in theory, I only have to remember one "master password" for RoboForm.  I can then let RoboForm use high entropy, difficult to remember passwords, like "dqkGx^D,c=41S5a" for my various web site logins.  But on the rare occasions when I have to manually log in to a site without RoboForm, those cryptic passwords are a hassle, so I may just convert most of my passwords to "haystack" style passwords.

Unfortunately, there are probably some applications or web sites that will make it difficult to use these passwords, such as ones that may limit password length, and others that require combinations of passwords and PINs.  And there are quite a few "random password" generators that are widely used (including the one in RoboForm) that don't support this methodology, so you will need to come up with your own technique for generating the pass-phrases and using padding.
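As a starting point for rolling your own, here is a minimal sketch of a "haystack" style pass-phrase generator. The word list and format are entirely my own assumptions--the point is simply to combine memorable words, at least one character from each class, and cheap padding:

```python
import secrets

# A sketch of a home-grown "haystack" pass-phrase generator: a couple
# of memorable words, a digit and a capital letter for character-class
# coverage, then padding to inflate the search space.
# The word list is a placeholder -- substitute words memorable to YOU.
WORDS = ["correct", "horse", "battery", "staple", "fargo", "piano"]

def make_passphrase(num_words: int = 2, pad_char: str = ".", pad_len: int = 8) -> str:
    words = [secrets.choice(WORDS).capitalize() for _ in range(num_words)]
    digit = secrets.choice("0123456789")
    return "-".join(words) + digit + pad_char * pad_len

pw = make_passphrase()
print(pw)  # e.g. Horse-Staple7........
```

Note that `secrets` (rather than `random`) is the right module for anything security-related, since it uses the OS's cryptographic random source.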

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Thursday, November 3, 2011

A Tale of Two BSODs: Diagnosing Windows Blue Screen of Death

Ever had a Windows machine display the Blue Screen of Death?  By amazing coincidence, blue screens showed up on both my desktop machine and a client's production Dynamics GP Terminal Server in the same week!  I got fed up with the cryptic errors and finally decided to learn how to diagnose the infamous BSOD.

A few months ago I built a new desktop machine. Although there were a few quirks with 64-bit Windows 7, it seemed to work well.  Until the day when I started to get the dreaded Blue Screen error.

Having dealt with blue screens occasionally over the years, my general interpretation is that once you start getting them on a machine, they don't tend to go away on their own.  Sure enough, my desktop started to blue screen a few times a week.

While at my desk, I saw the blue screen occur and flash on my monitors, but my computer instantly rebooted, preventing me from seeing the message.  By default, Windows 7 and Server 2008 are set to automatically restart when a "System failure" occurs.  This option is set under System Properties -> Startup and Recovery Settings.  The first change I made was to disable the automatic restart option so that I could know when the blue screen occurred and see the error messages.
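For what it's worth, that option is backed by a registry value, so if you manage several servers you can script the change rather than clicking through System Properties. To the best of my knowledge this is the standard location (verify on your OS version before relying on it):

```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\CrashControl" /v AutoReboot /t REG_DWORD /d 0 /f
```

Setting AutoReboot to 0 disables the automatic restart; run the command from an elevated prompt.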

Sure enough, the blue screens showed up again a few days later, but unfortunately, the message displayed wasn't very helpful.

Sometimes you will get lucky and see a specific driver listed, like "ETRON_USB_3", which can tell you immediately that a third party USB 3 driver is causing the problem.

But in my case, since a specific driver wasn't listed on the blue screen, just a cryptic "STOP" error, I didn't have any clues as to a possible cause.  I figured that my only option would be to try and reinstall Windows, which isn't on my favorite-things-to-do list.  So I put it off and just ignored the occasional crash, knowing I would eventually have to deal with it.

Then, the other evening, while connected remotely to a client's server, I was suddenly disconnected.  When I was able to reconnect, I saw a message indicating that the server had experienced a blue screen and had restarted automatically.

Figuring that two systems with blue screens in the same week was too much of a coincidence, I took it as a challenge to learn how to diagnose the cause of the dreaded BSOD.

To my surprise, it turns out that it is shockingly simple to get diagnostic information about the BSOD error--once you know which tools to use and how to use them. 

In the Startup and Recovery options in Windows, there is an option to "Write debugging information".  In the latest versions of Windows, the default setting is to write a "Small memory dump", also known as a "minidump". 

When Windows encounters a "system error", it writes certain diagnostic information to this memory dump file explaining the specific area of the operating system that caused the crash and possible causes of the problem.

I naively thought that reading memory dumps was some type of complex process that only wizards at MS Support could perform, but to my surprise, there are several tools available to make the diagnostic process extremely easy.

I found this blog post by the famous Mark Russinovich, which got me started on the "old school" method of reading the debug files using the Microsoft WinDbg utility.

There are a few challenges with this approach.  First, you have to figure out which version of WinDbg you need for your OS, and Microsoft seems to want to make it as difficult as possible to get just that one tool.  You either need to download the Windows Driver Development Kit (MSDN subscription and login required), or you have to download the Windows SDK just to get one little EXE file.  It's absurd.  You then have to try and figure out the extremely arcane tool, since it is obviously not designed to be a polished consumer-friendly product.

Anyway, I jumped through all of these hoops, installed WinDbg, and read my minidump files.  Immediately, the tool showed me the cause of the blue screens: 

Probably caused by : memory_corruption

Wow.  With a few clicks, I was able to determine the cause of a blue screen!  It seemed like magic.

So I ran MemTest on my workstation, and sure enough, it instantly showed memory errors.  Since I know just enough to be dangerous when it comes to these things, I figured that it was possible that my memory was physically fine, but that there was something else that was causing the issue.

I booted into the BIOS settings and disabled the XMP memory profile, which has the memory operate at a faster speed.  Sure enough, once I saved that setting and ran MemTest again, no errors.  I tried changing various settings with XMP enabled to see if I could get XMP working, but I couldn't get rid of the errors, so for now I'm running sans-XMP, which is fine for the relatively simple tasks that I perform on my desktop.

Feeling very confident after conquering my first BSOD in a matter of minutes, I then decided to diagnose the BSOD on the client's production Dynamics GP Terminal Server.

I launched WinDbg, set the debug symbol path, and loaded the minidump, and, well, unfortunately the results weren't quite as simple as they were for my workstation.

Probably caused by : Ntfs.sys ( Ntfs!NtfsDeleteFile+8d3 )

So what does this mean?  My interpretation is that something about a file delete operation caused the server to crash.  Later on, there is a reference to iexplore.exe, which is Internet Explorer:

PROCESS_NAME:  iexplore.exe
ERROR_CODE: (NTSTATUS) 0xc0000005 - The instruction at 0x%08lx referenced memory at 0x%08lx. The memory could not be %s.

This is where an expert would be required, since this just doesn't seem to be enough information to determine a specific cause.  My only interpretation is that some aspect of Internet Explorer is somehow causing the crash. 

So for now, no magic solutions like what I found with my workstation, but we at least have some information that we can use to monitor the server.

Having gone through the process, here's a summary of what I learned:

1. It seems that WinDbg comes in two flavors, listed on this MSDN Dev Center web page.

The newer version (6.2.8102 8/23/2011) is included with the Windows Developer Preview WDK (MSDN subscriber only), which seems to work fine for Windows 7 minidump files (and perhaps Server 2008 R2).  But when I tried to use this version on the client's Windows Server 2008 (not R2), it spewed a bunch of complaints about being unable to load ntoskrnl.exe. 

So, for Windows 2008 and versions prior to Windows 7, there is a second, older version (dated 2/1/2010), available in the Windows SDK.

When you install the Windows SDK, you only need the Debugging Tools, and can uncheck all of the other options.

2. In order to properly read the dump files, you need to first set the symbol path.  The Russinovich blog post mentions one (a), but I found a different one on a forum thread that seemed to work better for me (b).

a)  srv*c:\symbols*

b)  SRV*C:\WebSymb*

Version (a) worked on my Windows 7 machine with the newer WinDbg, but I had to use version (b) with the older WinDbg.
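For reference, both values follow the standard `srv*<local cache folder>*<symbol server URL>` syntax. The canonical public form, with Microsoft's symbol server spelled out, is:

```
srv*c:\symbols*https://msdl.microsoft.com/download/symbols
```

(Older tools from this era may require http rather than https for the server URL.)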

3. To perform the debugging, launch WinDbg, select File -> Symbol File Path and paste in one of the symbol paths from above.  Then select File -> Open Crash Dump and select your minidump file.

4. Wait a few seconds for WinDbg to analyze the file and display results.  Hopefully you see something like the following, including the helpful "Probably caused by" note:

*  Bugcheck Analysis  *
Use !analyze -v to get detailed debugging information.
BugCheck 24, {1904aa, c941b6a0, c941b39c, 9247f5fc}
Probably caused by : Ntfs.sys ( Ntfs!NtfsDeleteFile+8d3 )

5. You should see a link on the text "!analyze -v". If you click on that link, it will display more information that may help you further diagnose the problem.  It's all pretty cryptic looking, but a technical person or developer should be able to pick out a few clues.

6. There are apparently other tools that are much easier to use than WinDbg, but may not be as comprehensive.  I quickly tried one called BlueScreenView that is amazingly simple and easy to use.  The only downside is that it doesn't appear to offer the "Probably caused by" note provided by WinDbg.  Once you get familiar with the typical errors, you may not need that helpful message, but I still need that pointer, so for now I'll stick with WinDbg.

My experience has been that blue screen errors are pretty rare these days, but obviously they do still occur.  If you are feeling adventurous, hopefully this information helps you navigate the relatively simple process of doing some initial diagnostics on your own before rebuilding a server or paying for a support case.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Thursday, October 27, 2011

TXSHANTY and Tax Sheltered Deductions

Whew! Back from vacation and a whirlwind week before and after!

I have been battling a bit of an odd issue for a client regarding FUTA and SUTA taxable wages.  Let me start by giving you some background:

1. Deduction configured as non-TSA, no tax sheltered boxes marked (Setup-Payroll-Deduction, Cards-Payroll-Deduction)

2. Activity posted using the deduction code

3.  Deduction setup changed to now be TSA, with three of the tax sheltered boxes marked (Setup-Payroll-Deduction), and the change rolled down

4.  Deduction maintenance not updated with new tax sheltered settings, as there was already activity on the code (Cards-Payroll-Deduction)

So, the bottom line is that the employee deduction maintenance (Cards-Payroll-Deduction) showed no tax sheltered checkboxes marked but the setup did (Setup-Payroll-Deduction).  Normally, this would not be a problem.  New employees would get the new settings while older employees would not.  But then the weirdness began :)

When printing the FUTA and SUTA summary reports, it was noticed that the wages were off by the amount of these deductions.  Both FUTA and SUTA were set up (Setup-Payroll-Unemployment Tax) to NOT include any TSA deductions as wages.  But, in this case, the deduction in question was NOT sheltered.  So it should NOT have been excluded from wages. 

If we changed the FUTA and SUTA setup to include the deduction as wages, it fixed the employees who did NOT have tax sheltered flags marked.  But it created another problem, since it also now included the deduction as wages for employees who DID have the tax sheltered flags marked.  Eek.  Fun.

So why was it sheltering the deductions that were NOT marked as tax sheltered?  All 'cuz of TXSHANTY.  Many thanks to Michelle Blaser at Microsoft, who asked the question and had us check this setting.  In the UPR00500 table there is a field called TXSHANTY.  This is a boolean field, with a zero if NONE of the tax sheltered checkboxes are marked (SHFRFEDTX, SHFRFICA, SHFRSTTX, SHFRLCLTX) or a 1 if ANY of the tax sheltered checkboxes are marked.

In the case of these employees that had deductions that were NOT marked as tax sheltered, the TXSHANTY setting was 1 although the tax sheltered checkboxes were all zeroes  (SHFRFEDTX, SHFRFICA, SHFRSTTX, SHFRLCLTX).  How could this happen?

Well, with a little testing with Microsoft we found that the rolldown from the setup did not roll down the tax sheltered checkboxes (SHFRFEDTX, SHFRFICA, SHFRSTTX, SHFRLCLTX) because there was activity, but it was still updating the TXSHANTY field behind the scenes.  The interesting part of this is that there is no logical situation where TXSHANTY would be 1 while all of the other checkboxes are zero.
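In other words, TXSHANTY should always be derivable from the four checkbox fields. A little sketch of that invariant (using the field names as described above) shows why a 1 with all flags at zero should be impossible by design:

```python
# TXSHANTY should simply be the logical OR of the four shelter flags.
# This sketch expresses the consistency rule that the rolldown bug
# violated (field names as described in the post).
def expected_txshanty(rec: dict) -> int:
    flags = ("SHFRFEDTX", "SHFRFICA", "SHFRSTTX", "SHFRLCLTX")
    return 1 if any(rec[f] for f in flags) else 0

# The broken records looked like this: all flags 0, but TXSHANTY = 1.
broken = {"SHFRFEDTX": 0, "SHFRFICA": 0, "SHFRSTTX": 0, "SHFRLCLTX": 0, "TXSHANTY": 1}
print(expected_txshanty(broken), "vs stored", broken["TXSHANTY"])  # 0 vs stored 1
```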

So we used the following scripts to identify and correct the affected records:

--Run select statement to verify number of records to be affected
select * from UPR00500 where TXSHANTY=1 and SFRFEDTX=0 and SHFRFICA=0 and SHFRSTTX=0 and SFRLCLTX=0
--Run update statement to set Tax Sheltered Annuity field to 0 where are all Tax sheltered tax fields are 0
update UPR00500 set TXSHANTY=0 where TXSHANTY=1 and SFRFEDTX=0 and SHFRFICA=0 and SHFRSTTX=0 and SFRLCLTX=0
--Verify number of affected rows against select statement rows

Once we did this, FUTA and SUTA now calculate correctly per the settings in GP.  On an interesting side note, we also found that something in GP might be resetting the TXSHANTY to zero, although we could not figure out what-- we just noticed that over time, with no apparent cause, some records would show as zero.  Odd.

Microsoft is working on writing this up as a quality report, but I thought I would share it with you all in case you run across the same oddity!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Wednesday, October 12, 2011

Zero Columns Not Aligning In Management Reporter

We had an odd little issue pop up in Management Reporter.  One of those little oddities that doesn't seem like much, but is annoying nonetheless.  Here is an example of the issue, in a calculated column that is rounded to 1 decimal place:


Notice the problem?  Look closer.  Do you see it?  The 0.0 value does not align with the rest of the numbers.  Why?  Well, the issue is that the 0.0 is actually being rounded from -0.02 (for example).  So when it rounds to 0.0 it is not aligned correctly in the column.  If it was -0.06 it would be fine, as that would round to -0.1. 
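The negative-zero artifact is easy to reproduce outside of Management Reporter. This little Python illustration (analogous behavior only, not MR's actual code) shows how rounding -0.02 to one decimal yields a signed zero that formats one character wider than a true zero:

```python
# Rounding a small negative to one decimal produces a "negative zero",
# which formats differently from a true zero -- the same flavor of
# misalignment seen in the MR column.  (Illustrative, not MR's code.)
values = [12.3, -0.02, -0.06, 4.5]
for v in values:
    print(f"{round(v, 1):>8.1f}")
# -0.02 rounds to -0.0 (an extra character for the sign),
# while -0.06 rounds to -0.1 and aligns like any other negative.
```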

A small issue, sure, but one that can be quite annoying in presentation-quality financials.

Fortunately, it is a fairly simple fix.  Set the Print Control for the column to NP, and then add a new calculated column with this formula:


Replace B with the original column that was set to NP.  This eliminates the issue with the negative value rounding to the out-of-alignment 0.0.  This has been logged as a quality report with Microsoft.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Word Templates and Synchronization

Yet again, the Developing for Dynamics GP blog has saved me!  Having an issue with a field not printing on a Microsoft Word template, my forehead was bruised from beating my head against a desk...and then I stumbled upon this great little article.  What a lifesaver if you are having issues with fields not printing on the template, although they do print properly on the report in GP.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Sunday, October 9, 2011

Did You Know? Extender Setup

Yes, I am one of those people that read and re-read the training manuals each time I teach a class or train a customer.   So, I was rereading the Extender manual a few weeks ago, and came across an interesting tidbit that many of you may already know.  It is recommended that the Auto-Update SmartLists option in the Extender Options window (Microsoft Dynamics GP-Tools-Extender-Options) only be marked in one company.

This is due to the fact that SmartLists are system-wide, while Extender windows are company-specific.  The recommended workaround for this is to mark the option in the "main" company, complete all of your Extender configuration in that same company, and then export/import the necessary windows to the other companies.  There is a KnowledgeBase article that describes the issue and the recommended workaround.
In my opinion, this is one of those things that I wouldn't have even thought to worry about.  But it can cause a lot of issues, including inaccurate results in SmartList, if you do have the Auto-Update option marked in multiple companies.

Would love to hear from anyone who has run into issues related to this setting, and how you have managed the "develop all windows in one company" approach when Extender is being used extensively in multiple companies in the same installation.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Adding Fields To Word Templates

Sometimes things are much easier than you would even think possible.  I know, I know, hard to believe, right?  But a great example of this is adding fields to a Word Template in Microsoft Dynamics GP.  So, for example, let's say that I want to add the order date to the SOP Blank Invoice Word Template.

The Report Writer report actually feeds the XML for the Word Template, so for the field to be available in the Word Template, you must first add it to the report in Report Writer (Microsoft Dynamics GP-Tools-Customize-Report Writer).

Add the field to the correct section of the report, although it doesn't matter cosmetically where it appears. You will also want to make sure that you have security to print the modified report, to ensure that it is pulling the proper data (Microsoft Dynamics GP-Tools-Setup-Alternate/Modified Forms and Reports ID, User Security).

Next, open the Report Template Maintenance window (Reports-Template Maintenance).  Click on the Report Name dropdown, and select More Reports to open the Reports menu.

Select the Product (in this case, Microsoft Dynamics GP), the Series (in this case, Sales), and the Status should be Modified.  Selecting Modified for the Status matches the template to the modified report, enabling the additional fields to be pulled on to the template.  Click Select to return to the Report Template Maintenance window.  Then click the New button to create a new template.

You can choose to create a Blank Template, or From Existing Template.  If you choose From Existing Template, select the Template to copy from.  Even if you copy from an existing template, the additional fields from the modified report will still be available to be added to the template.

Next, click Modify to open the selected template in Microsoft Word.  Note that the Order Date field is now available for selection.

Now you can add the field to the template as needed.  Easy, huh?  The downside is that you do need to know Report Writer in order to modify the underlying report so that the field is available in the template.  But the good news is that it is a rather simple process to make the new fields available on the template.

Good luck! And feel free to share any other tips and tricks you have run across while working with the Word Templates.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Thursday, October 6, 2011

Join Us At The Dynamics GP Partner Connections Conference 2011

Some more information from the fabulous folks at GPUG/GPPC! Please consider joining us in Las Vegas Nov 6-8 for learning and networking with fellow partners, ISVs, and Microsoft!

The Dynamics GP Partner Connections (GPPC) Conference 2011 is a unique opportunity for Dynamics GP Partners and ISVs to build connections, share best practices, pick up new skills, develop new or improved strategies to drive revenue, and meet with Microsoft Dynamics GP leaders and influencers to help prepare for and shape the future of Dynamics GP.
Event: GPPC Connections Conference 2011

Date: November 6-8, 2011

Location: Caesars Palace, Las Vegas, NV

Register: (Save $100 when you sign up by October 1)
The agenda for Connections 2011 includes more than 30 breakout sessions led by fellow GP Partners, MVPs and subject matter experts – including roundtable discussions, interactive workshops and how-to sessions – designed to enhance your GP knowledge and competencies. Additionally, structured networking opportunities are a prime opportunity to develop and strengthen alliances with leading Dynamics GP partners. The event also features an ever-popular Microsoft Town Hall meeting – a rapid-fire Q&A session where you can ask your top GP questions of key Dynamics GP leaders, including Anders Spatzek, Errol Schoenfish, Pam Misialek, Chris Lerum, Chad Sogge, Mark Albrecht and Jeff Hensel.
Who Should Attend?

Dynamics GP consultants, developers, technical sales engineers, practice managers, project managers, and business leaders.
Learn More and Register

You can learn more about GPPC Connections 2011 by visiting the conference website. There you’ll find the event agenda, registration information, and complete session descriptions. Note that there is a $100 registration discount that ends October 1st – sign up today and book your travel to take advantage of low registration and air fare rates.
Connections 2011 is held in conjunction with GPUG Summit, the annual conference for the Dynamics GP User Group. Come for Connections and stay for Summit, where you'll have a chance to network, learn and mingle with GP users from around the country. Summit is happening Nov 8-11, 2011 at Caesars Palace – visit the GPUG Summit website for more information.

Saturday, October 1, 2011

Virtual Town Hall Meeting for Microsoft Dynamics GP Partners

Hey Partners! Interested in learning more about upcoming changes to the MPN?  Here is your chance!  Many thanks to GP Partner Connections (GPPC) for arranging this virtual town hall meeting!  "See" you on the call-- Christina

Microsoft Dynamics GP Partner Connections (GPPC) Town Hall Meeting with Microsoft’s Jeff Edwards
Join Microsoft's Jeff Edwards, Director of Channel Strategy, for the next GPPC virtual Town Hall Meeting on Wednesday, October 19 at 1:00pm (ET). In this radio talk-show style call, GPPC Director Kim Peterson will talk to Jeff about hot topics that affect Partners and ISVs. This meeting will focus on the MPN transition to Pay for Performance, which will take effect in January 2012. Dial in to understand Microsoft’s perspective on how the changes are landing and the impacts and opportunities for the Dynamics GP Channel.

You’re also invited to submit your own questions that will be covered in the partner Q&A portion of the call. Please submit your questions prior to the event with the subject line ‘Jeff Edwards.’

Event: Dynamics GP Partner Town Hall Meeting with Jeff Edwards

Date: Wednesday, October 19

Time: 1:00 - 2:00pm (ET)

Cost: Free


Who Should Attend? All Microsoft Dynamics GP Partners and ISVs are welcome to attend the Town Hall Meeting.
About GPPC Town Hall Meetings: It’s more important than ever for partners to have open connections with Microsoft executives. GPPC wants to play an important role by being an advocate to both partners and Microsoft. That’s why we’ve created this exciting series of quarterly Virtual Town Hall Meetings. Dynamics GP VARs and ISVs are invited to this exclusive virtual event on a quarterly basis. The event will help build competencies in the Partner community and provides a means of keeping key Microsoft executives in tune with the issues and topics partners care about.
About Jeff Edwards: Jeff is responsible for Microsoft's ERP Partner Strategy which includes partner program (MPN) requirements, Partner branding and Partner compensation.
About GPPC: GP Partner Connections (GPPC) is a peer-to-peer, independent professional networking group for Dynamics GP Implementation Partners and ISVs. GPPC is dedicated to increasing the knowledge and competency of Implementation Professionals, Consultants, and Technical Sales Engineers. The group provides a conduit for all parties to hear, listen, plan and move forward together.

Friday, September 30, 2011

Limitations of the Dynamics GP eConnect Transaction Requester

Dynamics GP eConnect has a "feature" called Transaction Requester that, among other things, allows it to compile a list of all records or transactions that have been inserted, updated, or deleted in Dynamics GP.

eConnect Requester fundamentally consists of SQL Server database triggers that monitor the Dynamics GP database tables and record any inserts, updates, or deletes made to those tables.  When the triggers detect a change, they write a record to the eConnect_Out table.

If you want, you can also use the eConnect Outgoing Service, which can output XML files or send XML to a queue in MSMQ. 

eConnect Requester is a feature that is very valuable for integrations where Dynamics GP is the system of record and data needs to be sent from Dynamics GP to an external system.  For instance, if an external system needs the ability to create purchase requisitions, you may want to automatically send all new or updated vendor records and inventory items from GP to that system so that they only need to be maintained in GP.  Or if you use a third-party logistics vendor or fulfillment center, you may need to send purchase orders or sales invoices to those facilities so that they can receive and ship your products.

This all sounds great, and in concept it is, but there is one fundamental limitation to eConnect Requester.  If you look at the eConnect_Out database table, you will see that it really only tells you which table was updated, the index value (document number) for the record, and whether the record was inserted, updated, or deleted.

This approach typically works fine for two types of database records:  Master records, such as customers, vendors, items, GL accounts, etc., and transaction records that are posted and fully committed, such as posted GL journal entries, posted vouchers, posted receivables transactions, etc.

This means that eConnect Requester has some drawbacks with transactions that are not posted, such as Purchase Orders.  Purchase Orders are somewhat unique because they can be created, saved, edited, approved, printed, released, changed, printed again, voided, and deleted.  This is a challenge because eConnect Requester doesn't track these higher-level document "states" or statuses--it only tells you whether a new record has been inserted into a table, whether a record has been updated, or whether a record has been deleted.

This is a limitation / challenge / problem when you have to export purchase orders.  Let's say that you are sending purchase orders to an external 3PL warehouse that will be receiving your inventory.  The warehouse needs to receive any new purchase orders that have been entered into Dynamics GP.  But not every PO.  You probably don't want to send unapproved POs.  And you probably don't want to send unreleased POs.  So when your export process sees a PO record in the eConnect_Out table, you can't just assume that it is ready to be exported.  You have to check the status first and determine whether the PO has been approved and released.
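In practice, that means the export process needs its own gating logic on top of eConnect_Out. A sketch of the idea (the status fields here are illustrative placeholders, not GP's actual column names):

```python
# eConnect_Out only says "a PO row changed" -- the export process has
# to look up the PO itself and gate on approval/release status.
# Status fields below are illustrative, not GP's actual schema.
def ready_to_export(po: dict) -> bool:
    return po["approved"] and po["released"] and not po["voided"]

queue = [
    {"ponumber": "PO001", "approved": True,  "released": True,  "voided": False},
    {"ponumber": "PO002", "approved": False, "released": False, "voided": False},
]
exportable = [po["ponumber"] for po in queue if ready_to_export(po)]
print(exportable)  # ['PO001']
```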

So what if the PO is then changed?  Maybe a line is added, another line is cancelled, and one or more quantities are updated.  Your 3PL warehouse needs to know about these changes so that they can receive against accurate purchase orders.  The warehouse doesn't want the entire PO sent to them after every change--they only want the changes sent to them.

Quickly you will learn that eConnect Requester can't tell you about any of those things.  Even though Purchase Order Line (POP10110) is an option in the Requester Setup, it turns out that the Requester only tells you that a PO record in the POP10100 table has been updated.  You have to figure out what changed on that purchase order and whether that change needs to be sent to the 3PL warehouse.  And unfortunately, eConnect Requester offers no help there.

One approach is to store a copy of every PO line that has been exported.  That way you essentially have a full copy of the PO and all of its lines and quantities, so if anything changes on the PO, you can compare the old values to the current values to determine what has changed.  This may seem like a fair amount of overhead just to export POs, but it isn't difficult, works well, and has the benefit of serving as an audit trail for the export process, allowing you to verify that records were exported properly.  But, you have to write the queries and code that will perform all of the comparisons, and handle all of the logic to determine what to export when a PO is changed.  Trust me when I say that this can get very, very tricky for reasons you would least expect.
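The compare-against-a-stored-copy approach can be sketched like this.  Again, this is a minimal illustration under assumptions of mine--in a real integration the "lines" would come from POP10110 and from your export history table, keyed by PO number and line sequence, and the comparison would cover far more fields (quantities, sites, dates, statuses).

```python
def diff_po_lines(exported, current):
    """Compare previously exported PO lines to the current PO lines.

    Both arguments map a line number to a (item, qty) tuple.
    Returns a list of (change_type, line_number) tuples describing
    what needs to be sent to the 3PL warehouse.
    """
    changes = []
    for ln, data in current.items():
        if ln not in exported:
            changes.append(("added", ln))      # new line on the PO
        elif exported[ln] != data:
            changes.append(("changed", ln))    # item or qty changed
    for ln in exported:
        if ln not in current:
            changes.append(("removed", ln))    # line deleted/cancelled
    return changes
```

The stored copy does double duty: it drives the change detection, and it is your audit trail showing exactly what was sent and when.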

Although this "limitation" does seem burdensome, eConnect Requester can't fulfill every requirement, so it fulfills the basic ones, and you are free to develop a solution that meets your specific business requirements.

Overall, eConnect Transaction Requester is a great, reliable resource for automating the export of new data from Dynamics GP--just make sure that you are aware of the limitations before you jump into a project that requires it.

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Dynamics GP Reminders Make GP Load Very Slowly

This week I have been working with a client who is setting up a new SQL Server in preparation for their Dynamics GP 2010 upgrade.

One strange behavior that he observed is that when he launched Dynamics GP 2010 directly on the SQL Server, it would take over 5 minutes to load.  During that time, the GP window would be blank and have the famous "Not Responding" message in the title bar.

However, when GP was launched on a new Terminal Server, it "only" took about 100 seconds to load.  Since we were troubleshooting other issues with the SQL Server, we thought that the server might be the problem.  But after running SQL Profiler and tracing the Dynamics GP activity during the load process, we discovered that GP was running over 50,000 queries against the database before the application was usable.  Once the queries stopped, GP seemed fine.

We traced the activity when launching GP on the Terminal Server, and the same 50,000 queries were being run, but for some reason it took less than 2 minutes for GP to fully launch on that server.

What originally seemed like a SQL Server issue now seemed to be a GP issue.

Reviewing the queries, we saw that they were against the service contract table, and GP was requesting every single record in that table, one record at a time.  After scratching his head for a while, the client realized that the sa user had a reminder related to service contracts.  Sure enough, after deleting that reminder, GP loaded fine, and loaded quickly.

This issue isn't new, but because the symptoms can vary dramatically from machine to machine, and because reminders for one GP user can be completely different than reminders for another user, it isn't always obvious that reminders are causing the problem.  And when this client's reminder was originally set up, it may have worked fine, but as they added thousands of contracts over the years, it would have gradually caused GP to become slower and slower.

I'm not a fan of the Reminder pop-up window that appears after login anyway, so after seeing how certain reminders can cripple GP, it is just another reason to avoid their general use. 

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Thursday, September 29, 2011

How do you cope with the torrent of email?

Several years ago I was out of town at a conference and I didn't have any internet access during the day.  This was before smart phones were in the hands of every teenager, and before 3G wireless internet cards or personal hotspots were common.  And I was staying at a cheap hotel that at the time didn't provide internet access.  (Gasp! Can you imagine!?!?)

Yes kids, it was the Stone Age.

Anyway, after my conference and dinner, I headed to the nearby Starbucks around 8:45pm to use their (then paid) WiFi service.  I fired up my laptop, got connected to the internet, and then opened Outlook to deal with the day's e-mail.

Over 100 e-mail messages slammed into my Inbox, and then the Starbucks employee informed me they were closing in a few minutes.  I sat in my rental car in the parking lot in front of Starbucks, with a weak WiFi signal, trying to respond to messages.  After a few minutes, I realized it was futile.  That was when I realized that there was a specific limit to the number of e-mails that I could handle in a day.  It's an arbitrary number, but 100 a day is where things definitely start to fall apart.

If I were doing nothing else but handling my e-mail, I could easily handle the messages coming in every 5 minutes.  And sure, many are probably the dreadful CC messages that infect every Inbox.  But the ones that required my attention would still potentially take several minutes to respond to and fill my day.

So now let's head back to the real world where I actually have to get lots of real work done completely independent of e-mail, and where I am not paid solely to pounce on every e-mail message as soon as that little Outlook envelope icon appears in my system tray.

Today I was away from my computer at meetings for 3 hours and I received 30 e-mails during that period.  My iPhone was vibrating constantly during my meetings and while I was driving.  30 messages may not sound like a lot, but I have so much actual work piled up with looming deadlines that even if I didn't touch my e-mail at all, my day (and evening!) would be completely full.  Consider the 75 e-mail messages that I received today, and yesterday, and the day before, and I have a pile of e-mail on my hands.  I've received over 1,400 messages so far this month, and although I know there are people who receive far more than that, managing all of those messages represents a major "time suck" for me.

So why am I writing a blog post instead of doing work?  Valid question.  I thought I would share a few of the practices that I have used over the last year to try and deal with the endless stream of e-mail that I receive.  Collectively, these practices obviously aren't enough to fully resolve my dilemma, but every little bit helps.

1. Unsubscribe.  I receive the usual marketing e-mail from dozens of companies, just like everyone else.  For the companies I have used or liked, I used to tolerate their e-mails thinking that I might find something interesting.  No longer.  Unless I actively receive value from the marketing e-mails, I unsubscribe.  If I don't have time to deal with all of my work or client e-mail, I don't need 10 different digests from LinkedIn groups that I never read.

2. No More Folders.  I used to fastidiously organize my e-mail into folders by client and sometimes by project.  I had dozens and dozens and dozens of folders.  Naturally, I thought I was just being organized, just in case I needed to find that one message for that one project.  I never realized how much time and mental energy I was expending on that useless task.  Based on the recommendation of a friend, I got rid of all of them.  Other than my Inbox, I now have one folder called "Read", which exists solely to get messages out of my Inbox.  Unless I have to specifically reply or somehow follow up on that e-mail, I glance at it, and it then goes straight to the one and only Read folder to get it out of my Inbox.  If I ever need to find an e-mail, I will either use the Outlook Search feature (not so great), or X1 Professional Client Desktop Search (excellent, but not perfect).

3. Quick Steps:  Outlook 2010 has a feature called Quick Steps (a rebranded version of what is available in earlier versions of Outlook).  It allows you to define a "macro" of sorts to perform organizational operations on your e-mail.  I created a Quick Step called "Read" that will send a message to my single Read folder when I press CTRL+SHIFT+9.  I got so sick of dragging messages to the folder using my mouse that I finally looked around and found that Quick Steps would let me do it with a keyboard shortcut.  Now I don't even have to touch the mouse to file messages with a vengeance.

4.  Categories:  I started using Outlook Categories to try and help me prioritize my e-mail.  Although Categories is a nice theoretical concept, it only works if you actually have time to deal with all of the e-mail that you have categorized.  I can categorize my messages, but when the e-mails arrive faster than I can clear out the categorized messages, it's a futile exercise and the categories just grow and grow.  I'll categorize 10 e-mails, but as I start doing actual work on one of the high priority items (or some other actual work), 10 more e-mails arrive, making my categorized list that much longer.  I end up only being able to deal with high priority e-mails, and the "Follow Up" and "Personal" categories just grow indefinitely and collect dust.

5.  Turn Off Notifications:  After installing Outlook, the first thing I do is turn off the annoying Outlook desktop notification messages that appear every single time an e-mail arrives.  What a horrible feature that is designed to distract people and keep them from doing any productive work.  I then disable the sound that Outlook makes when a new message arrives.  Now I only have the little envelope icon that appears in the system tray, and I'm considering disabling that.

6.  Define "Email Time" and "Email Free Time":  If I were to respond to emails as they arrive, I would be constantly interrupted.  Even if I just have to glance at an e-mail and file it away, it takes my attention away from the proposal, the code, the estimate, the query, the report, or whatever billable task I'm working on.  And if I start to reply to some messages, which is a bad habit I have, then before I know it, 5, 15, or 30 minutes slips by while my real work is left unattended.  Now, when I have to focus on certain tasks, I close Outlook entirely for several hours at a time.  I also close my web browser, just to minimize the opportunities for distraction.

7.  Smart Phone: When I got my first Blackberry, I thought that it would be handy to send or receive the occasional e-mail.  It was definitely handy, but one thing it made me realize was how much waiting I did when I was out of the office.  I realized that at a minimum, I could file away messages while in line for 2 minutes at Starbucks, or while waiting for my order at lunch, and especially while sitting at the airport.  Even if you don't compose any e-mail or actively deal with a task related to your e-mail, you can at least file away the messages that don't need to be cluttering up your Inbox, saving some time when you return to your computer.

8.  Spring Cleaning:  After several months of growth, my Inbox will be littered with old threads that I either took care of and forgot to file, or are now obsolete.  Occasionally I'll take time to sort messages by sender, quickly scan them, and then file them en masse, clearing out my Inbox to a handful of messages.  An empty Inbox is a great feeling.  After it's done, I e-mail a few colleagues to brag about how few messages I have in my Inbox--and of course they then reply a dozen times to clutter it up again.

I know there are many tools available for managing e-mail, and even more tips and techniques and preferences, but going back to my original point, when the volume of incoming e-mail exceeds a certain threshold, and when I have lots of actual work to do, it eventually overwhelms me.

How do you cope?

Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

Wednesday, September 28, 2011

Hidden Gem: Check Distribution Report

Do you need a report to show you the invoices that were paid by a particular check?  How about viewing the distributions associated with the invoices that were paid by a particular check?  These are fairly common requests, and often lead to a bit of head scratching for users.  We look in SmartList, and there really isn't a great option (unless you have SmartList Builder-- but even then, you have to create it) since the relationship between a check and invoices can be one to many (one check can pay many invoices).

So when I get this question, and the client doesn't own SmartList Builder or they want a solution that is quick and easy, I point them to the Check Distribution Report.

Reports-Purchasing-Check Information-Check Distribution
Click New Option

When setting up the report option, you can restrict the report to a range of Vendor IDs, Dates, or Check Numbers.  You can also select which distribution types to include--for example, only PURCH type distributions.

Check Distribution Report

The report displays each check followed by each invoice paid.  The distributions appear directly below each invoice that is listed.  This report can be quite handy when trying to track back checks to their associated invoices. 

I know that similar information can be displayed using SmartList Builder, but this report is a great option when you need a quick solution and/or you don't own SmartList Builder.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Adding Employer Tax Fields To Payroll Check Register

We had an issue pop up this month that was a bit puzzling on the surface.  When a client printed the payroll Check Register when the payroll posted, they were getting different results than when they reprinted the same report using Reports-Payroll-Reprint Journals-Check Register.

On the standard Check Register and Reprint Check Register reports, the Employer FICA taxes are totaled into one field called Employer FICA Owed.  However, on the client's reports, this amount was also broken out into two additional fields for Employer FICA/Social Security and Employer FICA/Medicare.  Given that these fields are not standard on the report, we knew (and confirmed through Microsoft Dynamics GP-Tools-Customize-Customization Maintenance) that the reports were modified.

In looking at these modified versions of the reports, we noticed that all of the fields matched with the exception of the Employer FICA/Social Security and Employer FICA/Medicare fields that were added to the reports.  But why would the fields be different if the same fields were added to the reports?  Well, that is the big IF.

So, I opened each report in Report Writer, and here is what I found.  There are a total of four fields that store Employer FICA information, two for Social Security and two for Medicare.  When working with the Reprint Check Register report, the fields are located in the Payroll Check History table (for the Check Register report, the table is Payroll Work Header): Employer FICA/Med Tax On Tips, Employer FICA/Medicare Withholding, Employer FICA/SS Tax On Tips, and Employer FICA/Social Security Withholding.

In the client's case, one report was modified to pull only the "withholding" fields while the other report was modified with calculated fields to combine the "withholding" field with the corresponding "tax on tips" field (e.g., Employer FICA/Medicare Withholding + Employer FICA/Med Tax On Tips).  And because the client had employees with tips, the amount displayed for each of the FICA fields was off by the FICA on tips.
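The arithmetic behind the discrepancy is simple, and a small sketch makes it obvious.  The variable names here are mine, but the calculation mirrors the calculated field described above: each displayed employer FICA amount should be the withholding field plus the corresponding tax-on-tips field.

```python
def employer_fica_displayed(withholding, tax_on_tips):
    # Correct calculated field: combine both source fields.
    return round(withholding + tax_on_tips, 2)

# Example: Medicare withholding of 100.00 plus 12.50 of tax on tips.
correct = employer_fica_displayed(100.00, 12.50)   # 112.50
wrong = 100.00                                     # withholding only

# The report that pulled only the withholding field is off by
# exactly the FICA tax on tips.
discrepancy = round(correct - wrong, 2)            # 12.50
```

With no tipped employees the two versions of the report would agree, which is why a modification like this can look correct for a long time before the difference shows up.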

So, lesson learned: if you want to add the Employer FICA/Medicare and Employer FICA/Social Security fields to the payroll check register reports, make sure you either display all four fields or create calculated fields to summarize them appropriately.

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.