Thursday, September 24, 2009
So BlueLock is hosting a 25-user GP system for an international company with offices around the globe, and they are looking to ensure that performance will be sufficient. GP is working fine, but there is one strange issue: some SmartLists take forever to run. A Sales Transaction SmartList might take 20 minutes to output 15,000 records. A GL Account list would take 5 minutes to display 10,000 records. But when the equivalent SQL query is run directly on the SQL Server, it runs in a second or two.
When I timed a few SmartLists on a physical SQL Server, my results were much, much faster than BlueLock's test times. Additionally, when BlueLock ran certain SmartLists, the server IO and CPU utilization would shoot through the roof, and sometimes GP would stop responding.
We then became concerned that perhaps there was some obscure issue with SmartLists in a fully virtualized environment.
After some head scratching, the BlueLock team looked again at the SQL Profiler trace results for one of the SmartLists. Instead of a single query, the SmartList was issuing many different queries, and even calling stored procedures. Reviewing the additional queries and stored procedures, they found that they all related to Analytical Accounting (AA).
They then tested two scenarios:
1) Remove any columns in the SmartList that relate to AA
2) Remove the AA module so that it isn't loaded when GP starts
Apparently removing the AA columns from the SmartList only had a minor effect on performance. However, removing AA from the Dynamics.set file resulted in a huge improvement.
Here are two stats they recorded:
SmartList with AA columns removed = 10K records in ~20 minutes.
SmartList with AA removed from Dynamics.set file = 10K records in ~25-30 seconds.
So, by removing Analytical Accounting from the mix, the SmartList went from 20 minutes to 25 seconds.
That's quite an improvement.
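For context, the Dynamics.set file is just a text file listing every product that loads when GP starts. From memory (so verify against your own file, and back it up before editing), removing a module means deleting its product ID/name pair, deleting its dictionary path lines, and decrementing the product count on the first line. A heavily abridged sketch, with illustrative paths:

```
2
0
Microsoft Dynamics GP
3180
Analytical Accounting
Windows
:C:Program Files/Microsoft Dynamics/GP/Dynamics.dic
:C:Program Files/Microsoft Dynamics/GP/Data/FORMS.DIC
:C:Program Files/Microsoft Dynamics/GP/Data/REPORTS.DIC
:C:Program Files/Microsoft Dynamics/GP/AAcore.dic
:C:Program Files/Microsoft Dynamics/GP/Data/AAFORM.DIC
:C:Program Files/Microsoft Dynamics/GP/Data/AARPRT.DIC
```

To stop AA from loading, the `3180` / `Analytical Accounting` pair and its three dictionary lines would go away, and the `2` on the first line would become `1`.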
Tuesday, September 22, 2009
If you are technically inclined but haven't registered yet, it's not too late!
It is a trek and a sizeable investment of time and money, but since I haven't attended any of the major conferences in the last few years, I figure it makes sense to attend this one, which will hopefully focus on the technical topics that I work with daily. It looks like there will be several interesting sessions.
Based on a few recommendations, I have opted to stay at the Fargo Wingate instead of the MS-recommended Ramada or Holiday Inn. It's only $69 a night for the frugally inclined, but I'll be getting a rental car--which is fine, as I'll then be able to venture out and experience the culinary delights that Fargo has to offer.
And, I admit, I have been looking for an excuse to cram into a Canadair jet and visit Fargo! In November. Average high: 35 F; average low: 19 F. That qualifies as cold in my book.
Friday, September 18, 2009
For most of the RDP connections to my virtual servers, I choose to have them run full screen on my second monitor. I like to be able to use ALT+Tab and not have to memorize the RDP equivalent keyboard shortcuts for the non-full screen windows.
But I do have a few RDP connections to my physical servers that I don't like running full screen for various reasons. I liked the option to use 1152 x 864 on my 1680 x 1050 monitors, as it doesn't fill my screen, but gives me enough real estate to still launch a 1024 x 768 window.
Well, I recently upgraded both my laptop and desktop to Windows 7. One of the quirks I've found is that for some reason, I can no longer select the 1152 x 864 size for my remote desktop window. I'm sure there's some explanation having to do with video card drivers or monitor INF files, but I figured there had to be a simple way to get my RDP window sizes to be the same as before.
So I decided to open the RDP file in Notepad, and sure enough, I saw that there were some pretty self-explanatory parameters in the file: desktopwidth, desktopheight, and winposstr. Width and height were easy, but after adjusting the winposstr parameter, I found that there was something else going on, as I couldn't get the window to display properly (without the scroll bars).
Sure enough, plenty of people have already dealt with this issue, and I found this post explaining how to pad the window position parameters so that they work properly.
With this knowledge, I can now produce an RDP window with any size that I want--even a window that has a height greater than its width if you want to get fancy. I can also now specify the exact location that I want the window displayed, which is handy as well.
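For reference, here is a sketch of the relevant lines in a .rdp file. The winposstr values are illustrative: after the first two fields, the numbers are roughly left, top, right, and bottom, and the right/bottom values need a few extra pixels of padding beyond the desktop size to account for the window borders and title bar (the exact padding varies by Windows version and theme):

```
screen mode id:i:1
desktopwidth:i:1152
desktopheight:i:864
winposstr:s:0,1,100,50,1268,954
```

Here `screen mode id:i:1` forces a windowed (non-full-screen) session, and the window is positioned at (100, 50) with padding of roughly 16 pixels of width and 40 pixels of height beyond the 1152 x 864 desktop.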
Thursday, September 17, 2009
GP calculates overtime HOURS in a fairly simple fashion. You can specify a threshold for the pay period, and hours that exceed that threshold can automatically be treated as overtime. Okay, fine. So let's look at the practical implications of that.
- Biweekly Pay Periods
- Automatic Overtime threshold set to 80 hrs
That works fine if the employee works 82 hrs in the pay period: the 2 hrs over the threshold are automatically treated as overtime. But what if the employee works only 80 hrs in the pay period, and that included one day that was 14 hrs long? In most states/jurisdictions, overtime is judged on a daily basis and a weekly basis (or sometimes even on a duration basis, meaning that if a single shift exceeds a certain span of hours, even crossing days, it is overtime), but rarely on a pay period basis (in my experience). GP does not recognize these measures, only hours per pay period.
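To make the gap concrete, here is a quick sketch (not GP code; the function names and the 8-hour daily rule are illustrative assumptions) comparing GP's pay-period threshold against a daily-basis calculation:

```python
# Sketch: GP-style pay-period overtime vs. a daily-basis calculation.
# The 8-hour daily threshold is an illustrative assumption; actual
# daily/weekly rules vary by state and jurisdiction.

def period_overtime(daily_hours, threshold=80):
    """GP-style: overtime = total hours over the pay-period threshold."""
    return max(sum(daily_hours) - threshold, 0)

def daily_overtime(daily_hours, daily_threshold=8):
    """Daily-basis: overtime = hours over the threshold on each day."""
    return sum(max(h - daily_threshold, 0) for h in daily_hours)

# Exactly 80 hours in the biweekly period, but one 14-hour day:
days = [14, 2, 8, 8, 8, 8, 8, 8, 8, 8]
print(period_overtime(days))  # 0 -> GP's threshold sees no overtime
print(daily_overtime(days))   # 6 -> a daily rule sees 6 OT hours
```

Same 80 hours, two very different answers, which is exactly the problem with judging overtime only at the pay-period level.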
So what to do? Yes, there is Overtime Rate Manager. But that only deals with the calculation of the rate used to pay the overtime, not the calculation of how many hours are to be paid. Every time a customer mentions overtime, I cringe, to be honest :) And I do the requisite third party product search. During that search, inevitably an ISV says "that would be a great product, but no, we don't have something that does that".
So it seems that I am left hoping that the client's timeclock/timekeeping/timecard system will address the overtime calculation (and many do)--but what about those clients that do not use a third party time solution?
Oh, and yes, there is customization of course. But I am still holding out for a simple solution :) Anyone want to comment and save me?
Monday, September 14, 2009
1. Document Default
2. Line Item Default
3. Tax Schedule Comparison
So, step one works pretty much the same as it did in our earlier post. However, we need to take it a step further by considering the inventory site, as in the following Sales Order Processing examples (assuming you are registered for Inventory):
- Shipping Method Delivery: Default tax schedule assigned to Customer's Ship To Address
- Shipping Method Pickup: Default tax schedule assigned to the Site
Now, although I call this a "document" default, it is actually stored by line item, because Shipping Method and Site can vary by line item in Sales Order Processing and Purchase Order Processing.
So, what about Step 2? Well, Step 2 takes into consideration the inventory item card (or the Sales Order/Purchase Order Setup when dealing with non-inventory items). There are three possible settings for inventory items for Sales and Purchase taxes:
- Nontaxable: No tax is ever charged for this item
- Base on Customer/Vendor: Tax is charged according to the tax schedule that defaults from the shipping method
- Taxable: Specify a tax schedule that represents all possible tax details to be charged on this item, more on this later...
So, let's work through a couple examples. First, let's look at the Base on Customer setting. Let's assume that we have entered an invoice for our customer ABC AUTO. They want all items to be shipped to them (Shipping Method- Delivery) at their Missouri location which is assigned to the tax schedule MOTAX which includes the MOCITY and MOSTATE tax details.
Let's say that ABC AUTO buys two items from us, the first item is a gift basket, BASKETOFUN. BASKETOFUN is set with the sales tax option of Base on Customer. So what does that do? If we were to click the item number expansion arrow for this line in Sales Transaction Entry, we would see the following information being stored in the Sales Item Detail Entry window:
- Shipping Method: Delivery
- Ship To Tax Schedule: MOTAX
- Item Tax Option: Base on Customer
- Item Tax Schedule: n/a
Therefore, in this example, all details in MOTAX would be calculated. Now, let's shake it up and say that this particular line item changed to a Shipping Method of Pickup. Let's say that this line item is being sold from our NORTH site with the tax schedule NYTAX. We would then see the following in the Sales Item Detail Entry window:
- Shipping Method: Pickup
- Site Tax Schedule: NYTAX
- Item Tax Option: Base on Customer
- Item Tax Schedule: n/a
So, in this scenario, what taxes would be calculated? All details in the NYTAX schedule would be calculated. This scenario sometimes confuses users, because the item is set to Base on Customer. But what that really means is that the item's taxability is based on the Shipping Method itself, and on how the tax schedule defaults from it.
But, wait, I said that we sold two items right? So let's look at the next item that ABC AUTO bought. They bought our new virtual gift basket, VIRTUALBASKET. Now, some states tax the virtual gift basket, and others do not. So we have set up the VIRTUALBASKET with a sales tax option of Taxable. We have then assigned a special tax schedule to it called VIRTUAL, which includes the taxes that are calculated on virtual products. In our case, let's say that it includes the MOSTATE, KSSTATE, and COSTATE tax details.
If we look at the Sales Item Detail Entry, we would see:
- Shipping Method: Delivery
- Ship to Tax Schedule: MOTAX
- Item Tax Option: Taxable
- Item Tax Schedule: VIRTUAL
Which taxes would be calculated? Only those that are common between MOTAX and VIRTUAL, in this case the MOSTATE detail would be calculated.
But, let's mix it up and say that the customer has asked that this item be shipped to their California location, which is assigned to the CATAX schedule (which includes the CASTATE and CACITY tax details). So, let's look at the Sales Item Detail Entry again with these details:
- Shipping Method: Delivery
- Ship to Tax Schedule: CATAX
- Item Tax Option: Taxable
- Item Tax Schedule: VIRTUAL
Which taxes would be calculated in this scenario? None, because the CATAX schedule does not have any details in common with VIRTUAL. This is correct, since we said that our item was only taxable in Missouri (MOSTATE), Kansas (KSSTATE), and Colorado (COSTATE), not in California.
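The comparison logic in these examples can be boiled down to a few lines. Here is a sketch (not actual GP code; the function name is mine, and the schedule contents follow the examples in this post):

```python
# Sketch of the tax schedule comparison described above.
# Schedule contents match the examples in this post.
MOTAX   = {"MOCITY", "MOSTATE"}
CATAX   = {"CACITY", "CASTATE"}
VIRTUAL = {"MOSTATE", "KSSTATE", "COSTATE"}

def taxes_to_calculate(default_schedule, item_tax_option, item_schedule=None):
    """default_schedule is the ship-to (Delivery) or site (Pickup) schedule."""
    if item_tax_option == "Nontaxable":
        return set()                             # no tax is ever charged
    if item_tax_option == "Base on Customer":
        return default_schedule                  # all details in the default schedule
    if item_tax_option == "Taxable":
        return default_schedule & item_schedule  # only the common details
    raise ValueError("Unknown tax option: " + item_tax_option)

# BASKETOFUN, shipped (Delivery) to the MOTAX address:
print(sorted(taxes_to_calculate(MOTAX, "Base on Customer")))  # ['MOCITY', 'MOSTATE']
# VIRTUALBASKET, shipped to the MOTAX address:
print(sorted(taxes_to_calculate(MOTAX, "Taxable", VIRTUAL)))  # ['MOSTATE']
# VIRTUALBASKET, shipped to the CATAX address:
print(sorted(taxes_to_calculate(CATAX, "Taxable", VIRTUAL)))  # []
```

The key step is the set intersection on the Taxable branch: only details common to both schedules get calculated, which is why CATAX plus VIRTUAL yields no tax at all.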
So, the Taxable option for items provides the most flexibility when some items are taxed only in certain states while others are taxed in all states (or in different states). I have found this comes into play quite often with technology products, including software downloads, where some states are more aggressive in taxing these items than others.
Clear as mud? Share your questions, hints, etc. and I am happy to update the post--from Alaska, where I am this week teaching :)
Tuesday, September 8, 2009
If there are any readers out there who are "GP Developers", I'd like to hear what languages you use to customize or integrate with Dynamics GP. Specifically, do you use VB.NET, or C#?
I had used Visual Basic since the mid-1990s, so VBA and VBScript support in GP didn't require any transition for me, except to learn the quirks and proprietary objects, methods, and events. But I procrastinated with .NET, and finally made the transition in 2007 (I know, I know). Naturally, I just moved to VB.NET, which allowed me to focus on .NET, and not worry so much about language syntax. I think that was probably wise, as I'm constantly reminded of how amazingly expansive the .NET framework is--sometimes it can be tough just navigating through it without having to worry about language syntax.
However, in college I spent a year learning and developing with C++ and have studied C#, so the language and syntax shouldn't be terribly challenging, in concept. (I'm sure I'll accidentally type Dim quite a bit at first.)
As I've been getting help from my friend Lorren to better understand practical Object Oriented (OO) application architecture, and how to best design OO apps in .NET, I'm seeing that besides a few minor syntax and language benefits that I've known about, there are some fundamental benefits to C# that probably make well designed OO code more straightforward.
Most recently, I discovered Class Constructors in VB.NET. They've always been there, but until Lorren explained them to me, I had never used them. My interpretation is that VB.NET essentially hides the constructor from you, which can be convenient. But the downside is that when you want a constructor, you have to do silly things like declare "Public Sub New()" in your class, which is not terribly intuitive. We also discussed how VB Modules appear to provide the same functionality as a Static Class in C#.
So my very nascent understanding is that VB had to create workarounds in the name of convenience or simplicity, and those workarounds result in some potentially less-than-optimal implementations, or implementations that obscure the mechanics of .NET to the developer.
Aside from the specifics of the language itself, one of my concerns around developing in C# has been what I perceive as a relative shortage of C# developers vs. VB developers. If there are benefits of developing in C#, but I have a harder time finding a contractor to help me modify or write code, the benefits may not be worth it. So far, all of the GP developers I've spoken with use VB.
I'm sure there are plenty of C# developers on the planet, but are there many Dynamics GP developers who are proficient in C#?
I don't know.
If you code for Dynamics GP, what language do you use? Or if you know GP developers or GP partners or GP ISVs that do GP development, what language do they use?
Friday, September 4, 2009
No??? So it IS only just me??? (blush)
I discovered the web site several years ago when I was working on a pretty challenging project. My client was receiving detailed billing data from a vendor and had asked me to develop a routine to validate the transactions against data in their database before importing the voucher into Dynamics GP. The challenge was that the vendor was sending over 250,000 records each week, and the single table in the client's operational system database was 2 terabytes.
So there I was, trying to do a JOIN of 250,000 records against a 2TB table. Not pretty. My initial query would just run, and run, and run, while hogging CPU and memory on the server. That's when I stumbled on the SQL Server Performance web site. I optimized my JOINs, I added more WHERE clauses, tweaked my indexes, and I learned all about "non-sargable" WHERE clauses. My eternal query eventually took less than 10 seconds to run, making me quite proud. ;-)
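The "non-sargable" idea is easy to demonstrate with any SQL engine. Here is a minimal sketch using SQLite from Python (the table and index names are made up for illustration): a leading-wildcard LIKE can't use an index and forces a scan, while a range predicate on the same column can seek into the index.

```python
# Demonstrate sargable vs. non-sargable predicates with SQLite's
# EXPLAIN QUERY PLAN. Table/index names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vendors (vendorid TEXT, vendname TEXT)")
conn.execute("CREATE INDEX idx_vendorid ON vendors (vendorid)")

def plan(sql):
    # Collapse the EXPLAIN QUERY PLAN rows into a single string.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Leading wildcard: non-sargable, so the index cannot be used.
scan_plan = plan("SELECT * FROM vendors WHERE vendorid LIKE '%ACE%'")

# Range predicate: sargable, so the engine seeks into the index.
seek_plan = plan("SELECT * FROM vendors WHERE vendorid >= 'ACE' AND vendorid < 'ACF'")

print(scan_plan)  # a SCAN of the whole table
print(seek_plan)  # a SEARCH using idx_vendorid
```

The same principle applies in SQL Server: `LIKE '%ACE%'` scans, while predicates that can be resolved against the leading edge of an index (equality, ranges, `LIKE 'ACE%'`) can seek.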
Since then, I've always been aware that whenever I write a complex SQL query, there is usually a different, and likely better, more efficient way to write it. If the query is only run once a week and it takes 10 seconds, that may be adequate and no further optimization may be necessary. But if the query is run 50,000 times a day, even a sub-second query may benefit from further optimization.
One of the biggest learning curves I had was how to analyze the performance of my queries and determine which part of the query was inefficient. I found a book on SQL Server query optimization that taught me the "Old School" method, using the SET SHOWPLAN_TEXT ON statement. When you run this statement, and then run your query, SQL Server will display the execution plan instead of your query results.
For example, these statements:
SET SHOWPLAN_TEXT ON
SELECT COUNT(*) FROM PM00200 WHERE VENDORID LIKE '%ACE%'
Would display this result:
|--Index Scan(OBJECT:([TWO].[dbo].[PM00200].[AK6PM00200]), WHERE:([TWO].[dbo].[PM00200].[VENDORID] like '%ACE%'))
(Just make sure to run SET SHOWPLAN_TEXT OFF when you are done.)
I eventually got past the arcane presentation of the text plan information, learned how to review the operations being performed, and found that I pretty much needed to look at the inner-most (most indented) operation to see where I had opportunities to optimize my query.
On the other hand, most people use the more modern and convenient "Display Estimated Execution Plan" feature in Query Analyzer and SQL Server Management Studio. This feature displays a graphical representation of the query plan and includes much more information and statistics on each operation. If you highlight a query in Management Studio, you can use the CTRL+L shortcut to show the execution plan.
And if you hover your mouse over one of the steps, you will get more detailed information about the operation.
It isn't always obvious or easy to identify which operations can benefit from optimization, but a good approach is to start with the step that has the highest cost and research whether there is a different way to perform that query step.
I learned that query optimization was much more of an art than a science, requiring situation-specific analysis, and consideration of the environment, database, and tables that are involved with the query. But based on my experience, my general rule is that if you have a query that is taking longer than 10 seconds to execute, it is likely that there are opportunities for improving its performance.