Friday, May 28, 2010

Clearing Existing Field Values Using eConnect

Let's say that you have developed an eConnect customer import for Dynamics GP. It works great, importing new customer records quickly and efficiently. Because customers occasionally change their address, phone number, e-mail address, or other contact information, you set your customer import to use the UpdateIfExists flag to update existing customer records.

What's the catch with this technique?

Well, if you are using the eConnect serialization assemblies, you will find that setting a property to an empty string, such as rmCustomer.ADDRESS2 = "", will not clear an existing ADDRESS2 value from the customer record in GP.

Chris Roehrich has an excellent article on the Developing for Dynamics GP blog explaining why this happens and how to properly clear the field using the CDATA tag.

Chris gives a simple example of how you might serialize your eConnect XML, insert a new node for the field that you want to clear out, and then set the value of that node to a CDATA value.

It's a good example for demonstration purposes, but yesterday, when I realized I had to implement the CDATA trick, I wasn't keen on serializing the XML, and then manually inserting an arbitrary number of nodes back into the XML. I had at least 15 different fields that could be blank, and I didn't want to have to write statements to manually insert up to 15 different nodes into an XML document. Me, lazy? Guilty as charged.

After sleeping on it, I came up with an idea that I think is a more elegant solution. This trick is one that I learned back in 1994 while writing custom macro "programs" in Word Perfect 5 for DOS--when your tools are very rudimentary, you come up with some creative workarounds to meet your requirements. It is a technique that is also sometimes used with complex Regular Expressions, and it has come in handy several times over the years to make complex string problems much easier to solve.

Here's my thinking: Instead of using the eConnect serializer to output XML, and then manually inserting missing nodes back into the XML, why don't we let the serializer do its job and output all of the customer nodes? "Well, duh," you might say, "because if we leave the nodes blank, the serializer will not output them!"

True. But who said that we needed to leave them blank?

Instead of assigning an empty string to a property when the field value is blank, I shall instead assign a 'dummy' value to the property. In my project, I arbitrarily chose "~~~", because I'm pretty confident that my customer data will not contain 3 consecutive tildes. "^^^" or any other very unlikely combination of characters would work just as well.

So, I first set up a string with my dummy value, which I've called "clearValue", and also set up a string called "cdataValue".

I then add an if statement to my customer import routine, before the serialization process, that will replace empty properties with my dummy value.
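Something along these lines: a rough sketch, where rmCustomer is the customer node from the eConnect serialization assemblies (as in the example above), the exact cdataValue text follows Chris's article, and the source variables are hypothetical stand-ins for wherever your import data comes from:

string clearValue = "~~~";
string cdataValue = "<![CDATA[ ]]>";   // a space wrapped in a CDATA section, per the Developing for Dynamics GP article

// Substitute the dummy value for blank fields so the serializer still emits those nodes
if (String.IsNullOrEmpty(sourceAddress2))
{
    rmCustomer.ADDRESS2 = clearValue;
}
else
{
    rmCustomer.ADDRESS2 = sourceAddress2;
}

// ...and so on for each of the other fields that might need to be cleared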



I don't need to know, or want to know, anything about the specific node names or XML structures, and I don't want to create or insert or append any nodes. I just set the property values to my substituted version of an empty string.

Now that the "blank" values have been set, I serialize the XML to a string as I would normally do, and we get this stunning work of art:



After admiring that for a second, I then use the Replace function to replace all of the dummy values in my XML with the CDATA value.


transactionXML.Replace(clearValue, cdataValue);


Bam! All of my blank fields / nodes now have the CDATA value.



And when I send this lovely morsel of XML goodness over to eConnect, it does its magic and duly clears out any fields for nodes having the CDATA value.
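The hand-off is just the usual call into the eConnect .NET assembly; something like this, where connectionString and eConnCall (an eConnectMethods instance) are assumed to already be set up:

// Send the finished document to eConnect; eConnCall is an instance of
// Microsoft.Dynamics.GP.eConnect.eConnectMethods, and connectionString points to the GP company database
eConnCall.eConnect_EntryPoint(connectionString, EnumTypes.ConnectionStringType.SqlClient, transactionXML.ToString(), EnumTypes.SchemaValidationType.None);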

For any disbelievers out there, here is a screen shot before and after.





Works pretty slick, if I do say so myself...

Wednesday, May 26, 2010

Whozit, Whatzit, Forecaster?

When newcomers are learning Forecaster, whether they are consultants or end users, I think one of the hardest concepts to grasp is the relationship between Forecaster, users, data sources, and FRx. So I thought I would take a moment to break down each of these components and their relationship to each other. Let's start with a simple diagram to overview the applications and databases involved.


In the beginning there is a Forecaster database. Wait a sec. First, we have to create the database. So after installing Forecaster, you will see that the Forecaster Database Creation Wizard is also installed. This tool is used to create the empty Forecaster database. During the creation process you will be asked to identify the owner of the database. Users who are assigned to the DBO role for the Forecaster database can add periods within the Forecaster application, under Setup>>Segments>>Periods.


For data access, input, and reporting, we have the two options in blue: the Forecaster website and the Forecaster client. The Forecaster client offers some additional capabilities for administrators, so we often use this for the finance department, while the individual budget owners access Forecaster for input and reporting through the website. In both cases, the following are some key points to remember:



  • Data sources for both the website and the client are specified in the configuration of those applications, not in the computer's Control Panel data sources. For the client, the settings are entered when you launch the application and are stored in the Forecaster.ini for the user (see my other blog post on the Forecaster.ini).

  • To log in to Forecaster using either of these routes, you must be configured as a Forecaster user. This means that you have been set up under Setup>>Security in Forecaster, and then added as a SQL Server user with access to the appropriate Forecaster database (and assigned to the Forecaster and Public roles for that database) using SQL Server Management Studio. Both of these steps MUST be completed, and the login specified in both locations must match, or you will not be able to log in.

Okay, so now moving on to the right side and our option in pink, FRx Report Designer. This tool is primarily for reporting, but it also is used by the ExpressLink capability in Forecaster to pull GP data. A few points regarding FRx and Forecaster:



  • FRx uses the Dynamics GP data source to access GP data for reporting. You must configure the DirectLink functionality of FRx to pull Forecaster data for reporting. You do this by setting up each Forecaster company in FRx using a data source for the Forecaster server (you can specify the database in the FRx setup, so you do not need a separate Forecaster data source for each Forecaster database). This data source is set up in the computer's Control Panel under Administrative Tools>>Data Sources. This data source must use a user that has been properly configured (per my notes above) to use Forecaster. This is detailed further in the very thorough KB article on how to configure DirectLink.

  • In addition to being able to pull data for reporting in FRx, the ExpressLink functionality of Forecaster relies on FRx (the Forecaster client must be installed on a computer with FRx for ExpressLink to function) to pull data in to Forecaster. This wizard-based approach (Tools>>Import>>ExpressLink) allows you to pull actuals in to Forecaster for budget -vs- actual reporting. Why do budget -vs- actual reporting in Forecaster? Well, remember the Forecaster website? Now you can give your budget owners budget -vs- actual reporting easily, with no distribution process for you.

So those are the pieces of the puzzle as I see them. As always I look forward to your posts and questions. Have a great second half of the week :)

Friday, May 21, 2010

Buyer (or Surfer) Beware!

I have witnessed a few incidents in the past week that have reminded me of the power (both negative and positive) of the internet when it comes to troubleshooting and training on software in general, and Microsoft Dynamics GP in particular.

The first scenario involved me stumbling across the Summit Group Software Blog that just so happened to have a post on the recent 941 changes for US Payroll. Coincidentally, I had just seen an email chain internally regarding it and the blog provided some great information I could pass along. It sort of made my day, finding such useful information from a group I know and trust.

The second scenario involved a post that was forwarded to me by a coworker (it was originally forwarded to them by a customer). The post was on a message board that I won't name and contained a number of factual errors about Microsoft Dynamics GP. Looking through the comments on the post, most of the contributors took the post to be factually correct and expressed a range of negative views. Only at the end of the many, many comments was there a voice of reason trying to clear up the many mistruths and errors that had been propagated by this one post and the ensuing comments. Not me, but a fellow blogger whose identity I will protect :) I use this as an example, but this sort of thing pops up for me on at least a weekly basis-- either forwarded to me, or when I am doing my own research.

So, how do you sift through all of the information out there? Well, here are my thoughts; please share yours. And I want to add that these should be applied to everything you read (this blog included).
  • If the information is overly negative or overly positive, take it with a grain of salt. If the post is something that seems to be advocating one way or another, keep in mind that the author may have their own objective when posting.
  • Look for specifics, step by step examples, with the expected outcome. This shows you that the author has "done their homework" and isn't just working on assumptions.
  • Test, test, test. If someone puts forth an idea or issue, test it out. Even on the post I mention above, although I knew it to be incorrect, I tested it all out. It took 15 minutes, but I gained confidence in what was and wasn't correct.
  • Look for signs of credibility. This can be a tough one, but if someone is certified or an MVP, MCT, etc., you can have some confidence that the information they are putting forth is factually correct. If you subscribe to blogs or other newsletters, you will start to see the same names pop up as resources...Victoria, Leslie, Steve, David, Mark, etc...
  • Start with known sources like the User Guides, Training Manuals, and the KnowledgeBase on CustomerSource and PartnerSource. I know, I know, this information is not always perfect but it has been vetted by the experts and is a great starting place.

Feel free to share your own tips :)

Thursday, May 20, 2010

Another Deep Dive Now Available! Management Reporter!

Earlier I had posted about the What's New Deep Dives we have completed for Microsoft for Dynamics GP 2010. Happily we just posted another one on Management Reporter. Check it out! The magnificent Craig Anderson of MSX Group developed and presented this topic. When it comes to Management Reporter, Craig and MSX Group are THE source. So we were thrilled to have his subject matter expertise on board for this topic!

Wednesday, May 19, 2010

eConnect 10 SP4 "Access is denied" error

When trying to install eConnect 10 Service Pack 4 on a Windows Server 2008 R2 server, the service pack install started, but before it completed, I received the following error message:



Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))


Since this was on a Server 2008 machine, I immediately suspected that User Account Control was causing the issue.

So I tried right clicking on the service pack file so that I could select the deliciously omnipotent "Run as Administrator" option...but...

Heavens to Murgatroyd! The Run as Administrator option was not listed when I right clicked on the ".msp" service pack file. Lovely. I'd seen this before, but forgotten about it.

Fortunately, like all good security measures, there's no need to kick down the front door when there's a rickety screen door in back.

So I then found my good buddy the Command Prompt icon in the Start Menu and right clicked on it, and then selected Run as Administrator.



With my fancy new C prompt, I then did a 'cd' over to the directory containing the service pack, typed in the name of the msp file, and it finally obeyed. The installation went fine from there.

Tuesday, May 18, 2010

Dual Personality: Inventory Transfer Entry

What do you think of when I say "Inventory Transfer Entry"? If you are like most folks, you think of the window that allows you to transfer an item from site to site. But, actually, that window is Item Transfer Entry (Transactions>>Inventory). Inventory Transfer Entry is actually a project accounting window (Transactions>>Project) with a variety of uses.

If you use Project Accounting with Inventory, you are most likely familiar with this window as the next step after receiving an item associated with a project. The purchasing and inventory process for project accounting looks something like this:

  • Record purchase order for inventory item and associate it with a project and cost category, which increases the committed cost on the project budget.
  • Receive the purchase order, which decreases committed cost. Item is received in to inventory (not on to the project).
  • Transfer item from inventory to project using Inventory Transfer Entry window.

This process involves a bit of a learning curve for new users, since the receiving process receives the item in to inventory NOT on to the project. It is the inventory transfer transaction that moves the item from inventory to the project. For non-inventory items, the receiving process DOES post the item directly to the project.

But what if you do not use inventory? Why would you care about the Inventory Transfer Entry window? You can use it to "transfer" non-inventory items to a project. Look at this example:

  • Payables transaction entry used to record an expense for office supplies.
  • After posting the payable, it is discovered that the transaction should have been associated with a project.
  • Use Inventory Transfer Entry to "transfer" the non-inventory item to the project.

This can be very useful when "fixing" issues with costs that were not posted to projects properly.

Inventory Transfer Entry has two different transaction types, Standard and Return. Standard is used for the scenarios above, to transfer an item TO a project. A return can be used to transfer an item FROM a project (thereby reducing the project costs). This can be very useful when an item (inventory or non-inventory) is posted to the wrong project. Consider this:

  • Item 123 is a non-inventory item that has been received to Project ABC.
  • It is then discovered that it should have been received to Project XYZ.
  • Record an Inventory Transfer Return to remove the item from Project ABC.
  • Then record an Inventory Transfer Standard to add the item to Project XYZ.
  • Easy peasy fix!

Well, hope you enjoyed some ramblings on the Inventory Transfer Entry window in Project Accounting :) I just wanted to drive home the point that the window can serve a variety of purposes, not just the typical inventory item transfer scenario.

Monday, May 17, 2010

eConnect "System Error", Part D'eux

Back in January I wrote about some fun I was having with eConnect 9 and the dreaded "System Error" message. I must really, really like that error; it certainly seems to like me.

Back then I made sure that my eConnect calls had a SqlException error handler, and things worked fine. Until, of course, I got the System Error yet again on a recent integration with eConnect 9.

I actually developed the integration in GP 10, and it worked fine, but then realized the client was using GP 9, so I quickly made the simple changes required to convert it to GP 9. After the first successful test import, the error started to occur.

Staring at the error in disbelief (Why me???!!!!), I checked my try / catch block and confirmed that I did have the SqlException handler. I checked my document XML, I checked my connection string, I tried adding and removing different fields from my document, but nothing worked: System Error.

So then I copied the project over to a new server and tried it, and it worked! So then I was really puzzled, as the issue seemed machine specific.

Having run out of ideas, I went to check the PartnerSource support KB to see if there were any hints. And miraculously, I found the gem that is called KB 943133.

When I first started reading the knowledge base article and saw that it just mentioned including a SqlException handler, I started to get disappointed, since I already had that.

But, after reading the sample code in the article, I found something new.

Within a SqlException, there is a collection of SqlClient.SqlError error objects. When my code hit my SqlException handler, I was just outputting "ex.Message". I was not looping through the error objects to get all of the potential messages. The corrected handlers look like this:

catch (eConnectException ex)
{
    Console.WriteLine(ex.Message);
}
catch (System.Data.SqlClient.SqlException ex)
{
    foreach (System.Data.SqlClient.SqlError myError in ex.Errors)
    {
        Console.WriteLine(myError.Message);
    }
}
catch (System.Exception ex)
{
    Console.WriteLine(ex.Message);
}


The foreach loop over ex.Errors inside the SqlException handler was what I was missing.

Once I added this, the System Error disappeared and I got a meaningful error indicating that I had a duplicate primary key error. I had manually deleted some test records via SQL in my TWO database while testing, and I missed a table, so eConnect was failing due to a duplicate insert. Which is why the error only happened on the first server, but not the second.

Once again, this is pretty simple stuff that was actually posted on a few sources and widely available, but since the standard ex.Message had been working for me, I never bothered to look further at any other samples.

Since I can't recall running into the System Error with eConnect 10, my impression is that this is primarily an issue with eConnect 9. But now that I have the code setup properly, hopefully that solves the issue, permanently.

System Error, if I ever see you again, it will be too soon.

Friday, May 14, 2010

Making SmartList Export to Excel Faster

Patrick Roth has an interesting post over at Developing for Dynamics GP that discusses a nice undocumented and unsupported feature that allows you to export SmartList data to Excel much faster than normal. There is the potential drawback of losing some formatting, so make sure to test for that, but given the performance improvement, I'm guessing that re-applying some formatting in Excel will not be an issue.

This reminds me of a client that used GP 7.5 with Excel 2003 to export hundreds of thousands of rows from GP. Due to the 65,536 row limit in Excel 2003, I developed a simple VB app that would let the client export the data to multiple CSV files that each had no more than 65,000 rows.
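For the curious, the guts of that utility were just a chunked file copy. Here is a rough sketch of the same idea, in C# rather than the original VB, with hypothetical file paths:

using System.IO;

class CsvSplitter
{
    static void Main()
    {
        const int maxRowsPerFile = 65000;   // stay safely under Excel 2003's 65,536 row limit
        int fileIndex = 0;
        int rowCount = maxRowsPerFile;      // force a new file on the first data row
        StreamWriter writer = null;

        using (StreamReader reader = new StreamReader(@"C:\Exports\GPExport.csv"))
        {
            string header = reader.ReadLine();   // repeat the header row in every chunk
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (rowCount >= maxRowsPerFile)
                {
                    if (writer != null) writer.Close();
                    fileIndex++;
                    writer = new StreamWriter(string.Format(@"C:\Exports\GPExport_{0}.csv", fileIndex));
                    writer.WriteLine(header);
                    rowCount = 0;
                }
                writer.WriteLine(line);
                rowCount++;
            }
        }

        if (writer != null) writer.Close();
    }
}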

But if you are exporting more than a thousand rows from SmartList on a regular basis, I would encourage you to look into Excel Report Builder, listed quietly under the SmartList Builder menu in GP. It uses a more modern approach to GP data access with Excel that has some nice benefits for Excel reports you use regularly.

Thursday, May 13, 2010

Forecaster.ini...or maybe outie?

With a 4-month-old in the house, I can't resist a belly button joke :) I wanted to take a moment to share a little shortcut for managing Forecaster 7.0 applications. The installation of the Forecaster 7.0 client is a pretty simple affair: a few clicks of Next>Next>Next and you are done. Then you can apply the service pack and be ready to go fairly quickly.

And let's assume that you want to connect to existing Forecaster databases. So you launch the Forecaster client from Start>>All Programs. But there are no companies listed other than the Demo database. That is because the connection information is actually stored locally for each user. From the login window, you can click Manage Applications to enter the connection information (Name, SQL Server, and Database) for each company you wish to open in Forecaster. You can also test the connection from this window.

But, let's assume that you have 10 different Forecaster databases you want to use. And let's say that you have installed the Forecaster client on a terminal server, and there are 10 different users that will need to access it. That amounts to setting up 100 different connections since the connection information is stored locally for each user.

There is an easier way, though. Just copy the Forecaster.ini to the profiles of the other users. This file contains the connection information and can be found in
C:\Documents and Settings\<username>\Local Settings\Application Data\Microsoft\Forecaster
(for Windows Server 2008 or Windows 7, C:\Users\<username>\AppData). It makes managing the connections simple, and ensures that connection information stays consistent for all users.
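If you have a pile of profiles to touch on a terminal server, even a quick throwaway snippet (or a simple batch file) can push a known-good copy out to everyone. Here is a rough C# sketch, where the source path and the Documents and Settings profile root are hypothetical and would need adjusting for the newer C:\Users layout:

// Copy a known-good Forecaster.ini into each user profile on the server
string sourceIni = @"C:\Backup\Forecaster.ini";
foreach (string profile in System.IO.Directory.GetDirectories(@"C:\Documents and Settings"))
{
    string target = System.IO.Path.Combine(profile, @"Local Settings\Application Data\Microsoft\Forecaster");
    System.IO.Directory.CreateDirectory(target);   // no-op if the folder already exists
    System.IO.File.Copy(sourceIni, System.IO.Path.Combine(target, "Forecaster.ini"), true);
}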

A simple little fix, and I even make a backup copy to keep for new installations.

Wednesday, May 12, 2010

"Real Time" Dynamics GP Integrations

I've had several inquiries lately about how to create a "real time" integration between Dynamics GP and another system, and I've worked on several integrations over the last few months where clients wanted data integrated as quickly as possible.

I think there are at least three critical misconceptions around "real time" integrations, so I thought I would share some thoughts on the topic. By no means do I believe that I am "right" or that this is the only philosophy, but based on my experience and a few battle scars, I think my approach is pretty well grounded.

To give the discussion context, let's use an example. Acme Sales uses Dynamics GP, and a third party point-of-sale (POS) retail system. Dynamics GP is the system of record for all inventory items, and all item/site records. Whenever a user adds an inventory item to Dynamics GP, Acme would like to have that item automatically added to the POS system. Also, whenever a user adds an item/site record to Dynamics GP, Acme wants that information sent to the POS system so that the item can be sold at the appropriate stores.

During an initial meeting, the Acme sales manager tells the accounting and IT staff that as soon as either an item or item/site record is added to Dynamics GP, those records need to be transferred immediately to the POS system so that the item can be sold. The IT developer thinks "Okay, I'll create a real time integration between GP and the POS."

Naturally, the first thing the developer does is to put code on the Dynamics GP Item Maintenance window to detect when a new item is created. Simple, right? Just detect when the save button is clicked, and he can then immediately transfer the information to the POS. He then finds the Item Quantities Maintenance window and tries to put code there to detect when new Sites are assigned to Items.

He soon realizes that this approach may not work so well. Once a month, the sales department receives a list of 500-1000 new or changed items from one of their distributors. This list is imported into GP using an eConnect integration. Monitoring the Save event on the Item Maintenance window won't work, so he's not sure how he'll handle those items.

Next, while trying to put code on the Item Quantities Maintenance window, he realizes that the Site ID field is, well, not your average windows control. And the events to detect the Item + Site combination are more complex than he realized.

While testing his prototype, he realizes that the connection between the GP workstations and his development POS environment is not always 100% reliable. Sometimes the code on the GP window can't complete, causing the GP user to experience a long pause or hang, and sometimes the POS update fails with an error. Not only does this display a confusing error to the user, but it also means that the record saved to GP was "lost", and didn't make it to the POS system.

Realizing that the POS update might sometimes fail, he then embarks on an additional project that will synchronize the GP records with the POS records. And so he begins to slide down the steep and slippery slope of a "real time" integration.

Given this story as background, here is my interpretation of three critical misconceptions around "real time" integrations.

1) There is no such thing as a "real time" integration. This isn't a technical distinction, it's a business requirements distinction. Does Acme really need an inventory item, that took a user 60 seconds to enter into GP, transferred over to a POS system in less than 1 millisecond after the user clicks on the Save button? I seriously doubt it. If it took 1 second for the item to transfer to the POS, would that be acceptable? How about 30 seconds? How about 5 minutes? My point is that if you completely eliminate the Cartesian notion of "real time", you are left with a much more practical concept of latency. You will always have some latency in your integration--the question is how much latency is acceptable for your integration from a business requirements perspective? For the vast majority of businesses, I would argue that most integrations will work fine with at least 2-5 minutes of latency, and often much more. I prefer to use the term "near real time" integration, as it acknowledges that there will always be some latency.

2) A user interface is a terrible place to put integration code. When you are building a near real time integration, there seems to be a tendency to focus on the source of the data, capturing it as soon as possible so that it can be transferred. This leads developers to want to capture the data as it is being saved on a window. Give this a try in Dynamics GP and I think you will quickly see how difficult it is to reliably detect these types of events. What if the user clicks on the "X" to close the window and then clicks on Save on the resulting dialog instead of clicking the main Save button? What if the user clicks on the "next record" VCR button in the bottom left corner of the window and gets the Save dialog? You will learn that in Dynamics GP those two save dialogs can be different. And don't forget that sometimes the same data can be modified in two different windows, like the Customer Maintenance and Customer Address Maintenance windows! And as in the Acme example, what if data is inserted without using the user interface? How will you detect new records? Instead of looking to the user interface, focus on the destination: the SQL database. No matter what the user does in the user interface, the data all ends up in SQL Server in one or more tables. Table activity can be easily monitored using the eConnect Requester or custom triggers.

3) Synchronous integrations don't work. As soon as you develop a complex integration that assumes that an entire integration process can be completed successfully in sequence, you will learn that there are inevitably going to be interruptions to that process. What if the network is flaky? What if the remote server or database or web service is down? What if a password is changed or directory permissions change causing an authentication failure? What if the process takes 5 seconds to complete instead of a half second? In these situations, a synchronous integration can either fail or timeout, potentially causing all sorts of problems like forcing a user to wait, throwing error messages, or causing data to be lost. It is best to always use some type of asynchronous step in the process that can handle an interruption at some point in the integration. This typically involves some type of queue that can store the data until it can be reliably integrated. In the case of the integration to the POS system, maybe that would be a database table that serves as a queue to store new item numbers or item / site records that have been added to GP (such as the eConnect_Out table). Or maybe you write the data out to a CSV file that can then be read by an import process on the POS system. There are also formal queueing platforms, such as Microsoft Message Queue, but these are often not necessary or are overkill for most Dynamics GP integrations.
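To make the queue idea a little more concrete, here is a minimal sketch of the polling side, using a hypothetical POS_ItemQueue staging table (it could just as easily be the eConnect_Out table) and a placeholder SendItemToPOS call:

using System;
using System.Data;
using System.Data.SqlClient;

class ItemQueueProcessor
{
    // Hypothetical GP connection string and queue table names
    const string gpConnString = "Data Source=GPSQL;Initial Catalog=TWO;Integrated Security=SSPI;";

    // Called from a timer or Windows service on whatever latency the business can live with
    public void ProcessQueue()
    {
        DataTable pending = new DataTable();

        using (SqlConnection conn = new SqlConnection(gpConnString))
        {
            conn.Open();

            // Pick up rows that have not yet made it to the POS system
            using (SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT QueueID, ITEMNMBR FROM dbo.POS_ItemQueue WHERE Processed = 0", conn))
            {
                adapter.Fill(pending);
            }

            foreach (DataRow row in pending.Rows)
            {
                try
                {
                    SendItemToPOS((string)row["ITEMNMBR"]);   // placeholder for the actual POS call

                    // Only flag the row once the POS update succeeds; a failure just
                    // leaves it in the queue to be retried on the next polling cycle
                    using (SqlCommand update = new SqlCommand(
                        "UPDATE dbo.POS_ItemQueue SET Processed = 1 WHERE QueueID = @id", conn))
                    {
                        update.Parameters.AddWithValue("@id", row["QueueID"]);
                        update.ExecuteNonQuery();
                    }
                }
                catch (Exception ex)
                {
                    Console.WriteLine("POS update failed, will retry: " + ex.Message);
                }
            }
        }
    }

    private void SendItemToPOS(string itemNumber)
    {
        // The actual integration with the POS system goes here
    }
}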


Hopefully this discussion of these three misconceptions helps you understand what to avoid in your integrations, since getting past these common pitfalls will save you a lot of time and frustration. Once you get past them, there are many different design options that you can use to create some fast, robust, and low-maintenance integrations.

Monday, May 10, 2010

SQL Server APPLY operator

If you would love two scoops of yummy SQL Server query goodness, the APPLY operator is for you.

As I've said before, rarely a day goes by when I don't say "D'oh!" in surprise (and embarrassment) at being unaware of a tool or technology that I probably should have known about, but just didn't.

The SQL Server APPLY operator is one of those gems that I should have known about 5 years ago. I really need to start buying those "What's New In SQL Server" books, and actually reading them.

Imagine this Microsoft Dynamics GP scenario: You need to query vendors and include the vendor's PRIMARY address information, as well as their REMIT TO address information. But it's not quite that simple (it never is).

You find that the client has inconsistent vendor address IDs in their database. Sometimes they just use the address ID of "REMIT", sometimes "REMIT TO" and sometimes "REMIT ADDRESS", among others. And naturally, there are some vendors who don't have a separate REMIT TO address at all. In theory, some vendors may have two or more remit addresses, so you have to handle that situation as well--in which case you only want the TOP 1 remit address record.

While trying to figure out how to write such a query, I came across this article that discussed the OUTER APPLY operator.

Think of it as an OUTER JOIN, but without needing the ON keyword and subsequent matching columns from the two tables. It's also a bit like a subquery, but with the flexibility of a JOIN.

In short, it will make your inner SQL geek salivate.

Here is an example query that retrieves basic vendor information, along with the TOP 1 primary and remit to address records, allowing me to reference the fields in my query through a derived table.

Note that I used the OUTER APPLY version, since I want to return the vendor information even if there is no primary or remit to address. If you only want to return records that have a match in your apply operation, you can use CROSS APPLY, which works like an INNER JOIN. (hence the two scoops of yummy goodness)


SELECT
RTRIM(v.VENDORID) AS VENDORID,
RTRIM(v.VENDNAME) AS VENDNAME,
RTRIM(ISNULL(vpa.VNDCNTCT, '')) AS CONTACT,
RTRIM(ISNULL(vpa.ADDRESS1, '')) AS ADDRESS1,
RTRIM(ISNULL(vpa.ADDRESS2, '')) AS ADDRESS2,
RTRIM(ISNULL(vpa.CITY, '')) AS CITY,
RTRIM(ISNULL(vpa.STATE, '')) AS STATE,
RTRIM(ISNULL(vpa.ZIPCODE, '')) AS ZIPCODE,
RTRIM(ISNULL(vra.ADDRESS1, '')) AS REMITADDRESS1,
RTRIM(ISNULL(vra.ADDRESS2, '')) AS REMITADDRESS2,
RTRIM(ISNULL(vra.CITY, '')) AS REMITCITY,
RTRIM(ISNULL(vra.STATE, '')) AS REMITSTATE,
RTRIM(ISNULL(vra.ZIPCODE, '')) AS REMITZIPCODE,
RTRIM(ISNULL(vpa.PHNUMBR1, '')) AS PHONE1,
RTRIM(ISNULL(vpa.PHNUMBR2, '')) AS PHONE2,
RTRIM(ISNULL(vpa.FAXNUMBR, '')) AS FAX
FROM PM00200 v
OUTER APPLY (SELECT TOP 1 * FROM PM00300 pa WHERE pa.VENDORID = v.VENDORID AND pa.ADRSCODE LIKE '%PRIMARY%') AS vpa
OUTER APPLY (SELECT TOP 1 * FROM PM00300 ra WHERE ra.VENDORID = v.VENDORID AND ra.ADRSCODE LIKE '%REMIT%') AS vra


I told you it was tasty! Bon Appétit!


Here are a few more articles on the topic:

http://msdn.microsoft.com/en-us/library/ms175156.aspx

http://www.sqlteam.com/article/using-cross-apply-in-sql-server-2005

http://decipherinfosys.wordpress.com/2007/10/08/apply-operator-in-sql-server-2005/

Friday, May 7, 2010

Goodbye, VB

I have known you since we were both pups, clicking away on my Commodore 64, one key at a time. (And who doesn't love the GOTO command?? So efficient!)

We spent hours creating silly applications in high school programming class on those cheezy Apple computers.

Sure, I got friendly with C++ in college on that horrid Sun Microsystems server, but you paid the bills when that psych major needed a windows application written for his senior thesis, and when those professors called at 10pm asking for help reformatting their manuscripts with Word Perfect for DOS macros.

You were reborn with new utility in Microsoft Office VBA, where I used you to help a company write some insane macros to help run their business. You also shined when I used you in the customization tools for Solomon IV, back in the day when we were both new to ERP.

Together we went on to build n-tier distributed web applications, eCommerce integrations, and custom desktop apps. When I came back to the Dynamics GP world after many years, your last classic version and I were reunited with VB 6 eConnect integrations that processed thousands of transactions faster than your newer .NET incarnation.

I then held your hand patiently as you shuffled along in GP's Modifier with VBA, and you occasionally stumbled or mumbled. And I held my nose as I watched you stuffed unceremoniously in that horrible VB Script window in Integration Manager, with no debugger, no syntax highlighting, and no Intellisense. The shame!

I finally, belatedly, met the new you after your .NET makeover, enjoying your new neighborhood of the .NET Framework, while still enjoying the familiarity of your friendly With and ReDim statements.

But a funny thing happened. A friend said that there was something about C# that just seemed to make it easier to read. "Silly!" said I in retort. How could I possibly code faster or more efficiently or with more ease than with my pal, my buddy, my coding accomplice, good old VB?

I knew C++ and C# syntax, but why bother? They couldn't be better, just different. Just to show them, I dabbled in C#, writing a small GP VS Tools AddIn. What's the big deal? Sure, it works, but look how long it took me to remember to stop typing Dim and put the variable type in front! And seriously, semicolons are like white on white spread collars and bell bottoms! Every line? Can you not tell when a command ends?? Seriously???

So I tried another project, again to prove that there really was no difference. And then another.

And that's when things got really weird. When I came back to visit you to maintain some older code, things just didn't look right.

Sometimes your compiler let me leave off parentheses after a method, and other times it would add them on for me. Methods and properties became hazy. And why were your class constructors so odd? And then your code suddenly started to look messy, with all of those Dim and End If and As statements. It seemed like you had changed, or perhaps you just suddenly looked older to me.

Really it was me. After taking the red pill, things just weren't the same. Your code that was once simple and clear to me seemed convoluted and cluttered. C#, to my surprise, did seem cleaner and easier to read. It seemed more consistent and now made more sense. My cluttered code became simpler, more logical, cleaner, more efficient. Thinking in C# seemed to produce better classes, methods, and functions. Projects were being completed faster than I anticipated, with more code re-use and consistency across projects. Probably a lot of coincidences, but it seems that my friend was right--there is a difference.

And then I noticed that you have been relegated to the "Other Languages" option in Visual Studio 2008, which made me sad, but was perhaps another sign. And now I must move forward, alas, and leave you behind.

You've served me well, so I don't know what to say. I'm sure I'll be seeing you occasionally in code maintained for a few more years. And so I say goodbye, for now...

Coming soon! V++B#!

Wednesday, May 5, 2010

The Case Of The Pesky Window and Security

Maybe I am kidding myself when I think that everyone has been there. You define security in GP 10.0 (or GP 2010 for that matter), and it seems so simple. For example, the client wants to limit access to payroll data. So you use the standard roles, and you take away the Payroll Clerk role from all of them. All good, right? So you test by logging in as the user, and sure enough you still see a variety of Payroll windows. Argh.

The Payroll Clerk role definitely takes care of Payroll transactions and cards, but not setup, utilities, routines, reports, SmartList, or inquiries. Ugh. So you go hunting, and you find that the Accounting Manager role also has a variety of payroll-related windows. So you adjust to remove those tasks and/or the role from the users. All good, right?

Not so fast. Upon logging in, you find that there is one stubborn remainder...Inquiry>>Payroll>>Check History. Blah. So you review the roles one more time, and nothing obvious pops out at you. What to do?

Well, there is actually a pretty fabulous KnowledgeBase FAQ on security (you will need access to PartnerSource to view it). The FAQ contains a section on security resource descriptions. To populate the tables involved, you need to complete the following steps per the FAQ:
  1. Click Microsoft Dynamics GP, point to Maintenance, and then click Clear Data to open the Clear Data window.
  2. On the Display menu, click Physical.
  3. In the Series list, click System.
  4. In the Tables pane, click the Security Resource Descriptions table, and then click Insert.
  5. Click OK.
  6. Click Yes.
  7. In the Report Destination window, select the Screen check box, and then click OK to send the report to the screen.
  8. Close the report.
Once you have completed the steps above, you can run the following script in SQL Server Management Studio to locate the Security Roles and Tasks associated with a particular window. Note the Display Name placeholder in the last part of the script; you need to substitute the display name of the window (the name that appears across the top of the window in Microsoft Dynamics GP).

SELECT
    ISNULL(A.SECURITYROLEID,'') AS SECURITYROLEID,
    ISNULL(M.SECURITYROLENAME,'') AS SECURITYROLENAME,
    --ISNULL(M.SECURITYROLEDESC,'') AS SECURITYROLEDESC,
    ISNULL(O.SECURITYTASKID,'') AS SECURITYTASKID,
    ISNULL(T.SECURITYTASKNAME,'') AS SECURITYTASKNAME,
    --ISNULL(T.SECURITYTASKDESC,'') AS SECURITYTASKDESC,
    R.PRODNAME, R.TYPESTR, R.DSPLNAME, R.RESTECHNAME, R.DICTID, R.SECRESTYPE, R.SECURITYID
FROM DYNAMICS.dbo.SY09400 R
FULL JOIN DYNAMICS.dbo.SY10700 O ON R.DICTID = O.DICTID AND O.SECRESTYPE = R.SECRESTYPE AND O.SECURITYID = R.SECURITYID
FULL JOIN DYNAMICS.dbo.SY09000 T ON T.SECURITYTASKID = O.SECURITYTASKID
FULL JOIN DYNAMICS.dbo.SY10600 A ON A.SECURITYTASKID = T.SECURITYTASKID
FULL JOIN DYNAMICS.dbo.SY09100 M ON M.SECURITYROLEID = A.SECURITYROLEID
WHERE R.DSPLNAME = '<Display Name>'

So, so, so easy! I ran the script for the display name 'Payroll Check Inquiry' and received back a list of a handful of GL inquiry tasks that included the Payroll Check Inquiry window. Initially my reaction was 'Why would GL inquiry tasks include Payroll Check Inquiry?' But the more I thought about it, the more it made sense from a drillback perspective. Just not from a business perspective for this particular client, who does not want users (even GL-level users) to be able to view individual payroll check information. So, with the results of the script, I was able to eliminate those tasks for the appropriate users. Then I tested a final time, and was happy to report no payroll windows when I logged in!

I know, I know, I could avoid this by defining my own roles (and perhaps tasks), but in a simple situation like this I wanted to avoid the ground-up approach if possible. And the script above is just one way to troubleshoot security when you are using and/or modifying the standard roles (or even custom roles you create).

On a side note, as I was working on this issue in GP 10.0 this week, I already was missing the functionality in GP2010 that allows you to copy role assignments from user to user, and the view filters on the security tasks and roles windows to display only selected. Funny how quickly I become attached to new functionality :)

Deep Dive Webinars for GP 2010

I have been involved in developing some deep dive webinars on GP 2010 for MCT and Partner learning. These webinars focus on providing step by step guidance on configuring and using the new functionality in GP 2010. And, as an added bonus, all of these sessions were prepared and presented by Microsoft Certified Trainers (yay!). Check them out; hopefully they will help with your Partner readiness efforts for GP 2010!

Business Intelligence & Reporting (This one is mine!)
This demonstration-focused presentation will walk through the setup and use of new enhancements involving reporting and business intelligence in Microsoft Dynamics GP 2010. Topics include Microsoft Word Templates, Excel Report Builder, Drilldown Builder, the Reporting Ledger, Emailing Reports, and SQL Reporting Services Metrics Integration.

Foundations & Financial (Featuring my lovely coworker, Linda Brock)
This presentation will explore the new features found in GP 2010 in the foundation series, including security, home page, and navigation enhancements. Also included are explanations of the new features in General Ledger, Analytical Accounting, Payables Management, Receivables Management, Collections Management, and Encumbrance Management, including step by step demonstrations.

Distribution & Manufacturing (Starring the fabulous John Chastain)
Microsoft Dynamics GP 2010 offers a number of enhancements in the areas of distribution and manufacturing. This session will walk through these capabilities with demonstrations, with a focus on the improved Purchase Order Returns functionality.

Human Resource & Payroll (Another one presented by wonderful Linda Brock)
This Human Resources and Payroll Deep Dive webinar covers the new features added to the HRM Suite, including additions to Payroll, HR, Payroll Extensions, Advanced Payroll, PTO Manager, and Advanced HR. In addition to the new GP 2010 features, this webinar also dives into some of the existing HRM Suite features that may be underutilized, including Payroll Integration with Payables, Deductions in Arrears, Hours to GL, General Ledger Accruals, Advanced Labor Reporting, and Certifications and Licensing.

Extender (This one belongs to crazy smart Steve Endow!)
This Deep Dive presentation reviews Extender and discusses how it can be used to extend the functionality of Microsoft Dynamics GP by creating custom forms and windows without any programming. It includes demonstrations that show how to create new windows in Microsoft Dynamics GP that can capture new master records, additional information, and transactions, and how to view the custom data using custom SmartLists.

Workflow (A shout-out to the extraordinary Joe Tews of Summit Group Software!)
This Deep Dive presentation will peer into the new Workflows and types of Workflow available with Dynamics GP 2010. Explore the new Action workflow, and check out the new Action and Approval workflows associated with Workflow for Dynamics GP 2010 through a series of in-depth demonstrations.

Still to come: Management Reporter, as well as Business Intelligence and SharePoint for Microsoft Dynamics GP. Look for these to be added in mid-May. Happy learning!