Thursday, April 30, 2009

Excel Report-o-rama

I recognize that much of what I am about to say may be common knowledge to many of you out there. But as my story below outlines, sometimes we all need a little continued reinforcement :)

There are always underutilized features in new releases, but I think that Excel Reports has to top the list in Dynamics GP 10.0. I deploy them and train on them, but haven't found clients adopting them as I thought they would. Little did I know, I just had to be patient.

I remember when I was first learning Great Plains, I would routinely look for a report whenever someone asked me for information. My coworker would constantly remind me that I could look in Smartlist, too, and that maybe it would have the information that was needed. Eventually, I learned to adjust my thinking: now when someone asks for information, I tend to look at Smartlist first, then Reports, and then the myriad of tools available to develop custom reports. I think a lot of users go through the same process of adjusting their thinking, and need to be reminded of new functionality when the opportunity arises to practically apply it.

Excel Reports is a great example of this. Just in the past two weeks, I have had more than a couple of situations arise where Excel Reports were the answer. It just took the reminder that they were there. We didn't have to build something custom, worry about licensing and training, set up a data connection ourselves, or use a static Smartlist.

And here is the other bonus of Excel Reports...views. Excel Reports are built on a great many SQL views (like AccountTransactions), which can be used for SQL reports as well. So now I have another tool to add to my toolbox. Why reinvent the wheel? Just today I was able to leverage an existing view and save the client time and money.
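For example, once the Excel Reports have been deployed, a SQL report can select straight from one of the views. A minimal sketch, assuming the views were deployed to the Fabrikam sample company (TWO) database:

SELECT * FROM TWO.dbo.AccountTransactions

Any tool that can run a SQL query can use these views, just as the Excel Reports do.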

The same can be said of security: the security roles that are created control access to the views for SQL reports as well as for the Excel Reports. Remember, Excel Reports can be accessed and refreshed by non-Dynamics users as well.

Here is an excellent FAQ on Excel Reports. Included in the FAQ is a link to the SQL security roles that are created, and how they map to the views and Excel Reports.

Happy weekend everyone, today is my Friday! :)

Monday, April 27, 2009

Automatically post batches in Dynamics GP: Post Master by Envisage Software

UPDATE:  Envisage Software has released Post Master Enterprise, a new product that runs as a Windows Service, allowing it to post Dynamics GP batches without requiring the GP client to be running or logged in.

I recently developed an eConnect SOP integration that needed to import up to 40,000 orders a day into Dynamics GP from an e-commerce web site.

At first this seemed like it might be a challenge due to the sheer order volume, but based on some tests I performed with sample data, I found that eConnect was able to import over 500 transactions per minute. This import speed would be more than adequate to handle 40,000 orders throughout a day.

But there was a second challenge. Although I could easily import 40,000 orders into GP throughout the day, the client would have to post those order batches in GP, and posting a SOP order in GP takes quite a bit longer than importing it with eConnect. Additionally, because the orders were coming from an existing operational system, they did not need to be reviewed in GP prior to posting, so there was no need for manual review or posting.

To accommodate the posting of the SOP batches, I searched for an auto-posting solution for Dynamics GP. I found a handful of auto-post add-ons for GP, but the one that impressed me the most was Post Master by Envisage Software.

To my surprise, Post Master is written entirely in .NET and does not install a custom Dex dictionary--just a few DLLs in the AddIns folder. Andrew Dean, the developer of Post Master, explained that he was able to push Visual Studio Tools to its limits, allowing him to manage the entire posting process without writing any Dex code.

Because it is written in .NET, one great feature is that an icon is displayed in the system tray, allowing you to see that Post Master is running, see the agent status, and start or stop the agent.

As someone who has become accustomed to the 'self-contained' nature of GP development, I thought the system tray icon was a brilliant idea, as it allowed me to assess the status of the posting process with a quick glance at the system tray, without having to open menus or windows within GP.

A second key feature of Post Master that makes it shine is the control it gives you over posting reports. For each batch source, you can select which posting reports to generate, have each report saved to a file or e-mailed to a user, or turn off any of the posting reports for that batch source.

Another key feature is for GP customers that are looking to automate posting for different types of batches, whether imported or manually entered: Post Master offers an Auto-Detect feature that will automatically post batches based on the batch source and batch ID filter that you specify.

So if you have a client that wants accurate data throughout the day, doesn't want batches piling up, and doesn't want to have to check the series or master post windows regularly, Post Master will automatically process batches as they are created throughout the day.

Finally, Post Master includes a key feature that will be highly valued by anyone who has had to support an automated or scheduled integration or process in GP. It has a separate monitor agent that will detect if the GP posting process has stopped responding, or if Post Master is unable to perform the posting process, and will notify a system administrator via e-mail.

There are several other very nice features that round out the product, but I wanted to highlight these key benefits that I thought were most compelling.

If you have a client with high transaction or batch volume, or a client that has to regularly sit around and wait for large batches to post, I'd definitely recommend checking out Post Master.


Update:  Given the success of Post Master for Dynamics GP, I am now working with Envisage Software to resell Post Master in North America.  If you have any questions regarding Post Master or would like a trial version, you can contact Envisage Software Solutions, or contact me directly.


Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional.  He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

http://www.precipioservices.com

setup.exe: Windows cannot access the specified device, path or file.

While trying to install a .NET eConnect integration on a client's test server, I ran into this error message when I tried double clicking on setup.exe:

"Windows cannot access the specified device, path or file."

Neither the setup.exe nor the MSI install file would launch.

Because the error is so generic, it was difficult to know where to look for the cause.

The user was logged into the server as a local Administrator, and the setup.exe was in C:\Temp, so it was difficult to believe that it was a file or directory permission issue. As a test, I tried copying the setup files from the file server to the desktop, but that didn't work either.

After Googling the error for a few minutes, I finally came across this article with the solution.

It seems that Windows Server 2003 has a feature that, under some specific set of circumstances, "blocks" a file that has been copied from another machine. I've copied plenty of files from one machine to another and never run into this issue (and never even knew the "block" feature existed), so I'm not sure what specific situations or conditions cause the file to be blocked.

But, sure enough, when I right clicked on the setup.exe file, at the bottom of the General tab I saw a message indicating that the file was being blocked. I clicked on the Unblock button for both the setup.exe and MSI files, and the install proceeded without issue.
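For what it is worth, this 'block' flag is stored in an NTFS alternate data stream named Zone.Identifier, which Windows attaches to files that arrive from another security zone. If you would rather script the fix than click Unblock on each file, the Sysinternals streams utility can delete the stream; a sketch (verify the switches against the current version of the tool):

streams -d setup.exe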

Monday, April 20, 2009

Things I love about SQL Reporting Services

So while Steve has been prolific in his posts over the past month, I have to admit I have been more than a little deficient. I could blame travel (this makes week 5 in a row, albeit small trips) or workload (highly unlikely), but if I were honest I would blame the Spring sunshine for keeping me outside instead of at the computer :)



I have had SQL Reporting Services on the brain lately, as I have been working on a few client projects, and have also been prepping to teach "Using Microsoft SQL Server Reporting Services with Microsoft Dynamics GP 10.0" for partners next week in Seattle. Although I taught the first run of the class last May, enough time has passed that I wanted to work through the labs again and refamiliarize myself with the layout of the class.



Of course, every time I do this I am reminded of things that I do not use regularly in SQL Reporting Services, but should. There are some fabulous SQL Reporting Services blogs/sites out there including one of my favorites...


SSW Rules to better SQL Reporting Services

So, I thought I would add to the myriad of sources out there with my top five SQL Reporting Services tips (underused by me, of course):


Me.Value

  • Used to quickly reference the current field in a formula, for conditional formatting, visibility, etc. For example: =IIF(Me.Value > 0, false, true)
  • Saves so much time, and avoids spelling mistakes!

Report Items

  • To put data fields into page headers or page footers, you can use the ReportItems collection to reference the text box name.
  • For example, say you put the document number into the body of an invoice report. Right-click on the field and choose Properties. Note the name (for our example, "TextBoxName"), then drag a text box onto your page header/footer and enter the following: =ReportItems!TextBoxName.Value

Cascading Parameters

  • So, you have a report that you want to be able to restrict by sales territory and salesperson. But you want to make sure that the salesperson parameter list only displays the salespeople for the sales territory parameter selected. What to do? Set up cascading parameters!

Create two new datasets on the Data tab:

  • SELECT * FROM RM00303 (Sales Territory Master)
  • SELECT * FROM RM00301 WHERE SALSTERR=@TERRITORY (Salesperson Master, with parameter)

Your original dataset should have the parameters @TERRITORY and @SALESPERSON defined in its WHERE clause (see the sketch below). Make sure they are spelled the same way.
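A minimal sketch of what that main dataset might look like; I am using the Customer Master (RM00101) purely as an illustration, so substitute your report's actual tables:

SELECT CUSTNMBR, CUSTNAME, SALSTERR, SLPRSNID
FROM RM00101
WHERE SALSTERR = @TERRITORY AND SLPRSNID = @SALESPERSON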

Now, you need to define the parameters: Report>>Report Parameters

  • Make sure that the parameters are listed in dependent order, @TERRITORY should be first.
  • For each parameter, set Available Values to From Query and select the corresponding data set, value field, and label field.

Run your report, select the first parameter and notice the second parameter is now a restricted list. For more information on cascading parameters, click here.


Download the Dynamics GP SQL Report Source Files

  • You can download the Visual Studio project files for the Dynamics GP SQL Reports to make your own modifications; they are also a great learning resource.
  • Download from PartnerSource


Creating Perspectives for Report Models

  • Report models allow you to enable end users to perform ad-hoc reporting by providing them with a logical interface to the database tables. Report models can include multiple tables, relationships, and calculated attributes (fields).
  • To be efficient, you may find that you build one report model that actually serves a number of audiences that all work with a subset of the model. This is where perspectives can be helpful. You can define multiple perspectives within a report model, that present a subset of the report model to the end user.
  • For example, let's say that I have a report model that includes purchasing information: purchase orders, payables checks, vendor information, item information, and distribution information. The purchasing team may only be interested in purchase orders, vendor information, and item information, while the finance team may be most interested in purchase orders, payables checks, and distribution information. You can set up two different perspectives to present these subsets of information to the end users, making it easier for them to work with the model because they do not have to ignore the additional items that do not apply to them.

To create perspectives:

  • Open your report model in Visual Studio
  • Select Model in the pane to the left of the entity list
  • Right click and choose New>>Perspective
  • By default all entities and attributes will be marked to be included in the perspective, unmark the items that should be excluded
  • Click OK to save
  • Repeat as necessary

When users select a Report Model in Report Builder, they can expand the report model to view any perspectives that have been defined for the model.

So, that is my wonderful list. How about yours? Feel free to share your tips and tricks, and I will update the post. In the meantime, happy reporting...


Thursday, April 16, 2009

More than I ever wanted to know about eConnect and MSDTC

In my last post, I discussed the eConnect error that returned "The transaction has already been implicitly or explicitly committed or aborted."

I found that using the Enlist=False parameter on the eConnect connection string would sometimes make the error go away, but at the risk of incomplete transaction handling.

The second possibility was that the MSDTC Transaction Manager Communication security setting needed to be changed to "No Authentication Required."

And the third possibility was that a firewall or network configuration issue was causing a problem.

Today I worked with the client to troubleshoot the issue, and it turns out that the Enlist=False trick no longer worked. We were now receiving the error with Enlist=False. I then tried setting both the server and client workstation to No Authentication Required. But that didn't work.

So, on to Plan C, to research what possible network configuration issues could cause the error.

While reading various forums, I came across some links to two free Microsoft tools that help diagnose issues with DTC network communications: DTCPing and DTCTester

DTCPing tests several lower level types of network communication between two machines to diagnose possible configuration issues that prevent DTC from working properly.

DTCTester operates at a higher level and connects to SQL Server, creates a temp table, creates a transaction as it inserts data, and then commits the transaction, utilizing DTC in the process.
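Both are command-line tools. If I am remembering the DTCTester syntax correctly, it takes an ODBC DSN name plus SQL credentials, so the values below are placeholders:

dtctester TestDSN sa password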

Before testing it in the client's environment, I tested it on my development machines. Between two Windows Server 2003 Hyper-V virtual servers, after a few tweaks and adjustments, I was able to get DTCPing to work fine, showing proper communication between the two servers.

When I tested it between my physical Windows XP workstation and a Windows Server 2003 Hyper-V virtual server, I was able to get some communication, but ultimately received an error.

On my XP workstation, I got:

RPC server is ready
Please Start Partner DTCping before pinging
++++++++++++Validating Remote Computer Name++++++++++++
Invoking RPC method on server1
RPC test is successful
++++++++++++RPC test completed+++++++++++++++
++++++++++++Start DTC Binding Test +++++++++++++
Trying Bind to server1
Binding call to server1 Failed
Session Down


And on the server, I got:

RPC server is ready
Please Start Partner DTCping before pinging
Please hit PING button to complete the test
++++++++++++Start Reverse Bind Test+++++++++++++
Received Bind call from XP
Trying Reverse Bind to XP
Error(0x5) at ServerManager.cpp @453
-->RPC reverse BIND failed
-->5(Access is denied.)
Reverse Binding to XP Failed
Session Down


Even after turning off all firewalls, verifying IP addresses, and checking configurations, I don't know the cause of the Access denied message. But since I knew two of my servers could communicate without errors, I tried the DTCPing tool in the client environment.

UPDATE: Reader 'Maxim' provided a solution for the Access Denied error. In HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows NT\RPC, add or change the DWORD value "RestrictRemoteClients" = 0. This is apparently a network security / firewall feature that was enabled in Windows XP SP2 to restrict anonymous RPC access to a machine.
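If you prefer to script the change, a reg command along these lines should apply the same value (a reboot is typically required before RPC settings take effect):

reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\RPC" /v RestrictRemoteClients /t REG_DWORD /d 0 /f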

On the XP client machine, the DTCPing had no errors, but when I ran it on the server, trying to connect to the client, I received the following error:

Error(0xB7) at nameping.cpp @43
-->gethostbyname failure

I then tried to ping the client from the server, and sure enough, no response. After checking the IP addresses, I found that the server was resolving the client workstation name to the wrong IP address, presumably the result of the network move that had taken place recently. It appeared that the internal DNS server had not been updated with the new IP address.

As a temporary solution, I added a record to the hosts file under C:\Windows\System32\drivers\etc, listing the actual IP address of the workstation along with its fully qualified domain name (i.e. XP.DOMAIN.local). I then ran nbtstat -R to purge the name resolution cache, and then performed a ping to confirm that the server would now use the proper IP address.
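For reference, the hosts entry was a single line along these lines (the IP address shown is a placeholder), followed by the cache purge and the ping test:

192.168.1.25     XP.DOMAIN.local

nbtstat -R
ping XP.DOMAIN.local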

I wasn't sure if that was the cause of the problem, so I went ahead and tested with DTCTester, which completed its test successfully.

We then ran the eConnect integration, and it appeared to work fine--no more transaction errors!

Although it seems like an odd cause of the issue, I'm hopeful that the IP address fix was the solution and that the error will no longer occur. But the client has at least 35,000 more transactions to import, so I'll find out soon enough.

Tuesday, April 14, 2009

eConnect Error: "The transaction has already been implicitly or explicitly committed or aborted."

A client has been using an eConnect integration for several months now without issue. Recently, the client had to move their servers to a new data center, and now for some reason, the following error occurred when the integration attempted to create a new GL account:

"The transaction has already been implicitly or explicitly committed or aborted."

The error causes an exception to be thrown during the eConnect_EntryPoint call, and offers no other information.

I don't yet fully understand the underlying cause of the error, but have come across two possible solutions.

The first, and apparently most common, means of solving the problem is to add "Enlist=False" to the eConnect connection string. Unfortunately, I haven't been able to find a good explanation of the exact mechanics of how this parameter affects connections and connection pooling with DTC and COM, but it modifies the way that eConnect uses connections and invokes transactions.
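For reference, a typical eConnect connection string with the parameter added would look something like this (the server and database names are placeholders):

data source=GPSERVER\SQLINSTANCE;initial catalog=TWO;integrated security=SSPI;persist security info=False;Enlist=False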

This approach did make the error go away, but after reading more on the issue, it seems that this solution only eliminates the error and does not solve the underlying problem. Disabling transaction enlistment can potentially expose you to partial transactions and corrupt data.

The second solution is discussed on this forum thread. The last two posts mention that changing the MS DTC Security Settings to "No Authentication Required" may be the preferred method for resolving the underlying problem.

That particular authentication setting, under the "Transaction Manager Communication" settings, is typically set based on your network configuration. The eConnect Install and Admin guide offers the following instructions:


Configuring DTC

If the two computers are in the same domain, the default configuration for DTC can be used with eConnect. If you have made modifications to the security configuration for DTC, you must be sure the following settings are used:

  • Network DTC Access enabled
  • Allow Inbound communications
  • Allow Outbound communications
  • Mutual Authentication Required (when running in a domain environment)
  • No Authentication Required (when running in a Windows workgroup environment or the client machine is pre-Windows XP SP2)

If the two computers are in a Windows workgroup, or are in domains that do not have an established trust relationship, update your DTC security configuration to use No Authentication Required.


The key points are the last two bullets. When you are running in a single domain environment, or in a multi-domain environment where a trust has been established, Mutual Authentication should work. You will otherwise need to use No Authentication Required.

Finally, the same thread has a pointer over to an MSDN forum which discusses the issue in a non-eConnect context, and indicates that a firewall may cause DTC communication to be blocked.

Given what I've read, my current assumption is that when my client's servers were moved to the new data center, a new domain relationship was set up, or a network configuration change occurred that caused the DTC communication to stop functioning properly. The client workstation and GP server are both set to "Mutual Authentication Required", so this could be the real issue.

Later this week I'm going to modify my code to remove the Enlist=False connection parameter and switch both machines to No Authentication Required and see if that works.

Reset GP During AddIn Testing

Dynamics GP AddIns are great. You can use Visual Studio and your preferred .NET language to create powerful enhancements for Dynamics GP. And the AddIns are neatly packaged in a DLL file that you simply need to copy into the GP AddIns folder.

But, of course, there are some downsides, one of which is testing and debugging. One issue is that if you need to modify your code, you need to recompile the DLL file, close GP, copy the new DLL file, then re-launch GP. It's tedious and time consuming.

If you aren't accessing any GP windows or Dynamics objects, you can write a wrapper EXE to test your DLL independent of GP, but if you are using Dynamics objects, you need to update your DLL and test in GP. After closing and launching GP a dozen times, you'll be fed up.

Following on my solution for resetting GP when testing with Integration Manager, I've created a very similar solution for testing GP AddIn DLLs.

I needed the following:
  1. Close GP
  2. Copy my new DLL to the AddIns folder
  3. Launch GP and login again
For this particular project, I'm developing on Windows Server 2008 with SQL Server 2008 and GP 10, and a few things behaved differently, so I had to make a few minor changes to my prior script. I didn't note all of the differences, but the primary issue was getting GP to automatically launch and run the login macro using the Start command, which I'm guessing is something about Windows 2008.

You will want to complete all of the same preparatory steps in my prior post, and then use the scripts below.

I used the following batch file:

:: Kill any running Dynamics GP instance (pskill is a Sysinternals tool, located in C:\ here)
C:\pskill dynamics.exe

:: Remove the stale user activity record left behind by the killed session
sqlcmd -S SQL2008\GP10 -E -Q "DELETE FROM DYNAMICS.dbo.ACTIVITY WHERE USERID = 'sa'"

:: Copy the freshly compiled AddIn DLL into the GP AddIns folder
xcopy "C:\Users\sendow\Documents\Visual Studio 2005\Projects\MyAddIn\bin\Debug\MyAddIn.dll" "C:\Program Files\Microsoft Dynamics\GP10\AddIns\" /Y

:: Launch GP with the full path to the Dynamics.set file and run the login macro
start Dynamics.exe "C:\Program Files\Microsoft Dynamics\GP10\Dynamics.set" C:\Login.mac

And here is the login.mac file that worked for me: (change the password and company ID to suit your environment)

Logging file 'none.txt'
CheckActiveWin dictionary 'default' form Login window Login
MoveTo field 'User ID'
TypeTo field 'User ID' , 'sa'
MoveTo field Password
TypeTo field Password , 'password'
CheckActiveWin dictionary 'default' form Login window Login
MoveTo field 'OK Button'
ClickHit field 'OK Button'
NewActiveWin dictionary 'default' form 'Switch Company' window 'Switch Company'
ClickHit field '(L) Company Names' item 1 # 'Fabrikam'
MoveTo field 'OK Button'
ClickHit field 'OK Button'
NewActiveWin dictionary 'default' form sheLL window sheLL
NewActiveWin dictionary 'default' form sheLL window sheLL
ActivateWindow dictionary 'default' form sheLL window sheLL



Notice that on the last line of the batch file, I had to specify the full path to the Dynamics.set file, even though its folder is in the PATH variable, just like Dynamics.exe.

Once the batch file has been created, just create a shortcut to it on your desktop or in your QuickLaunch toolbar.

Now with a single or double click, GP is closed, your recompiled DLL is copied to the AddIns folder, and then GP is launched.

It's not quite as easy as just clicking Run in Visual Studio and debugging directly, but it saves time and a lot of hassle.

UPDATE: One caveat that I forgot to mention on my prior post as well as this one: You will want to make sure and close all GP windows before running the batch file, otherwise you can end up with invalid records in your DEX_LOCK table. If you don't want to close the windows, you could add an additional line to the batch file to delete the records in DEX_LOCK, as sketched below.
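Assuming the standard configuration where the DEX_LOCK table lives in the tempdb database (verify this in your environment), a line like the following should do it:

sqlcmd -S SQL2008\GP10 -E -Q "DELETE FROM tempdb..DEX_LOCK"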

Of course, these scripts and this process are intended only for use on an isolated development workstation and should not be used in production.

Monday, April 13, 2009

Experts Exchange

Have you ever been working on a project and run into a technical challenge / hurdle / problem that exceeded your knowledge or skill in a specific area? At those times, you wish that you could pick up the phone and have an hour, or maybe just a few minutes with an expert that can get you past the problem and back on track. Sometimes, you would be willing to pay a fair amount of money to have access to that expert right away, and get a quick response.

Several months ago I was asked to develop a solution that would allow a client to export 401k data from Dynamics GP payroll so that the client could submit the data to a 3rd party benefits administrator. Not a problem in concept, as I've done it several times before. The initial query came together quickly, but then I ran into something that I had never needed to do with SQL.

The export needed to indicate whether employees had 401k loan repayments. If the employee had a 401k loan, I needed to display the repayment amount deducted from each pay check. That requirement wasn't difficult, but employees could be repaying multiple loans, so the requirement to display up to 5 different 401k loan deductions for a single paycheck as 5 separate columns on a single output row was no easy task.

One challenge was that the loan deductions were recorded using many different pay codes, all of which started with "LN". The second challenge was that a given pay check may have anywhere from 0 to 5 loan repayment deductions. And to top it off, I had to figure out a way to transpose an arbitrary number of rows into 5 fixed columns. And don't forget that I needed to be able to include this in an existing query that was exporting dozens of other employee and pay check related fields.
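To give a sense of the general technique, here is a sketch of one common approach using ROW_NUMBER and MAX(CASE). This is not the exact solution the experts provided, and the table and column names are only illustrative:

WITH LoanDeductions AS
(
    SELECT EMPLOYID, AUCTRLCD, PAYROLCD, UPRTRXAM,
        ROW_NUMBER() OVER (PARTITION BY EMPLOYID, AUCTRLCD ORDER BY PAYROLCD) AS LoanSeq
    FROM UPR30300 -- payroll transaction history (illustrative)
    WHERE PAYROLCD LIKE 'LN%'
)
SELECT EMPLOYID, AUCTRLCD,
    MAX(CASE WHEN LoanSeq = 1 THEN UPRTRXAM END) AS Loan1,
    MAX(CASE WHEN LoanSeq = 2 THEN UPRTRXAM END) AS Loan2,
    MAX(CASE WHEN LoanSeq = 3 THEN UPRTRXAM END) AS Loan3,
    MAX(CASE WHEN LoanSeq = 4 THEN UPRTRXAM END) AS Loan4,
    MAX(CASE WHEN LoanSeq = 5 THEN UPRTRXAM END) AS Loan5
FROM LoanDeductions
GROUP BY EMPLOYID, AUCTRLCD

Joined back to the main pay check query, something like this yields the 0 to 5 loan amounts as fixed columns on a single row per check.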

So there I was, stuck, wondering how in the world I would figure this out without completely blowing my very small budget. I needed some help, I needed it quickly, and I didn't want to pay hundreds of dollars to solve this very specific problem.

I then remembered Experts Exchange. This is an amazing member-based web site that allows you to post very detailed questions about your specific problem, and offer "points" to site members if they can help you solve your problem. Most problems receive a response in 3-4 hours, but you can increase the points that you offer depending on the complexity or urgency of your issue. Of course, you must purchase points as part of a monthly fee, and you may need to buy extra points if you run out or need to offer extra points.

When I created and posted my question, I offered 500 points (the equivalent of $6.50), and in less than 3 hours, I received two amazing responses, both of which solved my problem using two different approaches. I was able to copy and paste the code right into the middle of my query, change the table names, and it worked perfectly.

The solution I chose used at least two SQL 2005 T-SQL techniques that I had never used before, and would have taken me many hours of trial and error to produce even if I were given some initial hints. The help from a SQL expert saved my budget, and helped me deliver a more efficient and effective solution than I would have been able to deliver.

There are a few caveats, of course. I have since learned that my wonderful first experience might not have been typical. A few questions I have posted have received mediocre or mixed responses, and a few obscure ones have received no valuable responses. But for mainstream issues like SQL, .NET development, Windows issues, common networking issues, etc., there are some amazing experts available to help. I do see that there are many questions regarding Dynamics GP and eConnect as well.

And with each question that is solved, the massive knowledge base provided by the web site continues to grow. Many times you can simply search the existing knowledge base and find the answer to your issue, with valid solutions pre-screened and tested for you.

If you tend to run into a lot of different technical issues at various client sites, I would recommend checking out Experts Exchange to see if the problem has already been solved.

It looks like they currently offer a monthly plan for $12.95 a month, with discounts if you subscribe for 6 months or more. The way I look at it, if a solution can help me avoid going over my project budget, the $13 is well worth the investment.

Searching for SQL Objects and Data

If you've spent much time working with Dynamics GP and SQL Server, you've likely needed to search for a table, stored procedure, table column, or specific data value. Many times while tracing a problem with a 3rd party module, I've had to find a table, but didn't remember the table name or the database in which the object was stored.

There are several ways to search SQL Server for an object. Because I find it the simplest to remember, and think its results are the easiest to read, I have used the INFORMATION_SCHEMA views for many years. I typically know whether I'm searching for a table vs. stored procedure vs. field, so one of the INFORMATION_SCHEMA views usually works fine for me.
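A few illustrative searches using those views (the name patterns are just examples):

SELECT TABLE_CATALOG, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME LIKE 'RM%'

SELECT TABLE_NAME, COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE COLUMN_NAME = 'CUSTNMBR'

SELECT ROUTINE_NAME FROM INFORMATION_SCHEMA.ROUTINES WHERE ROUTINE_TYPE = 'PROCEDURE' AND ROUTINE_NAME LIKE 'ta%'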

Other people prefer to use sysobjects, now called sys.objects in current versions of SQL Server (sysobjects was deprecated in SQL 2005 and should be replaced with sys.objects). sys.objects is handy if you don't know the type of object you are looking for, or if you want to search for objects regardless of type.
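For example, a sketch of a type-agnostic search against a single database (run it in each database of interest; the name pattern is just an example):

SELECT name, type_desc FROM sys.objects WHERE name LIKE '%SOP%' ORDER BY type_desc, name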

If you frequently need to search for objects, a reader of SQL Server Magazine has contributed this handy script that allows you to search across all objects across all databases in your SQL instance based on a partial name.

A few times I have been in a situation where I needed to find all records in all tables in a database that contained a specific value, independent of field name. In those cases, it is possible to search all fields in all tables for a given value. It may take quite a while to search a large database and may return a lot of data, so obviously it isn't something you want to do on a regular basis, but it can be a life saver when you need it.