I recently was asked to help create Tax Schedules in Dynamics GP based on a spreadsheet of customers, zip codes, and corresponding city, state, and local tax rates. I created a series of scripts to load the information from Excel, and I thought I would share these with you all in case anyone else could benefit from them :) Now, in this case, the tax details were already loaded to Dynamics GP and the spreadsheet referenced the Tax Detail IDs.
Of course, the big disclaimer: you should always test, test, test, and backup, backup, backup when using these scripts. I created a series of these scripts using the CONCATENATE function in Microsoft Excel to pull in the data needed (an example formula is shown after the scripts below). These scripts also assume that the tax schedules in GP have IDs that correspond to the customer zip codes.
--update descriptions for tax details, TX00201, as they were loaded with a generic ID
Update TX00201 Set TXDTLDSC='locality/tax detail description' where TAXDTLID='tax detail ID'
--create TX00102, Tax Schedule relationship to Tax Detail
--assumes static values for TXDTLBSE, TDTAXTAX, Auto_Calculate fields, if in doubt that these are correct for your situation, manually set up a tax detail and check the TX00102 to confirm the proper values
INSERT INTO TX00102 (TAXSCHID, TAXDTLID, TXDTLBSE, TDTAXTAX, Auto_Calculate) values ('zipcode/tax schedule ID', 'taxdetail ID',3,0,0)
--create TX00101, Tax Schedule records
INSERT INTO TX00101 (TAXSCHID, TXSCHDSC) values ('zipcode/tax schedule ID','area description/tax schedule description')
--update customer master tax schedule where the zip code (left 5 characters) matches a tax schedule in GP
update RM00101 set rm00101.TAXSCHID=tx00101.TAXSCHID from RM00101 inner join TX00101 on left(RM00101.ZIP,5)=tx00101.TAXSCHID
--update customer master address tax schedule where the zip code (left 5 characters) matches a tax schedule in GP
update RM00102 set rm00102.TAXSCHID=tx00101.TAXSCHID from RM00102 inner join TX00101 on LEFT(rm00102.ZIP,5)=tx00101.TAXSCHID
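As an illustration of the CONCATENATE approach, a formula along these lines (assuming, purely for the example, that the zip code/tax schedule ID is in column A and the tax detail ID is in column B of your spreadsheet) will generate the TX00102 statements; fill it down and each row produces one ready-to-run insert:
=CONCATENATE("INSERT INTO TX00102 (TAXSCHID, TAXDTLID, TXDTLBSE, TDTAXTAX, Auto_Calculate) values ('",A2,"','",B2,"',3,0,0)")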
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.
Thursday, February 23, 2012
Forecaster Database Transfer Tool Permissions
A shout-out to Jake Friedl at Microsoft for pointing us in the right direction on this one. If you are using Forecaster with Management Reporter, you probably have been faced with using the Forecaster Database Transfer Tool to move the budget information from Forecaster to Microsoft Dynamics GP so that you can report on it out of Management Reporter.
Generally, this works great and is pretty easy to use. However, the need to run the tool on a regular basis to get the latest numbers on reports makes me miss FRx and DirectLink-- I cannot believe I just admitted to that :) It may make sense to give your budget administrators permission to use the Transfer Tool to update the balances whenever they need to. Since the tool starts by asking you for credentials for both the Forecaster and Dynamics GP databases, you will need to set up the proper permissions in SQL Server for a non-administrator user to run the tool successfully.
Since these steps involve modifying stored procedures as well as permissions, it is recommended that you be comfortable with SQL Server before proceeding and have recent backups of your Dynamics GP company and Forecaster databases.
Also, a side note: the first time you access the Forecaster Database Transfer Tool, there are some SQL Server objects that need to be created, so the first user who runs the tool must have the dbo or db_ddladmin role in both the Dynamics GP and Forecaster databases. Otherwise, the user can use the Create to File function to send the scripts to an administrator to be run. I am NOT recommending that you give either of these roles to a non-IT administrator, just that you might want to have an IT/SQL administrator run through the transfer the first time to avoid these issues.
So, let’s start with the GP side of things….
1. The user login that will be used to access the Forecaster Data Transfer Tool needs to be set up as a login in SQL Server under Security-Logins. This can be a Windows user or a specific SQL user, depending on how you want to administer the account. It CANNOT be an existing Dynamics GP login. When you set up the login, it must be mapped to your Dynamics GP company database with the DYNGRP role in the User Mapping section of the Login Properties window. (If you would rather script this step, there is a sketch after the permission scripts in step 3.)
2. You need to modify the stored procedure FCBUDGETIMPORT in your Dynamics GP company database to comment out the last section. To do this, expand your Dynamics GP company-Programmability-Stored Procedures and locate FCBUDGETIMPORT. Right-click and choose Modify. Once you comment out the last section as noted in the screenshot below, click the Execute icon at the top of the Query window. This modification ensures that an error will not be produced each time the stored procedure is executed.
3. Update the permissions for the DYNGRP role by running the following scripts, substituting your Dynamics GP company database name for [GPDatabase] in the first two.
USE [GPDatabase]
GRANT ALTER ON dbo.EE_GL00201 to DYNGRP
GO
USE [GPDatabase]
GRANT EXECUTE ON dbo.FCBUDGETIMPORT to DYNGRP
GO
USE DYNAMICS
GRANT ALTER ON dbo.SA_Forecaster7ImportErrorLog to DYNGRP
GO
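And as mentioned in step 1, if you would rather script the login setup than click through the Logins window, a sketch along these lines should do the same thing. GPBudgetUser and its password are just placeholders, this version assumes a SQL login rather than a Windows account, and [GPDatabase] is again your Dynamics GP company database:
USE [master]
--Placeholder SQL login; substitute your own name and a real password
CREATE LOGIN GPBudgetUser WITH PASSWORD = 'StrongPasswordHere'
GO
USE [GPDatabase]
--Map the login to the GP company database and add it to DYNGRP
CREATE USER GPBudgetUser FOR LOGIN GPBudgetUser
EXEC sp_addrolemember 'DYNGRP', 'GPBudgetUser'
GO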
On the Forecaster side, things are a bit simpler…
1. The user must be added to the Forecaster database with at least the db_datareader role. To do this, expand Security-Logins, right-click the user, and choose Properties; the database and role assignment is in the User Mapping section.
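This step can also be scripted if you prefer (same placeholder login name as above, substituting your Forecaster database name for [ForecasterDatabase]):
USE [ForecasterDatabase]
--Map the login to the Forecaster database with read-only access
CREATE USER GPBudgetUser FOR LOGIN GPBudgetUser
EXEC sp_addrolemember 'db_datareader', 'GPBudgetUser'
GO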
Good luck! Let me know if you run into any issues, and thanks again to Jake for sharing the scoop!
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.
Friday, February 10, 2012
Update on Management Reporter Dimension Sets and AX
A quick update on an earlier post, http://dynamicsgpland.blogspot.com/2012/02/management-reporter-dimension-sets-and.html, on an issue with Dimension Sets not working when using Management Reporter with AX2012. Turns out, it is now an official quality report and should be resolved in MR2012 coming in March. Let's keep our fingers crossed!
The combination of not being able to use Dimension Sets and the 512-character limit on the string created by the dimensions (e.g., [Main Account] = 200050) in a row, column, or tree makes for some interesting report design (lots of individual NP rows followed by a total) when you are working with a large/complex chart of accounts.
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.
Thursday, February 9, 2012
Importing 725,000 Records Into Dynamics GP In 30 Seconds
At the MS Dynamics World Decisions Fall 2011 virtual conference, I gave a presentation titled Importing 10 Million Transactions Into Dynamics GP. While the project discussed in that presentation was challenging, I was recently faced with a slightly different high-volume integration challenge.
The client works with several large trading partners, each of which provides a consolidated monthly payment, along with a data file that is effectively the remittance advice for the payment, indicating how the check should be applied to thousands of customers. But this isn't a typical cash receipt. A single payment may be up to $4 million, and the remittance advice is a 140 megabyte text file containing 725,000 rows of detailed data relating to thousands of invoices. And that is just one of their trading partners.
The accounting staff have been manually processing these massive data files in Excel, manually entering thousands of transactions each month, then manually applying the payments to outstanding invoices, and then manually entering adjustments or memos for over-payments or under-payments. Due to the highly detailed format of the remittance data file and quirks in the data, there was no way to directly import the file using a tool like Integration Manager, and even if they did manage to import the cash receipts, they would still have to apply the payments and manually enter the adjustments.
Although I have developed automated integrations that can process tens of thousands of transactions over the course of a day, I don't think I've ever had to process hundreds of thousands of records at once--while a user was waiting and watching the user interface. It was obvious that performance was going to be a challenge.
I thought about a few different designs, and came to the conclusion that the best approach would be to get the raw data into a SQL table first, and then process it further from there. I briefly considered loading the entire file into memory and then importing it, but I need to perform several operations and transformations on the data that would be impractical and inefficient using ADO.NET.
So, I need to load a 140 megabyte CSV file into SQL Server programmatically. Although I loved DTS, I am not a fan of SSIS, so programmatically running an SSIS job to bring the file in did not appeal to me--not worth the hassle. However, I do have a C# class that does an excellent job of parsing delimited text files, and is blazing fast (it can load the 140MB CSV file into memory in 2 seconds). So, I can read the text file and get it into a DataTable in memory, but then what? How do I push that massive data table into SQL?
Performing over 700,000 separate insert statements is clearly out of the question--I didn't even bother to test it since I knew that it would take far too long. I needed something faster...much faster...
After doing a little searching, I came across the .NET SqlClient SqlBulkCopy class. Faaaaantastic. All of the brains of the famous bcp utility, but in a stylish .NET package. Just 5 lines of code.
using (System.Data.SqlClient.SqlBulkCopy bulkcopy = new System.Data.SqlClient.SqlBulkCopy(gpConn))
{
    //Allow up to 5 minutes for the bulk load
    bulkcopy.BulkCopyTimeout = 300;
    //Staging table that will receive the raw CSV rows
    bulkcopy.DestinationTableName = tableName;
    //Stream the entire DataTable to SQL Server in one bulk operation
    bulkcopy.WriteToServer(dataTable);
    gpConn.Close();
}
With those few magical lines of code, SqlBulkCopy pushes all 725,000 records into SQL Server in 30 seconds (on my relatively modest virtual server).
Because the client (thankfully) doesn't want to actually import 725,000 separate cash receipts, they summarize the data by customer and month, which results in 2,100 summary cash receipts. Because I have the data in SQL now, I'm able to use a query to summarize the data. But even there I run into a performance issue.
Because there is so much data to summarize, my query initially took several minutes to run, and sometimes even caused my .NET app to receive a command timeout error. Having read many articles about SQL query optimization, I knew that my best bet was to create a covering index on my data table. Sure enough, after adding the covering index, the query dropped to 4 seconds.
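To give a rough idea of what those two pieces look like, here is a sketch; the staging table and column names (csRemittanceStaging, CustomerID, RemitDate, PaymentAmount) are made-up placeholders, not the client's actual layout:
--Covering index so the summary query can read the index instead of scanning all 725,000 rows
CREATE NONCLUSTERED INDEX IX_csRemittanceStaging_Summary
ON dbo.csRemittanceStaging (CustomerID, RemitDate)
INCLUDE (PaymentAmount)
GO
--Summarize the raw remittance rows into one cash receipt per customer per month
SELECT CustomerID, DATEPART(yyyy, RemitDate) AS RemitYear, DATEPART(mm, RemitDate) AS RemitMonth, SUM(PaymentAmount) AS ReceiptAmount
FROM dbo.csRemittanceStaging
GROUP BY CustomerID, DATEPART(yyyy, RemitDate), DATEPART(mm, RemitDate)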
So in roughly 45 seconds, the 140MB file is loaded and converted into 2,100 cash receipt transactions, ready to be handed off to eConnect. eConnect then becomes the bottleneck, taking a few minutes to create the cash receipts, apply the payments to invoices, and then create debit and credit memos for any over or under payments.
Needless to say the customer is amazed and thrilled with the new solution. Compared to the manual process that they have been performing to get the data into GP, the custom integration is like magic.
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
Wednesday, February 8, 2012
Update Dynamics GP Notes Using SQL
By Steve Endow
In the last few weeks I've had two integration projects that required me to update notes in Dynamics GP.
One customer needed to import note updates from another system into existing Service Call notes in Dynamics GP. The external system was able to provide the data in a SQL staging table, and I had to import the note data into Service Calls via SQL, appending the information to existing notes, or creating new notes if they didn't already exist.
The second was an eConnect 2010 RM cash receipt import where I needed to attach a note to some of the cash receipts. Imagine my surprise when I realized that the RM Cash Receipt eConnect transaction does not have a NOTETEXT field! Why not??? eConnect inserts a cash receipt record with a NOTEINDX value of zero! Who's the summer intern who cut corners?
I looked for a stored procedure that might handle the note inserts and updates, but a trace of the GP windows seemed to indicate that GP just used straight INSERT and UPDATE statements. But the GP note windows always send the full note text in the SQL for their updates, which is straightforward. I needed the ability to insert a new note as well as append text to an existing note--ideally without retrieving the existing text first.
So I searched for a note insert or update stored procedure in the GP databases but found no such thing. After a few attempts to find a solution online, I gave up and decided to roll my own.
It has been years since I have had to deal directly with SQL Server text fields. Although it isn't "hard", the syntax to update a text field is far from obvious. But with minimal Google-fu skills, you will find plenty of examples online.
To make my life easier, I packaged up my script into a stored procedure that accepts the note index and the note text to append. Also for simplicity, I use a varchar(8000) parameter for the text update. And finally, I include newline characters to add white space before note updates.
IF EXISTS(SELECT * FROM sys.objects WHERE type = 'P' AND name = 'csspInsertUpdateNote')
    DROP PROCEDURE csspInsertUpdateNote
GO

CREATE PROCEDURE csspInsertUpdateNote
    @NOTEINDX AS int,
    @NOTETEXT varchar(8000)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @TXTFIELD varbinary(16);
    DECLARE @LENGTH int;
    DECLARE @CRLF varchar(10);
    DECLARE @APPEND varchar(8000);
    DECLARE @GPVERSION int;

    --GP 2010 uses CRLF for notes, whereas GP 9 and 10 use CR only
    SELECT TOP 1 @GPVERSION = versionMajor FROM DYNAMICS..DU000020 WHERE COMPANYID = -32767 AND PRODID = 0
    IF @GPVERSION >= 11
        SET @CRLF = CHAR(13)+CHAR(10);
    ELSE
        SET @CRLF = CHAR(13);

    SET @APPEND = @CRLF + @NOTETEXT;

    --Check if a note record exists
    IF (SELECT COUNT(*) FROM SY03900 WHERE NOTEINDX = @NOTEINDX) = 0
    BEGIN
        --If not, insert a new note record with the update
        INSERT INTO SY03900 (NOTEINDX, DATE1, TIME1, TXTFIELD) VALUES (@NOTEINDX, DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())), CONVERT(VARCHAR(8),GETDATE(),108), @NOTETEXT)
    END
    ELSE
    BEGIN
        --If so, update the existing note
        --Get the text pointer for the note
        SELECT @TXTFIELD = TEXTPTR(TXTFIELD), @LENGTH = DATALENGTH(TXTFIELD) FROM SY03900 WHERE NOTEINDX = @NOTEINDX;
        UPDATETEXT SY03900.TXTFIELD @TXTFIELD @LENGTH 0 @APPEND;
    END
END
GO

GRANT EXEC ON csspInsertUpdateNote TO DYNGRP
GO
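Calling the procedure is then a one-liner; the note index and text here are just example values:
EXEC dbo.csspInsertUpdateNote @NOTEINDX = 123456, @NOTETEXT = 'Applied by the nightly cash receipt integration'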
And if you have to generate note indexes like I did for the RM Cash Receipt transactions, you can use the DYNAMICS.dbo.smGetNextNoteIndex stored procedure.
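I'm going from memory on the parameter list for smGetNextNoteIndex, so verify it against the actual procedure in your DYNAMICS database before relying on this, but the call looks roughly like this:
DECLARE @CompanyID smallint
DECLARE @NoteIndex numeric(19,5)
DECLARE @ErrorState int
--Look up the company ID for the current GP company database
SELECT @CompanyID = CMPANYID FROM DYNAMICS.dbo.SY01500 WHERE INTERID = DB_NAME()
--Reserve the next note index for this company
EXEC DYNAMICS.dbo.smGetNextNoteIndex @CompanyID, 1, @NoteIndex OUTPUT, @ErrorState OUTPUT
SELECT @NoteIndex AS NOTEINDX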
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified IT Professional in Los Angeles. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
Friday, February 3, 2012
Management Reporter Dimension Sets and Dynamics AX
Okay, okay, I am not AX knowledgeable...but I have been working on some Dynamics AX implementations recently with Management Reporter. And I came across an issue that I thought I would share in case there are others out there experiencing the same thing.
We created a number of reports with dimension sets, prior to there being data in the system. So when there was data loaded, we started running the reports...and there were no numbers! Yikes. We determined that the rows without data were all using dimension sets. And if we used the actual dimensions instead of dimension sets, we would get data.
Erik Johnson from Microsoft took a look with me today, and confirmed that it seemed to be affecting any dimension set with more than one dimension in it. So, for example, if I set up a dimension set with one department only or one range of departments, it works. With more than one department, it doesn't.
It seems like it may be an issue with the most recent version of the data provider for Dynamics AX, as it can't be replicated on earlier versions of the provider. The version that seems to be affected is 2.0.1700.75.
I'll share more info as I get it! Have a great weekend!
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.