eConnect for Microsoft Dynamics GP is fantastic. It's easy for a .NET developer to use, it's fairly comprehensive, it's fairly well documented, and it's relatively fast. Except for the occasional missing field, unsupported transaction or bug, I love working with it.
Unfortunately, I have found that among non-technical folks there is often a misconception about what it takes to develop a complete custom eConnect integration. My guess is that they assume eConnect should be no different than Integration Manager, where you can produce a fancy integration in a few hours or less.
It's rarely that simple. While trying to explain to a GP partner why a POP + SOP + Project Accounting integration would really take 40 hours to develop, I came up with this way of describing it: eConnect is the easy part. That's the 10% of the equation. It's a standard API with good documentation. It's a known quantity. The other 90% is dealing with the client's source data file, developing business logic for processing the data, and handling exceptions in that data. And don't forget testing, debugging, minor changes, installation, and documentation.
Sure, I have had situations where customers provided a very simple and clean data file that went straight into GP. No data transformations, no lookups, no exceptions or errors--a perfect integration. Those are simple and easy. But those situations are generally rare--in part because such integrations can often be done with Integration Manager or even a table import (not that I'd recommend a table import...).
The real power and some of the greatest benefits of custom eConnect integrations come with complex integrations. And with such integrations, it isn't eConnect that provides the benefit as much as all of the extra logic that can be included in a custom integration. For example, importing a customer with eConnect is essentially always the same. Customer object, address object, populate the properties, and send it off to eConnect. Simple.
But when the customer data is in multiple files, or different data formats, or the address isn't parsed at all or is incomplete, or the data needs transformation, or has exceptions that need to be handled, that is when additional custom .NET code can come into play to make the integration much more valuable than a generic data import.
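To make the "easy 10%" concrete, here is a rough sketch of what that customer import pattern looks like. It builds the XML document that eConnect consumes; the node names follow the eConnect customer schema (RMCustomerMasterType / taUpdateCreateCustomerRcd), but the field list is illustrative and far from complete, and in a real integration the resulting string would be handed to the eConnect API along with a connection string.

```csharp
using System;
using System.Xml.Linq;

static class CustomerXmlBuilder
{
    // Builds a minimal eConnect customer XML document. In a real
    // integration this string would be passed to the eConnect service;
    // here we only illustrate the "populate and send" pattern.
    public static string Build(string customerId, string customerName,
                               string address1, string city)
    {
        var doc = new XDocument(
            new XElement("eConnect",
                new XElement("RMCustomerMasterType",
                    new XElement("taUpdateCreateCustomerRcd",
                        new XElement("CUSTNMBR", customerId),
                        new XElement("CUSTNAME", customerName),
                        new XElement("ADRSCODE", "PRIMARY"),
                        new XElement("ADDRESS1", address1),
                        new XElement("CITY", city)))));
        return doc.ToString();
    }
}
```

The point is how mechanical this part is: once the properties are populated, eConnect does the rest.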
A timely example: I had a phone call today with a client that receives over 600,000 cash receipts every month, a number that is rapidly growing. What they actually receive is a single giant payment from an external trading partner, and that trading partner also provides them with a 150MB text file containing the list of every customer payment. Not only do these cash receipts need to be entered, but they also need to be applied to invoices--so think of it as over 1,000,000 transactions. Every month. They would like to automate this process to eliminate the 3 full days of work that are required to enter and apply the cash receipts. In concept that doesn't sound so bad, and eConnect can obviously handle that.
So I then ask if there are any exceptions in the data that are handled manually, or problems that require attention before the cash receipt can be entered or applied. "No, not really" the customer answered. Hmmm, that seems unlikely.
About 30 seconds later, I ask them which column in the data file is the Dynamics GP customer ID. It turns out that the customer ID is not in the data file--only the customer name. Sometimes the customer name is the same as GP, sometimes it's similar, and sometimes it is completely different. The accounting staff manually looks up each customer name in GP to determine which GP customer sent the payment. I don't know about you, but I would call that a giant 600,000 line exception. Okay, so we can set up an easy method to map the customer name in the data file to the GP customers. Not a problem, but note that this has nothing to do with eConnect.
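That mapping layer is plain .NET code with nothing eConnect-specific in it. A minimal sketch follows; the literal dictionary and the normalization rules are my own hypothetical stand-ins--in practice the map would live in a SQL table that the accounting staff can maintain.

```csharp
using System;
using System.Collections.Generic;

static class CustomerMap
{
    // Hypothetical name-to-GP-customer-ID map. In production this would
    // be a SQL lookup table maintained by the accounting staff.
    static readonly Dictionary<string, string> map =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "ACME INC", "ACME001" },
            { "CONTOSO LTD", "CONTOSO001" }
        };

    // Normalize the trading partner's customer name, then look it up.
    // Unmatched names come back as null so the caller can route the
    // record to an exception file instead of importing it blindly.
    public static string Resolve(string rawName)
    {
        string key = rawName.Trim().ToUpperInvariant()
            .Replace(",", "").Replace(".", "");
        string gpCustomerId;
        return map.TryGetValue(key, out gpCustomerId) ? gpCustomerId : null;
    }
}
```

The unmatched-name path is the important part: every name the map can't resolve is an exception to report, not a record to guess at.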
I asked about the additional worksheet in Excel where some of the data had been separated. "Oh, that's a different type of cash receipt, we handle those separately." So now we have regular cash receipts, and special cash receipts. Another exception, to be handled well before eConnect gets involved.
And what about the payment applications? Any issues with those? "No, those are pretty standard." But, the first cash receipt we look at in GP is not applied to an invoice, because there is no invoice available for that customer. Another exception. And then we find an overpayment. Another exception. And then an underpayment. Another exception. After discussing overpayments and underpayments, they explain that they want debit and credit memos automatically created, and then applied to the customer's credit or debit balance. But only if the difference is within a certain threshold, such as under $100. If the difference is greater than $100, it should appear on an exception report requiring a review. Makes total sense, and mimics their manual processes. But eConnect is still only peripheral at this point.
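The over/underpayment rule above is, again, pure business logic that runs well before any eConnect call. Here is one way it might be sketched, using the $100 threshold from their manual process; the type names and structure are mine, not the client's.

```csharp
using System;

enum PaymentAction { ApplyInFull, CreateCreditMemo, CreateDebitMemo, ExceptionReport }

static class PaymentRules
{
    // Decide what to do with a cash receipt based on the difference
    // between the payment amount and the open invoice amount.
    public static PaymentAction Classify(decimal payment, decimal invoiceAmount,
                                         decimal threshold)
    {
        decimal difference = payment - invoiceAmount;
        if (difference == 0m) return PaymentAction.ApplyInFull;
        // Differences beyond the threshold require a human review.
        if (Math.Abs(difference) > threshold) return PaymentAction.ExceptionReport;
        // Small overpayments become credit memos; underpayments, debit memos.
        return difference > 0m ? PaymentAction.CreateCreditMemo
                               : PaymentAction.CreateDebitMemo;
    }
}
```

A dozen lines, but multiply the decision by 600,000 receipts a month and it is the heart of the integration.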
The requirements and exception list goes on, and I now have a good idea of what needs to be done. And you know what? The LEAST of my concerns is eConnect. It's dealing with the 150MB data file and the thousands of exceptions that the integration is going to need to handle during the import.
And when an exception or error occurs, I'll need to create an exception file so that they can deal with just the 1,000 records with errors instead of swimming through the massive 600,000 line data file. And don't forget reporting--they will need a report or log listing how the cash receipts were imported and the invoices to which they were applied.
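The exception file itself is conceptually simple: preserve the original source line for every record that fails, along with the reason, so the rejects can be fixed and re-run without touching the giant original file. A sketch, with a caller-supplied validation delegate standing in for the real validation rules:

```csharp
using System;
using System.Collections.Generic;

static class ExceptionFileBuilder
{
    // Split source lines into importable records and rejects. Each
    // reject keeps its original line plus an error message, so the
    // client can correct and re-submit just the exception file.
    public static void Split(
        IEnumerable<string> sourceLines,
        Func<string, string> validate,   // returns null if OK, else an error message
        out List<string> good,
        out List<string> exceptions)
    {
        good = new List<string>();
        exceptions = new List<string>();
        foreach (string line in sourceLines)
        {
            string error = validate(line);
            if (error == null) good.Add(line);
            else exceptions.Add(line + "\t" + error);  // original line + reason
        }
    }
}
```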
It's a fantastic project, and I can't wait to help them turn a 3 day ordeal into a simple 15 minute process. But roughly 90% of my work will be wrestling with the source data, while only 10% will be working directly with eConnect.
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional in Los Angeles. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
Friday, May 20, 2011
Wednesday, May 18, 2011
Dynamics GP .NET eConnect Developer Wanted
I hear and read news reports saying that the US economy is still in bad shape, but I am having a hard time reconciling that news with the continuous projects that I've been frantically working on for the last 6 months. And when I've reached out to a few GP .NET developers that I know to see if they can help with a few projects, they too have typically been fully booked, working late nights just to keep up.
So, if you are a Dynamics GP .NET eConnect and VS Tools developer, or if you know of one, I'd love to chat.
I'm looking to have additional resources available for those times when 5 clients suddenly want work done immediately, or when I'm swamped and a new project pops up out of the blue.
Ideally, I'm looking for developers located in North America who are independent contractors.
Candidates should be proficient with Visual Studio, C#, and VB, as well as eConnect, Visual Studio Tools for GP, and SQL Server. A strong understanding of Dynamics GP functionality, distributions, GL accounts, and debits and credits is desirable as well.
I can be reached at steveendow@gmail.com.
And a One, And a Two, SQL Report Security
One of the most valuable (and yet underutilized) benefits of Microsoft Dynamics GP is the standard SQL Reporting Services (SSRS) reports that are available. These reports are valuable for a number of reasons including:
• Delivery via web to non-GP users
• Easy export capability to Excel, Word
• Ability to modify existing reports and access additional information
• Ad-hoc reporting capabilities using standard Report Models
• Subscription capabilities within SQL Reporting Services (delivers the report to your inbox!)
I know I have other posts dedicated to my love of SSRS, so I will spare you another love letter for now. But I do want to chat a bit about SQL Reporting Services security.
It seems like Dynamics GP users fall into three main categories when it comes to security:
1. Everyone sees everything: Most common when the finance department is small and there is not a lot of delineation of duties. So often in GP, all users are POWERUSERS.
2. Limited access for convenience only: Maybe some security is defined, but only to help simplify the user experience. From a reporting perspective, there are no concerns about “protecting” information. This is usually the case when there is a delineation of duties, but due to overlap and backups, everyone knows everything anyway.
3. Limited access out of necessity: Security is defined not only to simplify the user experience, but also to protect data from unauthorized access. This is very common when payroll and human resources are included in the system, or in larger organizations with multiple offices/companies/entities to consider.
In the first two scenarios above, SQL Reporting security is not much of a concern. But in the third scenario, we really need to be measured in our approach to ensure data is appropriately secured. I think, sometimes, it is assumed that customers fall into #1 or #2 because of our understanding of their GP user base. But we need to be careful and consider the user base for SQL Reporting, and ensure that the customer understands its security model as well.
SQL Reporting security can be broadly categorized into three areas:
• Site Security
• Item Security
• Data Security
Site Security is defined in Report Manager (generally http://servername/reports) using the Site Settings link. This is where you define who has permissions to manage and administer site level settings. Users do not have to be defined in Site Security to access and use SQL Reports. They only need this level of access to allow them to manage aspects of the site like system properties and schedules.
Item Security is also defined in Report Manager. This controls access to the folders and reports on the site. It can be accessed by choosing Properties or Security from the drop down list that appears when you click on an item, or by using the Folder Settings button. By default, when a Windows user or group is assigned to an item in Report Manager, all items below inherit the access. For example, if a user is granted access to a folder, the same user also has access to the reports in the folder. This parent-level security inheritance can be broken item by item by accessing security for the individual item. For example, if the user shouldn’t have access to one report within the folder, you can access security for that report and delete the user (therefore “breaking” the parent-level security inheritance).
Okay, so here is where it gets confusing :) Item level security controls access to the report. HOWEVER, being able to get to the report is NOT the same as being able to generate the report. This is where data security comes into play. This can take on a lot of angles, depending on how you have configured the data source for your report. But let’s assume that you have configured it to use Windows credentials (that are not stored in the data source).
Data Security is defined in SQL Server Management Studio. This security controls the user’s ability to access the tables, stored procedures, and views necessary to generate reports. Keep in mind that although we are defining security for use with SQL Reporting Services, the data level security you define also allows users to access the data through Excel, Crystal Reports, Access, or another ODBC data source. For this reason, it is important that Data Security be as restrictive as your requirements necessitate to prevent unauthorized access to data. Simply restricting access at the Item Level only limits the user’s ability to access the data using that particular medium. If Data Security gives them access to the underlying data, they could access it with Excel, Access, etc. even if they can’t access the SQL Report itself.
To configure Data Security in SQL Server Management Studio, the user must first be set up under Security>>Logins using their Windows login. Then, using the User Mapping tab in User Properties, you must map the user to the appropriate company databases and the corresponding database roles (to grant access to the necessary tables, views, and stored procedures to generate the reports).
Fortunately, Microsoft Dynamics GP comes with a number of predefined roles that begin with “rpt” specifically for use with the SQL reports. These roles have access to specific tables, views, and stored procedures. And the access is restricted so that data cannot be updated, only selected/viewed. Remember, whatever access you give them at the Data level can be used with Excel, Access, etc. So if you use roles outside of the standard “rpt” roles provided by Dynamics GP, it is important to ensure that the roles do not grant additional permissions that you would not normally allow (e.g., updating tables, accessing additional tables, etc.).
When planning security for your SQL reports, it is important to consider all three levels of security. There are resources on CustomerSource to assist with this process including:
Frequently Asked Questions About SQL Reporting Services and Dynamics GP
Download the SQL Reporting Services Administration Guide (last updated for GP 10), which includes additional information on security
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.
Implementation Trauma AGAIN!!
So, continuing on my theme, I'm back with number two on our list of sins from a few weeks ago. Let's take a bit closer look at how these can sink an implementation from the start. As a refresher, number two on the list for consultants was "Forgetting that customer service is important even during the heat of an implementation" and the companion to this for customers is "Approaching the consulting team as adversaries instead of partners."
Both of these may seem obvious when we see them in writing, but in the stress of an implementation, these items are often forgotten. And when that happens, a positive experience can quickly turn sour. First, let's look at the customer service sin. Consider the following scenario:
Ralph is a good consultant. He has put in long hours on the implementation, and has worked hard to keep it on time and under budget. He has also been diligent in training and documenting every step of the way. So when he receives a call from the client asking the same question, yet again, he lets his frustration slip through. Maybe he lets the voicemail linger a bit longer before returning it. Or maybe he emails back a reference to the documentation or the training, without actually answering the question. Many of you reading this might wonder, so what? Don't customers need to take responsibility for their learning? Sure they do. And, yet, we have to continue to approach customers with good faith throughout the process. This means assuming that they are asking because THEY DON'T KNOW THE ANSWER. Not because they are lazy, or they weren't paying attention, or even that they aren't smart (we all have witnessed this consulting sin: speaking to users like they aren't intelligent). Yes, there may be a larger issue to be addressed regarding user investment in the process, and yes, users who are too dependent on consultants can create budget issues in the long run. But, from a customer service perspective, we have to handle these issues professionally. Answer the question, and escalate the risk to your project manager or sponsor.
On a side note, whenever I find myself frustrated with these sorts of things--I remind myself of every consulting horror story I have been told by customers. You know the ones. Where you can't believe that the customer who you love working with was treated so badly. It tends to snap me back into the customer service mindset pretty quickly.
Another variation on this sin is when the consultant becomes too familiar with the client, and their interaction becomes so casual that it ignores the fact that customer service is involved. This most often manifests itself as "griping". About the project. About the software. About (oh, my) fellow consultants and/or client team members. Yes, you might become friends with clients over the years, but you can never forget that within the confines of a project they are still your customer and deserve the service and discretion that comes along with that. After all, they are paying for that :)
So, let's flip over to the customer side for awhile. The sin of cultivating an adversarial relationship with the consultant is one that also often intensifies towards the end of an implementation. The customer's staff is tired and minor bumps become more significant. Consider the following scenario:
ABC Company has been through a number of software implementations in the past few years. And none of them have gone very well. They like their current consultant personally, but the customer feels that consulting is not about sharing knowledge or customer service--it's about billing. A lot. So they want to get what they can from the consultants, but they really don't trust them. And over the course of the project, there have been a few minor bumps that have arisen (new requirements, additional costs, etc) that have just reinforced this perception. So, with the go live in sight, the stress level is high and the customer feels alone in the process with no one to trust.
As a customer, you need to pick consultants you can see yourself trusting when that trust is earned. If you feel "scarred" by prior experiences, examine why the other projects failed. Take those lessons into your new project, and share those concerns with your consultants and ask them for their help in mitigating them. You want to form a partnership built on this trust; after all, you are relying on these people to guide you in configuring your system and business processes. Why would you pick someone you don't trust? And let them into your business's nerve center? Along those same lines, as I discussed on the consulting side, you need to approach the process with good faith whenever possible. This means being honest about your concerns with the consulting team, and assisting them in the identification and management of risks. Acknowledge that all projects encounter changes; what matters is how these changes are uncovered and how they are addressed. And this can be done best when there is a partnership between the customer and consultants.
As a consultant, the projects that are approached in this way are the most rewarding to work on and the ones that I am most proud of professionally and personally. And, I have to say, most GOOD consultants take a lot of personal pride in their work and in happy customers! So find one of them to work with :)
A bit long winded tonight, but I hope you find the ramblings useful :)
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.
Fun eConnect Bug: Check those field lengths!
Today I deployed an updated version of an AR Invoice eConnect integration. The customer has used the integration previously, and it worked fine, but when we ran it with a new batch of data, the Edit List report indicated that some transactions had missing distributions or incorrect distribution amounts.
Puzzled, I checked the transactions and their distributions in Dynamics GP, and everything looked fine. But when I ran the Edit List again, the same errors were listed--except for any transactions that I had opened and viewed.
Hmmmmm.
So I queried the RM Distribution Work table, RM10101, to see what was going on.
For the one transaction that I had opened and viewed in GP, I saw four distribution lines instead of the two that should be present. I was pretty sure I didn't import four lines, but I wasn't sure where the other two came from.
After looking closely at the four records, I noticed one discrepancy: the DOCNUMBR value for two of the lines differed from the other two distribution lines. One pair had the full document number that was imported, but the other had a truncated document number, missing the last 4 characters. And the account indexes for the two sets of distributions were different.
And that is when it dawned on me.
Back in 2008, I wrote this blog post discussing how eConnect handles character values that exceed the length of the eConnect / GP field. Normally, it truncates character data values without any warning or error and proceeds happily, leaving you to discover later that some of your data is missing.
Well, in this case, the client had an invalid AR invoice number that was 20 characters long, due to a change in data formatting. eConnect happily imported the invoice, truncating the invoice number to 17 characters.
But, it appears that there is a minor 'bug' in eConnect 10, as the AR distribution imported the full 20 characters into RM10101. No truncation!
So, in the RM10301 Sales Work table, we have a 17-character invoice number, and in the RM10101 Distribution Work table, we have a 20-character invoice number. Naturally, when the GP edit list tries to match invoices to distributions, it can't match the two numbers, so it appears as if the invoice does not have any distributions at all. And when I opened the invoice distribution window, everything looked fine because GP was defaulting distributions based on the customer account settings: when the distribution window is opened, GP creates new distribution records using the customer's default accounts.
In short, the distributions that were imported with the full 20-character invoice number are "orphaned". They are not associated with any invoice, and even if you delete the invoices, the distributions will not be deleted. They will have to be cleaned out manually or via Check Links.
So, the lesson here is that you shouldn't rely on eConnect to manage the length of your character field values. Ideally, you should include field length checks in your integration validation code to detect data values that may exceed a GP field length.
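As a rough illustration of that kind of pre-flight check, here is a minimal Python sketch. The field names and maximum lengths in the dictionary are illustrative assumptions, not authoritative GP schema values--verify the actual column widths in your GP tables and the eConnect documentation before relying on them.

```python
# Hypothetical pre-flight length check to run before building an eConnect
# document.  The field names and max lengths below are example values only;
# confirm the real column widths in your GP company database.
GP_FIELD_LENGTHS = {
    "DOCNUMBR": 17,   # assumed eConnect limit, per the truncation above
    "CUSTNMBR": 15,
    "BACHNUMB": 15,
}

def check_field_lengths(record):
    """Return a list of warnings for values that would be truncated."""
    warnings = []
    for field, value in record.items():
        max_len = GP_FIELD_LENGTHS.get(field)
        if max_len is not None and len(str(value)) > max_len:
            warnings.append(
                "%s value '%s' is %d chars; GP will truncate to %d"
                % (field, value, len(str(value)), max_len))
    return warnings

# A 20-character invoice number like the one in this post gets flagged
# before it ever reaches eConnect.
invoice = {"DOCNUMBR": "INV-2011-0000012345X", "CUSTNMBR": "AARONFIT0001"}
for warning in check_field_lengths(invoice):
    print(warning)
```

Rejecting (or at least logging) these rows up front is far cheaper than untangling orphaned distribution records after the fact.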
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional in Los Angeles. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
New Dynamics GP 2010 Reporting Book and Packt Books Dynamics Month!
Congratulations to Christopher Liley and David Duncan for publishing their new book, Dynamics GP 2010 Reporting.
Reporting in Dynamics GP is not exactly the most user-friendly experience, and it is often overlooked or given insufficient attention. As a result, there are dozens of powerful reporting features that most Dynamics GP customers never use. It's great to see that there is now a book dedicated to the topic.
Celebrating the publication of their third Dynamics GP book and fourteenth Dynamics book, Packt has also declared May 2011 "Dynamics Month", so make sure to stop by their site and take advantage of the discounts that they are offering.
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional in Los Angeles. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
Tuesday, May 17, 2011
One Very Hairy Dynamics GP Integration
I develop a lot of custom Microsoft Dynamics GP integrations. A few are simple and straightforward. The majority have some quirks that require a few tricks or workarounds in .NET. And a few, well, they remind me that my imagination definitely has limits when it comes to envisioning the complexities of certain business processes and related integrations.
I've been working on one particular integration over the course of a few months that has been vastly more challenging and complex than I could have ever possibly imagined. Believe it or not, it's a Purchasing Receipt import.
I know, I can hear it now: "Purchasing Receipt? You kidding? How hard could that possibly be?"
Let's start with the source data file. It's an Excel file, which is fine, but it is formatted as a report, with a report header in the first three rows (merged cells) and column headers on row 5. But there are only 7 column headers for 11 columns of data. Fine, I can read the file without headers, no problem.
And then the fields. There are only 3 fields related to the receipt transaction in GP: PO Number, Item Number, and Quantity.
Okay, so I have to look up the PO number to find the vendor ID, no biggie. And sure, I can use the vendor ID and item number to look up the vendor item number. Not so bad.
Except that some of the current vendor item numbers don't match the vendor item number on the PO. Sometimes the PO shows the vendor item number, sometimes it's the item number, and sometimes it's something else entirely--perhaps an older vendor item number. Oh, and sometimes even the item number won't match.
Okay... so I can add some logic that will figure out which PO line is being received by checking the vendor item number, then the item number, and then, believe it or not, the item description.
Whew, all done, right? Not so fast.
For some reason, many of the POs have the same item on multiple lines. Not 2 or 3 lines--we're talking 10 lines or more. So I can't just send the PO number and item number to eConnect--I have to know which PO line I'm receiving against. Oooookay, I now have a routine that can figure that out.
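The matching cascade might look something like this minimal sketch. This is illustrative pseudologic in Python, not the actual integration code; the PO lines are plain dictionaries and the field names are assumptions for the example.

```python
# A rough sketch of the PO line matching cascade described above.
# PO lines are plain dicts; the field names are illustrative assumptions.
def find_po_line(po_lines, vendor_item, item_number, description):
    """Match a receipt row to a PO line by trying the vendor item number,
    then the item number, then the item description.  Among candidates,
    prefer a line that still has unreceived quantity."""
    for key, value in (("vendor_item", vendor_item),
                       ("item_number", item_number),
                       ("description", description)):
        matches = [line for line in po_lines if line[key] == value]
        if matches:
            # Prefer the first matching line with quantity left to receive.
            open_lines = [line for line in matches
                          if line["qty_received"] < line["qty_ordered"]]
            return (open_lines or matches)[0]
    return None  # no match at all: flag the row for manual review
```

The key design point is the ordered fallback: each matching attempt is only tried if the previous, more reliable one fails, and a duplicate-item PO is disambiguated by preferring a line that is still open.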
Almost done, right? Nope.
Did I mention that the receipt quantities sometimes exceed the PO quantities, and those excess quantities need to be received too, in some situations? Well, eConnect allows you to receive excess quantities if a PO line is not fully received, but if the line has already been fully received by a prior receipt, eConnect will refuse to receive additional quantities on that PO line. So if you have a PO line with a quantity of 20, and 20 have been received, but you need to receive 4 more, what do you do?
Well, I learned that eConnect will not allow you to adjust the quantity of a PO line that has been received. And it will also apparently not allow you to add a new line to an existing PO. So that just leaves...creating a new PO to receive the excess quantity.
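That decision can be reduced to a tiny routine. This is a hedged sketch of the logic as described above, not actual eConnect code--the return structure and names are mine, invented for illustration.

```python
def plan_receipt(qty_ordered, qty_previously_received, qty_to_receive):
    """Decide where a receipt quantity can go, per the eConnect behavior
    described above: an over-receipt is accepted against a PO line that is
    not yet fully received, but rejected on a fully received line."""
    remaining = qty_ordered - qty_previously_received
    if remaining > 0:
        # Line still open: receive the full quantity against it, even if
        # it exceeds the remaining amount.
        return {"existing_line": qty_to_receive, "new_po": 0}
    # Line fully received: the only remaining option is a new PO.
    return {"existing_line": 0, "new_po": qty_to_receive}
```

So for the example above (a 20-quantity line with 20 already received and 4 more arriving), the entire 4 has to go onto a newly created PO.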
You still with me?
So now that I've created the new PO, I need to pair it up with the receipt line. And then I need to write the new PO number back to the Excel file so that it can be traced or researched if necessary.
Oh, did I mention that there are POs that were created before the inventory item records were set up in GP? So not only do some PO lines have an "incorrect" item type status, but the corresponding vendor item records are not yet set up. So I need to "fix" those PO lines and create the vendor item records.
And the list actually goes on and on from there--seriously, there really is a lot more, but I assume you get the point. What at first seemed like a relatively simple endeavor ended up having requirements that have left me in amazement. Nearly every issue that could exist with the data and the receipt process did exist.
I will never look at purchasing receipts the same way again.
Which leads me to the general topic of designing and estimating custom eConnect integrations, which I'll discuss in another post: "The 90/10 Rule: 90 Percent Source Data, 10 Percent eConnect".
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional in Los Angeles. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
Friday, May 13, 2011
The Ultimate Microsoft Dynamics GP Search Tool - Part 1: Introducing Search Master
If you ask several Dynamics GP users what their top concerns, wish list items, or frustrations are with Dynamics GP, I would bet that "searching" or "search" would not be a top item, and may not even be on the list. Maybe the general concept of search in Dynamics GP would be included, such as the ability to produce a report showing specific data, or being able to "find" certain data. But based on my experience, I speculate that "search" is not a common topic or feature request among many Dynamics GP customers.
A little over a year ago, I was chatting with a friend who implements a hosted web-based "SaaS" ERP package that competes with Dynamics GP. He asked me if Dynamics GP had a "global search" feature. I replied, "A what?"
Honestly, at the time, the concept was foreign to me. I tried to think about what that would look like in Dynamics GP, but had a hard time envisioning what value it might provide. I thought that if a GP user wanted to find a vendor, they could use an inquiry window or SmartList or some other existing window or feature to find that vendor. I didn't think that search was necessarily a "problem" that needed to be solved in Dynamics GP.
My friend then gave me a demo of his ERP package and showed me some examples of how the global search feature worked. Only when I saw the global search feature did it finally dawn on me how powerful and useful such a feature could be in Dynamics GP, or in any ERP application.
I then spoke with my colleague Andrew Dean at Envisage Software, who developed the fantastic Post Master for Dynamics GP product. When I presented the concept to him, he instantly saw the potential and began working on designing a new product.
After many months of research, hard work, and testing, we are proud to present Search Master for Dynamics GP, which we believe is the ultimate search tool for Microsoft Dynamics GP.
Search Master is a .NET client application that allows a GP user to search for any value, in any Dynamics GP company database, and then instantly displays the results. No waiting, no browser windows, no long running queries. And there is no need to tell Search Master whether you are searching for a vendor, an inventory item, a customer, or a sales transaction--it will instantly find every Dynamics GP record that contains the specified keywords.
Results are grouped by database and by category, allowing the user to quickly find the data they are looking for. With a single click on any search result, Search Master will open the appropriate Dynamics GP inquiry or edit window and take the user directly to the record.
Rather than continuing to discuss it, I'll point you to our brief video demonstrating how Search Master works and the power of an instant, global search tool for Dynamics GP. The video window below is pretty small, so I recommend clicking on the video or on the link below to view it in a larger window on YouTube.
We believe that Search Master will redefine how you interact with Dynamics GP and how you think about accessing your Dynamics GP data.
Rather than clicking on a menu, opening a window, switching to a Navigation Pane, using SmartLists or performing a lookup to find a record, you can instead use Search Master to instantly find any record, transaction, or data in Dynamics GP, and then jump directly to the record in GP with a single click. And since Search Master can find and display data in any Dynamics GP company database, users no longer need to worry about what company they are currently using in GP--search results can include records from all company databases. All instantly.
Several years ago, I had a Dynamics GP client with 22 company databases. Because they had a centralized accounting department, the accounting staff was constantly having to switch between GP companies to enter and look for data. If the AP department received a phone call from a vendor, they sometimes didn't know which of the 22 companies the vendor was associated with--it could be one, or it could be 5--so finding a specific invoice or check was a time-consuming chore, all while the vendor was on the phone waiting. I was able to create some multi-company SmartLists, but it was a lot of work, required numerous custom views and maintenance scripts, and required the user to navigate to SmartLists, click several times to add a filter to find what they were looking for, and then potentially switch GP companies to locate the vendor transactions.
Search Master solves this challenge completely. An AP clerk can now type just a few letters of the vendor name and instantly find the vendor and every transaction related to that vendor, even if they exist in multiple company databases. With a single click, Search Master will switch to the appropriate company database and display the desired information.
Besides eliminating navigation and mouse clicks, one of the most enjoyable features of Search Master is that it returns results instantly. As soon as you type a word or query, you see the results. There is no "Submit" button, there are no web pages to load, there are no time-consuming queries, and there is zero waiting for search results. Searching is performed in real time without any delays. Think about how often Dynamics GP users use menus, navigation panes, lists, SmartLists, inquiries, navigation buttons, and lookups to find specific data--and then think about eliminating all of that wasted time.
Andrew Dean and I will be at the Envisage Software Solutions booth at the Decisions Spring 2011 Virtual Conference Dynamics GP Day on June 15th to present Search Master, perform demos, and answer any questions. Please make sure to sign up for the conference and visit our booth!
If you think that Search Master would be valuable for your company or one of your Dynamics GP clients, please contact us. I can be reached at steve@precipioservices.com, or via telephone at (949) 735-4640, or you can visit the Search Master web site.
In Part 2 of this series, I'll discuss Search Master's powerful query syntax that allows you to search for much more than just basic keywords.
Friday, May 6, 2011
How do you back up your Hyper-V virtual machines?
By Steve Endow
NOTE: This is an old post, and I have since purchased Veeam Backup & Replication for Hyper-V. It is so much more effective, reliable, and feature-filled than the process below that I would strongly recommend not wasting your time on scripting, and instead invest the time in evaluating Hyper-V backup solutions.
Here is my evaluation of two Hyper-V backup solutions:
http://dynamicsgpland.blogspot.com/2015/02/how-do-you-backup-your-hyper-v-virtual.html
/////////////////////////////////////
Backups are one of the least glamorous IT tasks, but fortunately, improvements in technology and software and the abundance of inexpensive disk space have made them a lot easier.
I currently use SugarSync to back up all of my frequently used files, such as client files and zipped source code; it also syncs those files between my laptop, desktop, and iPhone. I also use Carbonite as an additional measure to back up all of those same files, plus all of my large files, such as Outlook PST files, database backups, and MP3 files.
But those backups seem trivial and quaint compared to the challenge of backing up virtual machines. Despite my occasional efforts to shrink the VHD files, a quick glance tells me that most of my VHD files are in the 20GB - 30GB range. While I have enough disk space for those files, backing up such large files is still a time-consuming and space-hogging endeavor. Copying such a large file to a backup drive or server takes quite a long time, and if you want to maintain multiple backups, the copies take up a lot of disk space.
I'm assuming there are some fancy backup solutions that can back up Hyper-V VMs (I've never bothered to check), but when it comes to backups, I avoid fancy and prefer to stick with simple, basic, trusted tools.
In the past, I've manually saved or shut down my VMs and then manually used WinRAR to create a RAR backup file on my NAS storage device. This worked, but it was manual and took forever. I would try to do the backups in the evening before going to bed, but I never did them consistently or on any set schedule. I've heard that the Volume Shadow Copy Service (VSS) and even Windows Backup can supposedly back up running Hyper-V VMs, but I'm not a fan of any version of Windows Backup, and I am fine taking the additional step of saving my VMs prior to backup to be on the safe side.
After switching over to 7-Zip and seeing how much better the compression could be, I finally wrote some batch files to automatically zip the VHDs to 7z files with a date/time stamp. You can use WinRAR or WinZip if you prefer, but given how rarely I need to recover a VHD backup, I want the best compression I can get. While this 7z batch file worked well, I still had to manually save or shut down the virtual machine.
Only recently did I finally take the time to track down the VBS scripts that would automatically save my VMs. With this final piece, I have automated the backups of my virtual machines. It's not glamorous, and I am sure that there are "better" approaches, but this is my first pass: it is simple and was fairly easy to do. I know that PowerShell offers a lot of promise for this type of task, but seriously, have you seen PowerShell scripts? It may look like C, but the last thing I have time for is learning yet another proprietary scripting language and yet another object model that I won't use regularly, so I've ruled out PowerShell for the time being.
So here are my current backup scripts and steps, likely subject to change as I use them more and perhaps learn of other techniques. I'm interested in hearing how others deal with this challenge. At the moment, I speculate that many businesses do not have any strategy or solution for regularly backing up their virtual machines.
Step 1: Use a VBS script to save the VM. Unfortunately, it seems that VBS only has the ability to Stop, Start, or Save Hyper-V VMs, and does not have the ability to "shut down" the guest OS like PowerShell does. But in my experience, Save is adequate for backing up Hyper-V VHD files. I found this script on some web site and modified it slightly to meet my needs--I just use a fixed VM name rather than accepting a command line parameter like the original script. In these examples, I'm backing up my VM called "GPDEV1".
Step 2: Once the VM is saved, it can be backed up. To save space on my backup drive, I prefer to zip the VHD files. My NAS device is relatively slow, so the time difference of copy vs. zip + copy is negligible for me.
This script will save the 7z file with a date prefix. If you want to keep multiple backups, such as several monthly backups, this one is handy. The "V:" drive is my mapped NAS drive.
FOR /F "TOKENS=1* DELIMS= " %%A IN ('DATE/T') DO SET XDate=%%B
For /f "tokens=2-4 delims=/ " %%a in ('date /t') do (set XDate=%%c-%%a-%%b)
7z a -t7z "V:\GPDEV1\%XDate% GPDEV1.7z" "E:\HyperVServers\GPDEV1\GPDEV1.vhd"
This script just saves the 7z file with the same name every time. I currently use this for my weekly backups.
7z a -t7z "V:\GPDEV1\GPDEV1 Weekly Backup.7z" "E:\HyperVServers\GPDEV1\GPDEV1.vhd"
Step 3: Create a Windows Task Scheduler task to save the VMs. Note that Task Scheduler does not appear to run VBS files directly, so you will need to create a batch file that calls the CScript engine. SaveGPDEV1.vbs is the file containing the VBS script listed above.
Note that you must include the full path to your VBS file in your batch file, otherwise the command will fail if run by Task Scheduler (since it will assume a path of C:\Windows\System32\ for the vbs).
Step 4: As a second step in your new task, call the backup batch file that zips and/or copies the VHD file to your backup drive.
I currently have a weekly backup job that backs up a few VMs on Wednesday night at 11pm, and then another job that backs up a few more VMs on Friday night at 11pm. Those 7z files retain the same name and are updated each week.
I'll probably add a few more tasks that perform monthly backups with a date stamp in the file name.
This backup approach won't win any awards for cutting edge technology, but it was simple enough for me to pull together with minimal time investment, and I'm pretty confident in its reliability.
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
NOTE: This is an old post, and I have since purchased Veeam Backup & Replication for Hyper-V. It is so much more effective, reliable, and feature-filled than the process below that I would strongly recommend not wasting your time on scripting; instead, invest the time in evaluating Hyper-V backup solutions.
Here is my evaluation of two Hyper-V backup solutions:
http://dynamicsgpland.blogspot.com/2015/02/how-do-you-backup-your-hyper-v-virtual.html
/////////////////////////////////////
Backups are one of the least glamorous IT tasks, but fortunately improvements in technology and software, along with the abundance of inexpensive disk space, have made them a lot easier.
I currently use SugarSync to backup all of my frequently used files, such as client files and zipped source code, which also syncs those files between my laptop, desktop, and iPhone. I also use Carbonite as an additional measure to back up all of those same files, plus all of my large files, such as Outlook PST files, database backups, and MP3 files, etc.
But those backups seem trivial and quaint compared to the challenge of backing up virtual machines. Despite my occasional effort to shrink the VHD files, a quick glance tells me that most of my VHD files are in the 20GB - 30GB range. While I have enough disk space for those files, backing up such large files is still a time-consuming and space-hogging endeavor. Copying such a large file to a backup drive or server takes quite a long time, and if you want to maintain multiple backups, the copies take up a lot of disk space.
I'm assuming there are some fancy backup solutions that can back up Hyper-V VMs (I've never bothered to check), but when it comes to backups, I avoid fancy, and prefer instead to stick with simple, basic, trusted tools.
In the past, I've manually saved or shut down my VMs and then manually used WinRAR to create a RAR backup file on my NAS storage device. This worked, but it was manual and took forever. I would try to do the backups in the evening before going to bed, but never did them consistently or on any set schedule. I've heard that Volume Shadow Copy Service (VSS) and even Windows Backup can supposedly back up running Hyper-V VMs, but I'm not a fan of any of the versions of Windows Backup, and am fine taking the additional step of saving my VMs prior to backup to be on the safe side.
After switching over to 7-Zip and seeing how much better the compression could be, I finally wrote some batch files to automatically zip the VHDs to 7z files with a date/time stamp. You can use WinRAR or WinZip if you prefer, but given how rarely I need to recover a VHD backup, I want the best compression I can get. While this 7z batch file worked well, I still had to manually save or shut down the virtual machine.
Only recently did I finally take the time to track down the VBS scripts that would automatically save my VMs. With this final piece, I have automated the backups of my virtual machines. It's not glamorous, and I am sure that there are "better" approaches, but this is my first pass; it is simple and was fairly easy to do. I know that PowerShell offers a lot of promise for this type of task, but seriously, have you seen PowerShell scripts? It may look like C, but the last thing I have time to do is learn yet another proprietary scripting language and yet another object model that I won't use regularly, so I've ruled out PowerShell for the time being.
So here are my current backup scripts and steps, likely subject to change as I use them more and perhaps learn of other techniques. I'm interested in hearing how others deal with this challenge. At the moment, I suspect that many businesses do not have any backup strategy or solution for regularly backing up their virtual machines.
Step 1: Use a VBS script to save the VM. Unfortunately, it seems that VBS only has the ability to Stop, Start, or Save Hyper-V VMs, and does not have the ability to "Shut down" the guest OS like PowerShell does. But in my experience, Save is adequate for backing up Hyper-V VHD files. I found this script on some web site and modified it slightly to meet my needs--I just use a fixed VM name rather than accepting a command line parameter like the original script. In these examples, I'm backing up my VM called "GPDEV1".
Option Explicit
Dim targetVM, WMIService, VMs

' VM to save; change this to match the VM name shown in Hyper-V Manager
targetVM = "GPDEV1"

' Require cscript.exe so the progress dots print to the console
If Right(LCase(WScript.FullName), 11) <> "cscript.exe" Then
    WScript.Echo "Use cscript.exe"
    WScript.Quit
End If

' Connect to the Hyper-V WMI provider and look up the VM by name
Set WMIService = GetObject("winmgmts:\\.\root\virtualization")
Set VMs = WMIService.ExecQuery("SELECT * FROM Msvm_ComputerSystem WHERE ElementName='" & targetVM & "'")

Select Case VMs.ItemIndex(0).EnabledState
    Case 2  ' 2 = running
        WScript.StdOut.Write targetVM & " is saving."
        VMs.ItemIndex(0).RequestStateChange(32769)  ' 32769 = saved state
        ' Poll once per second until the VM reports the saved state
        Do While VMs.ItemIndex(0).EnabledState <> 32769
            Set VMs = WMIService.ExecQuery("SELECT * FROM Msvm_ComputerSystem WHERE ElementName='" & targetVM & "'")
            WScript.StdOut.Write(".")
            WScript.Sleep 1000
        Loop
        WScript.Echo "Completed."
    Case Else
        WScript.Echo targetVM & " is not running."
End Select
Step 2: Once the VM is saved, it can be backed up. To save space on my backup drive, I prefer to zip the VHD files. My NAS device is relatively slow, so the time difference of copy vs. zip + copy is negligible for me.
This script will save the 7z file with a date prefix. If you want to keep multiple backups, such as several monthly backups, this one is handy. The "V:" drive is my mapped NAS drive.
REM Strip the day-of-week prefix from DATE /T (e.g. "Tue 03/15/2011" -> "03/15/2011")
FOR /F "TOKENS=1* DELIMS= " %%A IN ('DATE /T') DO SET XDate=%%B
REM Reformat as YYYY-MM-DD (assumes a US MM/DD/YYYY short date format; this line sets the final value)
FOR /F "tokens=2-4 delims=/ " %%a IN ('DATE /T') DO (SET XDate=%%c-%%a-%%b)
7z a -t7z "V:\GPDEV1\%XDate% GPDEV1.7z" "E:\HyperVServers\GPDEV1\GPDEV1.vhd"
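As an aside, the batch date parsing above depends on the regional short-date format. For anyone who prefers a locale-independent way to build the same date-stamped archive name, here is a rough sketch in Python (the `backup_command` helper is hypothetical; the paths are just the ones from my example, and the command is built rather than executed):

```python
import datetime

def backup_command(vm_name, vhd_path, backup_root, day=None):
    """Build the 7-Zip command line that archives a VHD with a YYYY-MM-DD prefix."""
    day = day or datetime.date.today()
    archive = f"{backup_root}\\{vm_name}\\{day:%Y-%m-%d} {vm_name}.7z"
    return ["7z", "a", "-t7z", archive, vhd_path]

# For example, on 2011-03-15 this builds:
#   7z a -t7z "V:\GPDEV1\2011-03-15 GPDEV1.7z" "E:\HyperVServers\GPDEV1\GPDEV1.vhd"
cmd = backup_command("GPDEV1", r"E:\HyperVServers\GPDEV1\GPDEV1.vhd", "V:",
                     datetime.date(2011, 3, 15))
```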
This script just saves the 7z file with the same name every time. I currently use this for my weekly backups.
7z a -t7z "V:\GPDEV1\GPDEV1 Weekly Backup.7z" "E:\HyperVServers\GPDEV1\GPDEV1.vhd"
Step 3: Create a Windows Task Scheduler task to save the VMs. Note that Task Scheduler does not appear to run VBS files directly, so you will need to create a batch file that calls the CScript engine. SaveGPDEV1.vbs is the file containing the VBS script listed above.
cscript C:\Scripts\SaveGPDEV1.vbs
Note that you must include the full path to your VBS file in your batch file, otherwise the command will fail if run by Task Scheduler (since it will assume a path of C:\Windows\System32\ for the vbs).
Step 4: As a second step in your new task, call the backup batch file that zips and/or copies the VHD file to your backup drive.
I currently have a weekly backup job that backs up a few VMs on Wednesday night at 11pm, and then another job that backs up a few more VMs on Friday night at 11pm. Those 7z files retain the same name and are updated each week.
I'll probably add a few more tasks that perform monthly backups with a date stamp in the file name.
This backup approach won't win any awards for cutting-edge technology, but it was simple enough for me to pull together with minimal time investment, and I'm pretty confident in its reliability.
Steve Endow is a Dynamics GP Certified Trainer and Dynamics GP Certified Professional. He is also the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.
http://www.precipioservices.com
Payroll and Tips (Run Screaming Away, NOW!)
Okay, so I rank payroll and tips up there with sales tax and analytical accounting on my list of "not so favorite" things to deal with. But I did learn an interesting tidbit about how tips are impacted by tax-sheltered (TSA) deductions in Dynamics GP. Take the following example:
Adam Smith (Setup)
- Pay Code- HOUR
  - $15/hr
- Pay Code- TIPS
  - Reported Tips
- Deduction- HEALTH
  - $10/fixed amount per pay period
  - Tax Sheltered from Federal and FICA
- Gross Pay
  - HOUR- $150
  - TIPS- $50
- Deductions
  - HEALTH- $10
Now, here is where it gets interesting. Let's say that when the deduction (HEALTH) was set up (Setup>>Payroll>>Deduction), it was left as the default Based on Pay Codes- All (noted in the screenshot below).
Here is what happens when you review a payroll summary for the employee:
- Gross Pay $200
- Federal Wages $142.50
- Reported Tips $50
- Federal Tips $47.50
Huh? Well, because you said that the deduction is based on all pay codes (including the TIPS code), GP prorates the tax-sheltered deduction across all pay codes (wages and tips). In this example, tips accounted for 25% of the gross and wages for 75%, so the $10 deduction was allocated 25% ($2.50) to reduce the taxable tips and 75% ($7.50) to reduce the taxable wages.
Easy enough to change: if you don't want the deduction to reduce tips, simply change the Based on Pay Codes setting for the deduction to Selected, and insert all pay codes except the tips pay codes. Voila! It will only prorate across the included pay codes.
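The proration described above is simple arithmetic: each pay code absorbs a share of the tax-sheltered deduction proportional to its share of gross pay. Here is a rough sketch of that calculation (a hypothetical illustration, not GP's actual code), using the numbers from the example:

```python
def prorate_deduction(pay_codes, deduction):
    """Split a tax-sheltered deduction across pay codes by their share of gross pay."""
    gross = sum(pay_codes.values())
    return {code: round(deduction * amount / gross, 2)
            for code, amount in pay_codes.items()}

pay = {"HOUR": 150.00, "TIPS": 50.00}      # gross = $200
allocated = prorate_deduction(pay, 10.00)  # HOUR gets $7.50, TIPS gets $2.50

# Each pay code's taxable amount is reduced by its allocated share
federal_wages = pay["HOUR"] - allocated["HOUR"]  # 150.00 - 7.50 = 142.50
federal_tips = pay["TIPS"] - allocated["TIPS"]   # 50.00 - 2.50 = 47.50
```

Those results match the payroll summary: Federal Wages of $142.50 and Federal Tips of $47.50.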
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers.
Implementation Trauma Continued
I promised a couple of weeks ago that I would dive deeper into my top five implementation "sins" for both customers and consultants. I am a bit behind, obviously, but let's take a look at #1 on both lists:
For consultants: Assuming you are the sole reason for the success/failure of the project
For customers: Assuming that the consulting team is the sole reason for the success/failure of the project
What is wrong with taking sole responsibility for a project? Anyone? Anyone? Well, there are a number of issues with this approach from both the consultant's and the customer's perspective. But let's start with a story about Suzy. Suzy is a fabulous consultant, and the customer loves her. Suzy takes care of them, completing many complicated tasks on her own without their input. She even completed much of the configuration without their assistance; after all, she knows their business so well. In many ways, the system seems to magically configure and take care of itself because Suzy is so efficient. In this way, Suzy has treated the implementation as if it were "her system", fixing and configuring it as needed.
Although Suzy should be applauded for cultivating such a great relationship with the client, we have to question the wisdom of Suzy taking full responsibility for the implementation-- making decisions, completing configuration tasks, and resolving issues with little to no customer input. And, in this case, the customer loves her for it. However, does this best serve the customer and the implementation in the long run? Absolutely not, and here's why:
- As much as the customer loves Suzy, she is not their employee. She does not work in the business on a daily basis, nor will she be around after the implementation is complete. This inherently limits her knowledge of their business, her control over the situation, and her ability to motivate adoption within the ranks. Customer investment (in terms of resources) is critical to the project's success. In the end, the system belongs to the customer, not to Suzy.
- Suzy is taking responsibilities and tasks away from customer resources, and reducing their opportunity to become involved in the implementation. If customer resources are given only limited opportunities to participate in the decisions and tasks related to the implementation, a significant opportunity to create excitement about the project is lost. Without internal advocacy, projects fail.
- With all that Suzy takes care of, when she leaves (as consultants always do) the system can become "difficult" and "confusing" if customer resources have not been almost constantly involved in the decision making and configuration process. From day one, customers must cultivate their own "experts", learning as much as they can from Suzy before she leaves. If Suzy made all the decisions and fixed all the problems, no one is left with that knowledge or understanding once she moves on to another project. As much as a consultant brings expertise in the software (and hopefully, the industry) to the project, it is the customer's perspective that ensures that decisions and processes are owned and supported by the appropriate internal personnel.
As consultants, we need to ensure that the customer is in a position to take responsibility for the success of the implementation with our assistance. We need to give them the tools, the knowledge, and the resources to do this well. For this reason, I take the title "consultant" seriously. I like to think we are not hired to complete a list of specific tasks, but rather to "consult" the customer in how best to complete that list.
As customers, we want a system that works for us, that our employees feel is beneficial, and that grows with us over time. To achieve these goals, we must invest early and intensely (in resources, time, and effort) in the success of the project. To trust any single person, no matter how qualified or competent, is to limit the opportunity for success. Find your trusted consultants and let them "consult" you on how to reach these goals.
More on #2 on the lists in the coming days :)
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a supervising consultant with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers.