Friday, May 18, 2018

Get Next Dynamics GP SOP Transaction Number Using SQL

By Steve Endow

As I mentioned in a recent post, sometimes it's better to use SQL to retrieve the next document number from Dynamics GP.

I am currently working on an eConnect SOP Order import and needed to retrieve the next order number from GP.

Here is my C# code for retrieving the next SOP number from GP using SQL.



The SQL stored procedure is called taGetSopNumber and you need to provide it with the SOP Type and the Doc Type ID of your desired transaction.


public string GetNextSOPNumber(int sopType, string docTypeID)
{
    //SOP Type:  
    //1 = Quote
    //2 = Order
    //3 = Invoice
    //4 = Return
    //5 = Back Order
    //6 = Fulfillment Order

    try
    {
        SqlConnection gpConn = ConnectionGP();

        string commandText = "taGetSopNumber";

        SqlParameter[] sqlParameters = new SqlParameter[5];
        sqlParameters[0] = new SqlParameter("@I_tSOPTYPE", System.Data.SqlDbType.TinyInt);
        sqlParameters[0].Value = sopType;
        sqlParameters[1] = new SqlParameter("@I_cDOCID", System.Data.SqlDbType.Char, 15);
        sqlParameters[1].Value = docTypeID;
        sqlParameters[2] = new SqlParameter("@I_tInc_Dec", System.Data.SqlDbType.TinyInt);
        sqlParameters[2].Value = 1;
        sqlParameters[3] = new SqlParameter("@O_vSopNumber", System.Data.SqlDbType.Char, 21);
        sqlParameters[3].Value = string.Empty;
        sqlParameters[3].Direction = ParameterDirection.Output;
        sqlParameters[4] = new SqlParameter("@O_iErrorState", System.Data.SqlDbType.Int);
        sqlParameters[4].Direction = ParameterDirection.Output;
        sqlParameters[4].Value = 0;

        string nextNum = string.Empty;
        int recordCount = DataAccess.ExecuteNonQuery(gpConn, Controller.Instance.Model.GPDatabase, CommandType.StoredProcedure, commandText, sqlParameters);
        nextNum = sqlParameters[3].Value.ToString().Trim();

        return nextNum;
    }
    catch (Exception ex)
    {
        Log.Write("An unexpected error occurred in GetNextSOPNumber: " + ex.Message, true);
        return string.Empty;
    }
}
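
If you just want to test the procedure outside of .NET, here's a minimal T-SQL sketch that calls it directly. The parameter names are taken from the C# code above; the 'STDORD' Doc Type ID is just a hypothetical example--use one of your own SOP Type IDs:


DECLARE @SopNumber char(21), @ErrorState int;

EXEC dbo.taGetSopNumber
    @I_tSOPTYPE = 2,                     --2 = Order, per the SOP Type list above
    @I_cDOCID = 'STDORD',                --hypothetical Doc Type ID
    @I_tInc_Dec = 1,                     --1 = increment the next number
    @O_vSopNumber = @SopNumber OUTPUT,
    @O_iErrorState = @ErrorState OUTPUT;

SELECT @SopNumber AS NextSopNumber, @ErrorState AS ErrorState;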










Friday, May 4, 2018

Consulting is never boring!

By Steve Endow

Today I had to switch between several tasks, and during one of those task switches, my brain put on the brakes.


My brain:  This is crazy!

Me:  What is crazy?

Brain:  This!  This is crazy!  Switching from SQL queries to Dynamics GP VS Tools to an ASP.NET Core web API for Dynamics GP to working with multiple Azure services.  And that's just in the last hour!  It's nuts!

Me:  Uh, hello, we do this every day. So that you're not bored, remember?

Brain:  Dude, that doesn't make it any less crazy.

Me:  Noted.  I'll blog about it just to make you feel better.


If I actually stop for a moment, step back, and look at all of the things I do, all of the tools I use, and all of the things I have to know and understand to do my job, it is kinda crazy.

If you're modest, you might think that this is fairly normal, which in some respects it is--lots of people probably do what you do in the consulting world.  But if you want to appreciate how much you really know, try hiring a 20-year-old intern and giving them a few small projects.  You'll quickly realize that the "simple" task you gave the intern requires tons of fundamental knowledge that informs how to perform the task.  It probably took you years to develop that fundamental knowledge, and then many more years on top of that to develop competence or mastery.



Let's start with SQL.  In the Dynamics GP consulting world, a basic understanding of SQL Server and T-SQL is pretty much essential. 

"Hey intern, can you run this query?"

"What's a query?"

"It's a way to get data out of SQL Server."

"SQL Server?"

"Yes, SQL Server is a relational database."

"Relational?"

"Nevermind, just launch Management Studio and connect to the GP SQL instance"

"GP SQL instance?"


These steps may seem obvious, and are probably invisible to you if you've been doing this for years, but every single step requires an entire stack of fundamental skills to perform even a basic task.

So you need to know how to work with Management Studio.  How to connect to a SQL instance.  How to write some T-SQL.  It's good to understand SQL databases, tables, stored procedures, and views.  Maybe triggers and cursors if you're daring.  And how about backups and transaction logs and Recovery Model, just in case there's a problem?

If you're on the bleeding edge, you'll know how to back up SQL Server databases to Azure.  Which means you should be familiar with SQL jobs and Azure Storage and backup compression.  And Azure is an entire universe of knowledge.

But back to Dynamics GP.  How about SET files and dictionaries and chunk files and shared dictionaries and modified forms and reports and AddIns and Modifier & VBA?  And there's all the knowledge around GL, AP, AR, SOP, POP, and IV, not to mention the other ancillary modules like AA, PA, FA, MC, CM, IC, HR, UPR, and others.  You know that one checkbox under Tools -> Setup -> Posting -> Posting?  Ya, that one that affects whether transaction posting hits the GL?  Or what about that option in the SOP Type ID that affects inventory allocation and quantity overrides?  Or the hundreds of other options you kinda need to be aware of?

And naturally, since you're working with an accounting system, it's good to understand debits vs credits and income statement vs balance sheet and cash vs income vs expenses vs assets vs liabilities.  And if you're into reporting, there's the entire universe of standard reporting tools and financial reporting tools.

In my particular line of work, I also need to understand everything from Excel macros to VBA to VB Script to Integration Manager to eConnect to SmartConnect.  I need to know how to use .NET 2.5 through .NET Core 2.0 using Visual Studio 2010 through 2017.  I need to thoroughly understand IIS and Kestrel, TCP/IP, ports, firewalls, DNS, HTTPS, TLS, SSH, and nmap.  I need to know HMAC, AES, and SHA and have a fairly good understanding of encryption.

I need to be able to glance at XML and JSON and quickly find data issues.  I need to know what HTTP verbs and response codes mean, as well as what "idempotent" means (that's actually a word).  I need to understand TXT and CSV parsing and the issues related to using Excel files as data sources.  I need to be able to review thousands of entries in a log file and figure out why two identical requests were processed 3 milliseconds apart, and that's only after I figure out how to reliably log activity with millisecond precision.

I need to understand PCI compliance and how to call credit card gateway APIs for CC and ACH tokenization and transaction processing.  And then there's the TLS 1.2 upgrade saga--don't get me started on that one.

And while writing some complex queries years ago, I needed to figure out why they were taking hours to run.  So I had to give myself a crash course in SQL query optimization so that I didn't kill the SQL Server.  Which led to me developing a subspeciality in amateur SQL Server optimization, which can be quite challenging in the Dynamics GP world.  And if you're dealing with GP performance, it's helpful to understand virtualization and be familiar with Hyper-V and VMWare and how VM memory settings affect SQL Server.

And the list goes on and on.  It's a really, really long list of stuff you need to learn and know and understand and use on a regular basis.


It's kinda crazy.

But that's also why I like it. 









Wednesday, May 2, 2018

Get Next Dynamics GP RM Payment Number Using SQL

By Steve Endow

When you import transactions into Dynamics GP, you often need to get the next transaction number or document number from Dynamics GP.

In some simple cases, you can leave the document number field blank and let eConnect get the next number for you, but if you are sending in distributions or Analytical Accounting data for a transaction, you need to assign a document number to those elements before sending the transaction off to eConnect.

eConnect does have a method to generate the next number, but there's a big catch: it requires Windows authentication to connect to SQL and get the next document number.  This works for some situations where you will be using Windows Authentication for your integration, but I have many situations where only a SQL or GP login will be available.

In those cases, you can usually directly call the underlying eConnect stored procedures.  The problem with this approach is figuring out which stored procedure to call and how to call it.  You'd be surprised how challenging this can be, and every time I have to do it, I have to go find some old code because I can't seem to find the correct eConnect stored proc.

Case in point is the process for generating the next RM payment (cash receipt) number.  I looked and looked for an eConnect stored procedure with a taRM prefix, but couldn't find it.  Why?  Because it's inconsistently named "taGetPaymentNumber".  Ugh.


I couldn't find anything via Google on this, so, to document this lovely process, here is my C# code for getting the next RM Payment Number using the eConnect stored procedure.


        public static bool GetNextRMPaymentNumber(string gpDatabase, ref string nextPayment)
        {
            //eConnect method, which uses Windows auth
            //Microsoft.Dynamics.GP.eConnect.GetNextDocNumbers nextDoc = new Microsoft.Dynamics.GP.eConnect.GetNextDocNumbers();
            //string nextRMPayment = nextDoc.GetNextRMNumber(Microsoft.Dynamics.GP.eConnect.IncrementDecrement.Increment, Microsoft.Dynamics.GP.eConnect.RMPaymentType.RMPayments, ConnectionStringWindows(gpDatabase));
            //return nextRMPayment;

            //SQL method
            string commandText = "taGetPaymentNumber";

            SqlParameter[] sqlParameters = new SqlParameter[4];
            sqlParameters[0] = new SqlParameter("@I_vDOCTYPE", System.Data.SqlDbType.TinyInt);
            sqlParameters[0].Value = 9;  //9 = Payment
            sqlParameters[1] = new SqlParameter("@I_vInc_Dec", System.Data.SqlDbType.TinyInt);
            sqlParameters[1].Value = 1;  //1 = Increment
            sqlParameters[2] = new SqlParameter("@O_vDOCNumber", System.Data.SqlDbType.VarChar, 21);
            sqlParameters[2].Direction = ParameterDirection.InputOutput;
            sqlParameters[2].Value = string.Empty;
            sqlParameters[3] = new SqlParameter("@O_iErrorState", System.Data.SqlDbType.Int);
            sqlParameters[3].Direction = ParameterDirection.InputOutput;
            sqlParameters[3].Value = 0;

            int recordCount = DataAccess.ExecuteNonQuery(gpDatabase, CommandType.StoredProcedure, commandText, sqlParameters);

            if (int.Parse(sqlParameters[3].Value.ToString()) == 0)
            {
                if (sqlParameters[2].Value.ToString().Trim() != string.Empty)
                {
                    nextPayment = sqlParameters[2].Value.ToString().Trim();
                    return true;
                }
                else
                {
                    nextPayment = string.Empty;
                    return false;
                }
            }
            else
            {
                return false;
            }

        }


I prefer using this method, as it will work whether I am using Windows Auth or SQL auth.
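
For quick testing in Management Studio, here's a minimal T-SQL sketch that calls the same procedure directly, using the parameter names from the C# code above:


DECLARE @DocNumber varchar(21) = '';
DECLARE @ErrorState int = 0;

EXEC dbo.taGetPaymentNumber
    @I_vDOCTYPE = 9,                     --9 = Payment (cash receipt)
    @I_vInc_Dec = 1,                     --1 = Increment
    @O_vDOCNumber = @DocNumber OUTPUT,
    @O_iErrorState = @ErrorState OUTPUT;

SELECT @DocNumber AS NextPaymentNumber, @ErrorState AS ErrorState;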



Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+


Friday, April 20, 2018

The Challenge of Posting Dates with Automated Dynamics GP Imports

By Steve Endow

If you are familiar with Dynamics GP, you are likely familiar with the confusion that can be caused by the "Posting Date" feature.  Many customers have never even opened the expansion window in GP where the transaction Posting Date is displayed.


A customer calls and asks, "Why did my April invoice post to the GL in March? The invoice is clearly dated April 5!"

In addition to the confusion between Document Date and Posting Date, there is also the potential confusion caused by Transaction Posting Date vs. Batch Posting Date.


As if that isn't enough fun related to dates, things can get particularly interesting and challenging with automated integrations with Dynamics GP.

The issue typically comes up during month end.  If an April 15 invoice is posted to April 16, it is usually not an issue and nobody notices.  But if a March 31 invoice is posted to April 1, that can cause issues.

When a user is entering transactions manually in GP, they can review the invoice, know whether it should be posted to March or April, and set the posting date accordingly.  But when an automated integration is importing data, it usually doesn't know which fiscal period a transaction belongs to.  It has to rely upon a data field in the source data to tell it what the posting date should be.

That sounds easy enough, right?

Unfortunately, it isn't always easy.


Above is some sample data from an integration.  A single invoice date is provided in the DOCDATE column.  And a Batch ID of 2018-04-20 is provided, implying that the transactions are related to April 20.  From this information, you could reasonably assume that the transactions should post to the 2018-04 fiscal period.

But what about this sample data?


This morning a concerned and upset customer called me asking "Why did our April invoices post to March??"

The batch ID of "20180401" indicated that these were April invoices and not March invoices.  But as we know, Dynamics GP doesn't care about the batch ID when it comes to posting.  The only date that matters is the Posting Date.

"But we don't import a posting date with our invoices. Only the Invoice (document) date!", the customer responded.

Good point.  The source data only contained DOCDATE, and their SmartConnect map was only set up to import the invoice Document Date field.

So why did all of their invoices in the 20180401 batch get posted to March 31?

Well, as I mentioned above, you have to know whether GP is configured to post using the transaction posting date or the batch posting date.  And to keep things confusing, it is possible to configure some batches to post using the transaction posting date, and have other batches post using a batch posting date.

So using the sample data above, why did the 20180401 batch post to March 31?

When importing transactions using eConnect (or SmartConnect, etc.), if the Batch ID specified for the transaction does not exist, eConnect will create the batch automatically.  You don't need to specify additional options--it will just handle it for you.

And when your Dynamics GP batch type is set to use the Batch Posting Date, guess what eConnect uses as the default value for the Batch Posting Date?  The document date.

So in the above sample data, the first invoice that is imported has a Document Date of March 31.  So eConnect dutifully creates a new batch with a posting date of March 31.  It then imports the invoices into that batch.  And all of the invoices in that batch will post to March 31.  Even if the invoice date is April 1.
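
If you want to verify which posting date an eConnect-created batch ended up with, a quick query of the batch header table will show it.  This is just a sketch from memory--I'm assuming the standard GP batch header table SY00500 and its GLPOSTDT (posting date) column:


SELECT BACHNUMB, BCHSOURC, GLPOSTDT
FROM SY00500
WHERE BACHNUMB = '20180401';   --the batch ID from the example above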

Okay, so the customer just needs to fix the March 31 dates, right?

Perhaps it is that simple.  Maybe there was a bug in their source data.

But what about invoices that are generated on April 1, but related to March?  What about a vendor invoice dated April 2 that is received from an external AP system on April 3, but was for a service performed in March?  An integration won't know the invoice should be posted to March--the source data would have to provide an additional clue, such as a separate Posting Date or Fiscal Period field.

I've only encountered a few customers who were able to supply that fiscal period field separate from the document date field.  In my experience, it is not common for a source system to know the fiscal period for a transaction--most only have a single transaction date.

So when designing a transaction import for Dynamics GP, make sure to consider what happens when transactions are dated the last day of the month or first day of the month, and whether transactions related to a prior fiscal period may show up in your source data.  It can be surprisingly tricky.


Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+







Monday, April 16, 2018

My First SQL Saturday event: It was amazing

By Steve Endow

The Microsoft SQL Server community is amazing.

Amazing.

That's not an exaggeration or platitude.

On Saturday, I attended my first "SQL Saturday" event in Orange County, California.  I left speechless.


Several hundred people attended the event at a local college.  On a Saturday.  I overheard one attendee say that she woke up at 4am and had a 2+ hour drive from San Diego to attend. Presenters flew in from all over the country to speak at the event, with several speakers facing a snow storm and flight cancellations trying to return home.  They did this, without compensation, on a Saturday.  And some were planning on attending up to 10 other SQL Saturday events across the country.

And I should mention that the event was free for attendees.  Completely free.

When I arrived at 8am, there was a line of 40 or 50 people waiting to check in.  There were lots of volunteers helping people check in, handing out tote bags, re-printing passes, setting up tables, and preparing the event.  Before the first session started, they had set up tables with gallons of free coffee, bagels, danishes, and donuts.

The event is organized by PASS, a non-profit organization that helps support people who use Microsoft data technologies.

Ten companies sponsored the SQL Saturday event, which has the following mission statement:

Our Mission
The PASS SQLSaturday program provides the tools and knowledge needed for groups and event leaders to organize and host a free day of training for SQL Server professionals. At the local event level, SQLSaturday events:
  • Encourage increased membership for the local user group
  • Provide local SQL Server professionals with excellent training and networking opportunities
  • Help develop, grow, and encourage new speakers
When I signed up, I didn't know what to expect.  I thought it might be a casual user-group style meeting with a few speakers.  But it was much more like a full-fledged, single-day, intense SQL Server conference.

Several of the speakers that I saw were simply amazing.

Here are the sessions that I attended:

  1. SQL Database and Query Design for Application Developers
  2. Azure Basics for the DBA
  3. PowerShell for the SQL DBA
  4. Spotlight on SQL Server by Quest Software (vendor presentation)
  5. Data Pages, Allocation Units, IAM Chains
  6. The Query Store and SQL Tuning
  7. Fundamentals That Will Improve Query Performance


The 6 educational sessions were incredible.  I felt I knew a fair amount about 3 of the topics, but still learned a ton in those sessions.  And the 3 sessions with topics that were new to me had so much valuable content that I was dizzy by the time the session ended.  For example, I learned how the data is structured inside of an 8K data page--down to the byte!  WHAT?!?!

I took pages of notes on my iPad during most of the sessions, as they were all offering real world knowledge, experience, anecdotes, and lessons about how to use different SQL Server features and tools.

It was 6 solid hours of high quality content presented by SQL Server experts.  It was intense, valuable learning, and I was tired at the end of the day.

It was amazing.

If you work with SQL Server and have an opportunity to attend a SQL Saturday event, I recommend it.


Steve Endow is a Microsoft MVP in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+




Monday, March 26, 2018

My Typical Password: Is a 25 character minimum passphrase policy possible?

By Steve Endow

If you haven't read my prior post about passwords, perhaps read that first:

https://dynamicsgpland.blogspot.com/2016/10/how-do-you-choose-your-passwords-and.html


My "Passphrase Generator" has been working great since then. It isn't perfect, but has been working well enough for me.

I thought I was doing all the "right" things by using my passphrase generator and using a password manager religiously.

Using words with a max length of 7 characters, 2 numbers, and 1 symbol, I have been generating passphrases like:

Briony%4Cobwebs4    (16 chars)
Hyped/5Umber1    (13 chars)
Reecho%6Touzled8    (16 chars)
Tisanes#4Tangles6    (17 chars)


I considered these pretty safe passwords.

But I recently started listening to Kevin Mitnick's book, The Art of Invisibility, on Audible. In that book, Mitnick recommends that people now use passphrases of at least 25 characters.

25 characters?!?!?  (interrobang)

That's crazy!

But is it?

Those of us who work with Dynamics GP regularly bemoan the 15 character limit for passwords, as many of our customers have encountered issues with this limit. A customer's IT security policy requires a minimum of 15 characters, and they eventually figure out that their 16+ character passwords don't work in Dynamics GP.


So obviously that rules out using 25+ character passwords for Dynamics GP.

But I'm pretty sure I've run into web sites that would not allow me to have anywhere near 25 characters.

There's only one way to find out.  So I reset my passwords on several web sites.  The numbers below aren't the sites' limits, just the lengths of the long passwords that I randomly generated for each site and successfully saved.

Twitter:  35 characters
Stack Overflow:  35 characters
GPUG.com:  38 characters
Atlassian/Bitbucket:  36 characters

Wow, moving right along!  It looks like a 25 character minimum password might be possible!

(play record scratch here)

Then I log in to my online banking web site.  Major bank.  Big bank.  Huge bank.  Not a relatively tiny web site like GPUG.

And what do I see?


20 character max!  What??

Strike one!

Hmmm, let's check another bank web site.  I log in to a smaller bank that I use, but when I try to change the password...and...

...it doesn't allow me to paste in the password from my Passphrase Generator!

That is garbage!

Troy Hunt lays out this entire stupid fake "security" policy of not allowing password pasting in his excellent blog post here.

And I see that he shows examples of GE Capital and PayPal and others.

So that pretty much kills the idea of consistently using 25+ character passwords.

Could I use really long passwords on sites that allow them, and that allow pasting?  Sure.  And I may start doing that.

But clearly there are many sites, particularly the large ones, that have indefensible password length limitations and block the paste function.  So for those, you're limited to their arbitrarily short password lengths.

So I guess that answers my question.

With that said, will I use 30+ character passwords?  Not sure.

Occasionally I have to manually enter the password on a mobile device, and it is a nightmare to try and type that many characters in a password field.  I can barely compose a simple text message on my phone without making a typo, so my password typing success rate is not stellar.

But I may give it a try.  As I reset my passwords going forward, I'll try and use a 25+ character passphrase and see how it goes.

Hopefully some day my bank will allow more than 20 characters.






Tuesday, March 13, 2018

Beware of MIN, MAX, and TOP in Dynamics GP SQL queries!

By Steve Endow

A few weeks ago I started some research to compare the performance of MAX vs. TOP(1) in SQL Server queries.

After finding some unexpected results, I created a second video showing some odd behavior of MAX and TOP on one particular Dynamics GP table.  At that time, I couldn't figure out what was causing the performance issue with the MAX function.

Well, thanks to some very generous help and amazing insight from Kendra Little, I finally have a definitive explanation for the performance issue.


Wednesday, March 7, 2018

SQL MAX vs TOP 1: Part 2! The Revenge of IV30500!

By Steve Endow

I just can't let it go.  I need to know.  I need answers.  I need to solve the mystery.  The riddle.  The enigma.

Why does the MAX function sometimes perform very poorly compared to TOP 1?

I really thought that "MAX vs TOP 1" was a simple question. An easy question.  A spend-10-seconds-on-Stack-Overflow-and-get-the-answer type of question.

But I just couldn't leave it alone and had to go and test it for myself.  And open a veritable Pandora's Box of SQL riddles.

In Part 1 of this series, I delved into how MAX and TOP 1 behave in several random queries, and I ended on a query that showed MAX performing quite poorly compared to TOP 1.

After that video, I stumbled across an even simpler query that produced an even more dramatic performance difference, where MAX performed miserably.  But I couldn't figure out why.

In this video, I discuss what I learned about the query and the specific Index Scan that is causing the MAX query to perform so poorly.


Here's the recap:

When querying some fields, such as the IV30500 POSTEDDT, with no WHERE clause, both MAX and TOP 1 perform virtually the same, with both using a very efficient Index Scan.  50% vs 50% relative costs.



But when I query the TRXSOURCE field, with no WHERE clause, MAX shows 100% relative cost, whereas TOP 1 shows 0% relative cost.
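
For reference, here's a rough sketch of the two comparisons:


--POSTEDDT: MAX and TOP 1 both use an efficient Index Scan (roughly 50/50 relative cost)
SELECT MAX(POSTEDDT) FROM IV30500;
SELECT TOP 1 POSTEDDT FROM IV30500 ORDER BY POSTEDDT DESC;

--TRXSOURCE: MAX shows 100% relative cost, TOP 1 shows 0%
SELECT MAX(TRXSOURCE) FROM IV30500;
SELECT TOP 1 TRXSOURCE FROM IV30500 ORDER BY TRXSOURCE DESC;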



What???

The POSTEDDT query uses an Index Scan.  The TRXSOURCE query uses an Index Scan.  But for some reason with the TRXSOURCE query, MAX is much more costly.

I stared at this result for a few hours trying to figure out why.  I eventually found this tiny little detail.


Notice that the Actual Number of Rows for the MAX Index Scan is 147,316.  That's every row in the table.

By contrast, the TOP 1 Index Scan has Actual Number of Rows = 1.

What is going on?

For some reason, the MAX is having to scan the ENTIRE AK1IV30500 index.  It isn't getting much benefit from the index.

But why?

Unfortunately, I don't yet know.

"Maybe SQL is caching the query execution plan?"

Apparently not.  I tried DBCC FREEPROCCACHE, and saw no change.


"Maybe your statistics are stale and need to be updated?"

Nope.  I tried UPDATE STATISTICS WITH FULLSCAN.  No change.


"C'mon Steve, clearly you need to re-index!"

Did that.  I tried DBREINDEX on the specific AK1IV30500 index, as well as the entire IV30500 table.  No change in the execution plan.
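
For reference, here's roughly what I ran (a sketch--adjust the table and index names for your own environment):


--Clear cached execution plans
DBCC FREEPROCCACHE;

--Rebuild statistics on the table
UPDATE STATISTICS dbo.IV30500 WITH FULLSCAN;

--Rebuild the specific index, then all indexes on the table
DBCC DBREINDEX ('IV30500', 'AK1IV30500');
DBCC DBREINDEX ('IV30500');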


I didn't find any standard maintenance task that changed the behavior of the MAX Index Scan.

As a last resort, I used the Import/Export wizard to export all of the data from the IV30500 table into a new table that I called IV30500REBUILD.  I then ran scripts to create all of the same indexes on the REBUILD table, making it identical to the original IV30500 table.

I then ran the MAX and TOP 1 queries on the new REBUILD table.

And like magic, the MAX Index Scan returned just one row.


Same table structure.

Same data.

Same indexes.

But the Index Scan on the new REBUILD table behaves properly.

So there is apparently something about my IV30500 table that is causing this problem, and rebuilding the entire table resolves it.  But rebuilding a table isn't exactly a typical SQL maintenance task, so it's not really a solution.

But this is way past my SQL skill level, so I don't yet know what conventional maintenance task might be able to achieve the same results.

I've asked for help from a true SQL expert, so I'm hoping that she will assist and help me figure out this mystery.









Saturday, March 3, 2018

"Can you add just one more little feature?": A Story About Software and Home Improvement

By Steve Endow


Wife: "Steve, can you install an exhaust fan in the small bathroom?"
Steve:  "Sure, hunny, no problem. I just ordered the fan and I'll call Sam to install it."

Customer:  "Can you add this simple little feature to our application?"
Developer:  "Sure, no problem.  I'll get right on that."

No big deal.


Sam: "Steve, I cut a hole in the bathroom ceiling for the fan, but something is strange. There's an extra layer of drywall on the ceiling, and it's not attached properly and it's sagging."
Steve: "Hmmm, that doesn't look right. Let's remove the extra drywall and see what the prior homeowner was covering up."



Customer: "So how's that new simple little feature coming along?"
Developer: "Well, I looked through the code, and the original developers didn't design the software to handle this feature, so it's going to require some redesign of the customization."

I think the project scope just changed.

Friday, March 2, 2018

Sample Dynamics GP eConnect XML for RM Apply (RMApplyType / taRMApply)

By Steve Endow

A customer asked for sample XML for the RMApplyType / taRMApply eConnect transaction type.  I couldn't find one handy during a search, so I had to cobble together some .NET code and generate the XML.

I'm wondering if there is an easier way to generate the sample eConnect XML.  In theory, eConnect Requester with the eConnect Outgoing Service can send certain XML documents to MSMQ, but that is a hassle to set up properly, and I don't know whether all transaction types are supported by eConnect Requester--such as RMApplyType.


So, here is a sample Dynamics GP eConnect XML document for RM Apply (RMApplyType / taRMApply):


<?xml version="1.0"?>
<eConnect xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <RMApplyType>
    <eConnectProcessInfo xsi:nil="true" />
    <taRequesterTrxDisabler_Items xsi:nil="true" />
    <taAnalyticsDistribution_Items xsi:nil="true" />
    <taRMApply>
      <APTODCNM>SALES100001</APTODCNM>
      <APFRDCNM>PYMNT100001</APFRDCNM>
      <APPTOAMT>123.45</APPTOAMT>
      <APFRDCTY>9</APFRDCTY>
      <APTODCTY>1</APTODCTY>
      <DISTKNAM>0</DISTKNAM>
      <APPLYDATE>2017-04-12</APPLYDATE>
      <GLPOSTDT>2017-04-12</GLPOSTDT>
    </taRMApply>
  </RMApplyType>
</eConnect>


Discount Taken, Apply Date, and GL Post Date are optional, and do not have to be assigned or included in the XML if they are not needed.

Here is the eConnect documentation on taRMApply.


Tuesday, February 27, 2018

I give away my source code to my customers

By Steve Endow

Dynamics GP partners and customers often hire me to develop custom Dynamics GP integrations, GP customizations, Visual Studio Tools AddIns for GP, or even custom web APIs for Dynamics GP.

I develop the solution, usually using .NET, then I prepare a deployment guide and deployment package that can be installed on the customer's servers.  The solution is tested, I fix some bugs and refine the solution, I prepare new deployments that get installed, and once everything looks good, the customer goes live.

Everybody's happy, and I'm all done, right?

Except for one critical piece.

Any guesses?

What about the source code?

"What about the source code?", you might reply.

I diligently check in my source code to Git, and the code is pushed to an online Git repository for safe keeping and accessibility.  And I have backups of backups of backups, both on site and off site.  Great.  So I'm all done, right?

Not really.

What if I disappear?  What if I win the lottery?  What if I decide that this whole modern civilization thing is overrated and go live off the grid?

Sunday, February 25, 2018

T-SQL: MAX vs. TOP 1 - Which is better??

By Steve Endow

If you need to get the largest value for a field, should you use MAX in your query?  Or should you use TOP 1 with ORDER BY?
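
In other words, which of these two forms should you reach for? Here's a generic sketch of the comparison, using the IV30500 POSTEDDT field as an example:


--Option 1: MAX aggregate
SELECT MAX(POSTEDDT) FROM IV30500;

--Option 2: TOP 1 with ORDER BY ... DESC
SELECT TOP 1 POSTEDDT FROM IV30500 ORDER BY POSTEDDT DESC;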



Which is better?  Which is faster?  Is that always true?

Do you think you know the answer?

Place your bets, and then check out my video below, where I compare MAX vs TOP 1 on several Dynamics GP tables.

The results may surprise you!



Did I miss anything or make any mistakes in my testing?  Are there other considerations when choosing between MAX vs. TOP 1?

How to improve Dynamics GP with a little bit of VBA

By Steve Endow

I've had a few Dynamics GP customers that purchase software from me every few years, and a few of them have mailed checks to my old mailing address from 4 years ago. How can this happen?

Well, the Dynamics GP Payables Transaction Entry window does not display the vendor Remit To address, so verifying the vendor address is not an obvious natural step in the invoice entry process.  Yes, there is a link to open the Vendor Address Maintenance window, but what if internal controls prevent the user who enters vendor invoices from editing vendor addresses?  The user would need to go through a separate process to verify the vendor address...for every invoice.  Not ideal.

How can VBA help?

In just a few minutes, VBA can be added to the Payables Transaction Entry window to check if the vendor has not had a transaction in over 60 days, prompt the user to verify the vendor address, and even open the Vendor Inquiry window to review the current Remit To address in Dynamics GP.

It's really easy!

Here's a video discussing the background and walking through the entire process of adding the VBA to Dynamics GP.




First, within Dynamics GP, you add the desired windows to Visual Basic by clicking on Tools -> Customize -> Add Current Window to Visual Basic.


After the window is added, I click on Add Fields to Visual Basic and then click on the Vendor ID field.

Since I am assuming that the user entering invoices will not have access to edit vendors or vendor addresses, I'm going to add the Vendor Inquiry window to Visual Basic and add 3 fields on the Inquiry window to VB:  Vendor ID, Address ID, and the Address ID Next button.


Once I have those windows and fields added to Visual Basic, I press CTRL+F11 to open the Dynamics GP Visual Basic Editor.  If you don't have access to the VB Editor, you may not be licensed to use it, or you may not have permissions--in which case, talk with your GP administrator or GP partner.


In the VB Editor window, I'll select the PayablesTransactionEntry window on the left, then select the VendorID field and the AfterUserChanged event.

I wrote the simple VBA code below to demonstrate how quickly and easily you can add valuable functionality to Dynamics GP to save users time and improve data entry.

The code finds the most recent document date for any vendor transaction in Dynamics GP, and if that date is over 60 days ago, it opens the Vendor Inquiry window and displays the vendor Remit To address for the user to review and verify.


Private Sub VendorID_AfterUserChanged()
       
    Dim strVendorID As String
    Dim strSQL As String
    Dim strLastDocDate As String
    Dim strRemitID As String
    Dim dtLastDocDate As Date
    Dim intDays As Long
    Dim msgResult As Integer
    Dim oConn As Object
    Dim rsResult As Object
   
    strVendorID = VendorID.Value
   
    'Find the most recent document date for the vendor
    strSQL = "SELECT COALESCE(MAX(DOCDATE), '1900-01-01') AS DOCDATE FROM PM00400 WHERE VENDORID = '" & strVendorID & "'"
   
    Set oConn = UserInfoGet.CreateADOConnection
    oConn.DefaultDatabase = UserInfoGet.IntercompanyID
    Set rsResult = oConn.Execute(strSQL)
   
    strLastDocDate = Trim(rsResult.Fields("DOCDATE").Value)
   
    rsResult.Close
   
    'Get the Remit To Address ID for the vendor
    strSQL = "SELECT VADCDTRO FROM PM00200 WHERE VENDORID = '" & strVendorID & "'"
    Set rsResult = oConn.Execute(strSQL)
   
    strRemitID = Trim(rsResult.Fields("VADCDTRO").Value)
   
    rsResult.Close
    oConn.Close
       
    dtLastDocDate = CDate(strLastDocDate)
   
    'If Doc Date is 1/1/1900, vendor has no transactions
    If dtLastDocDate = CDate("1900-01-01") Then
        Exit Sub
    Else
       
        intDays = DateDiff("d", dtLastDocDate, DateTime.Date)
       
        'If the last doc date is > X days ago, display a dialog
        If intDays > 60 Then
       
            msgResult = MsgBox("This vendor has not had a transaction since " & strLastDocDate & " (" & intDays & " days ago)." & vbNewLine & vbNewLine & "Please review the current vendor Remit To address and compare to the invoice address", vbOKOnly, "Verify Vendor Address")
            VendorInquiry.Open
            VendorInquiry.VendorID.Value = strVendorID
            VendorInquiry.Activate
            VendorInquiry.Show
           
            'Cycle the Address ID Next button until the Remit To Address ID is displayed
            While VendorInquiry.AddressID.Value <> strRemitID
                VendorInquiry.NextButtonWindowArea.Value = 1
            Wend
       
        End If
    End If


End Sub


In just a few minutes, you can have this customization running in Dynamics GP without any additional development tools.

If you have more complex requirements, you can easily add more advanced functionality using VBA.  If you prefer using a separate development tool, you could also develop this customization using .NET or Dexterity, but the appeal of VBA is its simplicity and ease of use.

So if you have some small problem or additional business requirement that you'd like to handle in Dynamics GP, VBA might come in handy.








Saturday, February 24, 2018

Convert DEX_ROW_TS to a local time zone using AT TIME ZONE

By Steve Endow

This week I attended another great webinar by Kendra Little of SQLWorkbooks.com.  (If you aren't familiar with Kendra, check out her free webinars and her excellent catalog of online courses.)

One neat thing that Kendra always seems to do in her webinars and courses is subtly use new(er) SQL Server features.  This week, she just happened to use the AT TIME ZONE statement in one of her queries.  As soon as I saw it, I knew I had to try it with Dynamics GP.

Dynamics GP doesn't have much time zone sensitive data, but one field that I am starting to rely on more frequently is the DEX_ROW_TS field, which is now present in several key GP tables.  This field stores a last updated date time stamp.

DEX_ROW_TS is a bit unique for GP for at least two reasons.  First, it's a rare time stamp field.  While GP has many date fields, those date fields normally have a time of 00:00:00.000--so it's just a date at midnight, with no timestamp.

The second unique thing about DEX_ROW_TS is that it stores the datetime in UTC.  So if you ever query the DEX_ROW_TS field, you need to remember that it isn't local time.

I previously wrote a post about DEX_ROW_TS and how to use some SQL date functions to convert the value to your local time zone, but that approach felt a bit like duct tape and twine, and I would have to look up the syntax every time to use it.

Enter the very cool SQL Server 2016 AT TIME ZONE function.  This function makes it very easy to assign a time zone to a datetime value, and then convert it to another time zone.

(I'm calling AT TIME ZONE a function for now because I haven't found a better name for it. It doesn't read like a typical function, but it acts like one, so Function is the best name I have so far. If you know of the proper technical name for it, let me know in the comments below.)
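
Here's a minimal sketch of how the conversion looks.  I'm using the Customer Master table RM00101 as an example, assuming it has the DEX_ROW_TS column, and converting to US Pacific time--substitute any time zone name from sys.time_zone_info:


SELECT CUSTNMBR,
    DEX_ROW_TS AS LastUpdatedUTC,
    DEX_ROW_TS AT TIME ZONE 'UTC' AT TIME ZONE 'Pacific Standard Time' AS LastUpdatedLocal
FROM RM00101;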