Friday, December 12, 2014

Executing SQL DDL scripts from a Dynamics GP VS Tools AddIn

By Steve Endow

I've developed many Dynamics GP Visual Studio Tools AddIns that depend on one or more SQL scripts.

Simple single-line SQL statements are easy to include in your code, but what about more complex queries?  If I have large queries that are more than a few lines, I typically try to push those out to stored procedures.  Stored procs are easier to maintain outside of code, and calling a stored proc from a VS Tools AddIn is cleaner and simpler than maintaining a large query in Visual Studio.
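For example, calling a stored procedure through plain ADO.NET only takes a few lines (a minimal sketch, assuming the usual System.Data / System.Data.SqlClient usings; the gpConnection object and the csp_CreateBatch procedure name are hypothetical placeholders, not part of the actual AddIn):

//Minimal sketch of calling a stored procedure from an AddIn
//(gpConnection and csp_CreateBatch are hypothetical placeholders)
using (SqlCommand cmd = new SqlCommand("csp_CreateBatch", gpConnection))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@BACHNUMB", SqlDbType.VarChar, 15).Value = "MYBATCH001";
    cmd.ExecuteNonQuery();
}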

Okay, so that may be fairly obvious, but how do you deploy those stored procedures?  Previously, I've included them as SQL scripts that had to be manually executed as part of the deployment process.  While that approach is simple and low tech, it's easy to forget a script or two if you have a complex deployment.

And what about Data Definition Language (DDL) scripts?  What if you need a custom table for your customization or integration, or several?  What if you don't want to have the client manually run 10 SQL scripts as part of your AddIn deployment?  And what if your AddIn needs those tables and objects to be set up in every Dynamics GP company database?  And what if the client has 80 company databases?  Not a fun deployment.



After all, the great thing about AddIns is that the ideal deployment can be as simple as copying a single DLL file to the GP AddIns subdirectory.  If you have to manually run SQL scripts, you start to lose that benefit.

This week I am working on a new VS Tools AddIn that will have at least 2 custom tables.  I was tired of dealing with SQL scripts as part of my deployment, so I was determined to fully automate all SQL scripts.  I want my AddIn DLL to automatically detect whether the required stored procedures and tables are present, and if not, automatically create them.  I've done this once in the past, but it was messy and a nightmare to maintain.  I wanted a better, cleaner method that was easy to maintain.  Fortunately, I found one.

I first wrote this simple method to check if a table exists:

public bool PSV10000Exists()
{
    //Check if table exists
    string commandText = "SELECT COUNT(*) AS TablePresent FROM DYNAMICS.INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'PSV10000'";
    SqlParameter[] sqlParameters = null;

    string result = DataAccess.ExecuteScalar(CommandType.Text, commandText, sqlParameters);

    if (result == "1")
    {
        return true;
    }
    else
    {
        return false;
    }
}


You could abstract this to be "TableExists" and have the method accept a database name and table name as parameters, but this was my prototype.
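For example, a generalized version might look something like this (just a sketch built on the prototype above; it assumes the same DataAccess.ExecuteScalar helper and its string return value, and note that a database name cannot be passed as a SQL parameter, so it is concatenated into the query):

public bool TableExists(string databaseName, string tableName)
{
    //Check the target database's INFORMATION_SCHEMA view for the table
    //(the database name can't be parameterized, so it is concatenated in)
    string commandText = "SELECT COUNT(*) AS TablePresent FROM [" + databaseName + "].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @TableName";

    SqlParameter[] sqlParameters = new SqlParameter[1];
    sqlParameters[0] = new SqlParameter("@TableName", SqlDbType.VarChar, 128);
    sqlParameters[0].Value = tableName;

    string result = DataAccess.ExecuteScalar(CommandType.Text, commandText, sqlParameters);

    return result == "1";
}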

Great, so now you know whether the table exists.  What if it doesn't?

Well, then you just execute the DDL script to create the table, right?

Nope.  Not that easy.

ADO.NET is great for straight queries or individual, discrete DDL commands, but if you hand it a script that contains several "GO" commands, it can't parse them and will throw an error.  That's because GO isn't actually a T-SQL statement--it's a batch separator recognized by tools like SQL Server Management Studio and sqlcmd, so SQL Server itself rejects it.
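For example, something like this will fail (a hedged illustration; the ddlScriptWithGoCommands and gpConnection variables are placeholders):

//This fails: SqlCommand sends the whole script as a single batch,
//and SQL Server does not understand the GO batch separator
SqlCommand cmd = new SqlCommand(ddlScriptWithGoCommands, gpConnection);
cmd.ExecuteNonQuery();  //throws a SqlException, typically "Incorrect syntax near 'GO'"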

My table creation script has several commands, each followed by a GO command.

CREATE TABLE
CREATE NONCLUSTERED INDEX
ALTER TABLE ADD CONSTRAINT (several of these)
GRANT


So how do you run a multi-step script with numerous GO commands?

It turns out that there is a different "ExecuteNonQuery" method in .NET, tucked away in the SQL Server Management Objects (SMO) assemblies.  SMO allows you to execute complex multi-step SQL scripts, including DDL scripts with multiple GO commands.

To use SMO, you first need to add four SMO references to your project.  The DLLs are available at (the 120 folder corresponds to SQL Server 2014, so adjust for your SQL Server version):

C:\Program Files\Microsoft SQL Server\120\SDK\Assemblies\

And you'll need to add these references:

Microsoft.SqlServer.ConnectionInfo.dll
Microsoft.SqlServer.Management.Sdk.Sfc.dll
Microsoft.SqlServer.Smo.dll
Microsoft.SqlServer.SqlEnum.dll

Once you have those references set up, you'll need to add these using statements to your DataAccess class.

using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;


With that taken care of, you can create a simple method to call the SMO version of ExecuteNonQuery.  Here is mine, which relies on the Connection property of my DataAccess class to get a connection to the GP SQL server via GPConnNet.

public static int ExecuteDDLScript(string commandText)
{
    SqlConnection gpConn = null;
    try
    {
        gpConn = Connection;  //Get connection from GPConnNet

        //SMO's ServerConnection can execute scripts containing GO batch separators
        Server server = new Server(new ServerConnection(gpConn));
        int result = server.ConnectionContext.ExecuteNonQuery(commandText);
        return result;
    }
    catch (Exception)
    {
        throw;  //Rethrow without resetting the stack trace
    }
    finally
    {
        if (gpConn != null)
        {
            gpConn.Close();
        }
    }
}


So now you have all of the plumbing to execute a DDL script.  Now what?

Let's start by getting our DDL script into Visual Studio so that our code can access it.  But where do you store a 20, 30, or 50 line SQL script?

I am a huge fan of the Visual Studio Resources feature, which is a fantastic place for storing and managing large SQL scripts.


You define a resource name, paste in your entire SQL script, and add some comments if desired.  I like to note the version number of the scripts, as I often have to refine them during development.

Once you have your Resources set up, you can then put it all together in a simple method that actually executes the script.

I decided to create a new SQLObjects class, separate from my DataAccess class.

In my SQLObjects class, I created this new method.

public bool CreatePSV10000()
{
    string commandText = Properties.Resources.sqlCreatePSV10000;
    int result = DataAccess.ExecuteDDLScript(commandText);

    if (result == -13)
    {
        return true;
    }
    else
    {
        return false;
    }
}

Look at how simple that is!  It's essentially two lines to create your table if it doesn't exist.


No manual SQL scripts required, and your AddIn can do all of the work automatically, greatly simplifying your deployment.
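Putting the pieces together, the AddIn can check for the table at startup and create it on the fly.  Here is a rough sketch (it assumes the existence check lives in the same SQLObjects class, and the MessageBox call is just a placeholder for whatever error handling your AddIn uses):

//On AddIn startup: create the custom table if it isn't already there
SQLObjects sqlObjects = new SQLObjects();

if (!sqlObjects.PSV10000Exists())
{
    if (!sqlObjects.CreatePSV10000())
    {
        //Placeholder error handling--log or display however your AddIn prefers
        MessageBox.Show("Failed to create the PSV10000 table");
    }
}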

You may have noticed the -13 value, which is a bit odd.  In my testing, that appears to be the integer returned by the SMO ExecuteNonQuery method when the script is run successfully.  I haven't yet been able to find a list of the return codes for the SMO ExecuteNonQuery method, so I'm still researching to see if -13 is an unconditional success response, and what other values I need to look out for.

Now that I have this method setup, I look forward to never having to include SQL scripts in my deployment guides again!


UPDATE:  I looked again at the documentation for the SMO ExecuteNonQuery--specifically the ServerConnection.ExecuteNonQuery method.  Regarding the integer return value, it says:

The return value specifies the total number of rows affected by the Transact-SQL command for UPDATE, INSERT, and DELETE statements. For all other types of statements, the return value is -1.

Since I am running a DDL script, the return value should be -1, whereas I was getting -13.  Then I realized that my DDL script consisted of numerous statements separated by the GO command, which made me wonder if I happened to have 13 statements in my script.  I did a test where I removed all of the GO commands--my script still worked fine, and the return value was -1.  When I added 4 GO commands, the return value was -4.

So I learned why I got a return value of -13, and I also happened to learn that the GO commands in a DDL script generated by SQL Server Management Studio are not actually required.
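Given that behavior, relying on a hard-coded -13 seems fragile.  One sturdier option (just a suggestion, reusing the existence check from earlier in this post) is to ignore the batch count and simply verify that the table exists after the script runs:

public bool CreatePSV10000()
{
    string commandText = Properties.Resources.sqlCreatePSV10000;
    DataAccess.ExecuteDDLScript(commandText);

    //Rather than interpreting the batch count returned by SMO,
    //confirm that the table actually exists now
    return PSV10000Exists();
}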


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter






