Thursday, September 29, 2016

Fix Dynamics GP scaling and font size issues on high DPI displays

My blog has moved!  I will no longer be posting to Dynamics GP Land; all new posts will be at the new blog.

By Steve Endow

Update: If you are also having issues using Remote Desktop with high DPI displays, check out this post by Nathan Hayden at BKD on a workaround for RDP apps:

If you have a relatively new Windows laptop with a high resolution display or if you have 4K monitors on a Windows desktop, you are likely familiar with the numerous software display issues related to these "high DPI" displays.

If you use a 4K monitor at full resolution, the fonts and text in Windows are microscopic.  To make text readable and buttons large enough to click, Windows uses "scaling", whereby it increases the size of text and graphics to make them readable and large enough to actually use.  (I'm sure there's a huge technical discussion about the details, but I don't care about any of that, I just want to get work done.)

So when you run Windows 10 with a high DPI display, it will typically recommend running at 200% scaling.  I prefer to use 175% to get more real estate on my displays.

This Scaling technique generally works well, and with many applications, you may not even realize that scaling is occurring...

Until you launch an application that isn't "high DPI aware" or high DPI compatible.  Then you'll see a complete mess. I use SugarSync, which does not handle scaling very well.  Notice the text on the left is very small compared to the folder names on the right.

It's annoying, but tolerable.

I thought that Dynamics GP 2015 displayed fine on my Surface Pro 4 when I first installed it, but I rarely use GP on my SP4, and when I launched it earlier this week, I saw this odd mix of font sizes.  A colleague told me the same thing--GP used to display properly on his Surface Pro 4, but recently the fonts went haywire.

Notice the menus at the top are very large, while the text on the left panes is tiny.

Similarly, SQL Server Management Studio 2014 is a hot mess on high DPI displays.  It was the SSMS 2014 issue that had me searching for a solution.  And I came across this brilliant post.

Clearly the author has a pretty good understanding of the high DPI settings in Windows and figured out how to change the scaling method on a per-application basis using a manifest file.

I don't understand any of the mechanics, but I followed his instructions to import the extra registry key, then saved the Ssms.exe.manifest file to my machine.  Like magic, SSMS 2014 displayed properly.  The bitmap scaling on my displays at 175% is a bit fuzzy, but I'm okay with that, since it makes the applications much more usable.  I can always try 150% or 200% scaling if the blurriness bothers me.

Since it worked for SQL Server Management Studio, I then wondered--could it work for Dynamics GP?

(Full credit goes to Gianluca Sartori for his article showing how to implement this fix--I didn't come up with any of this)

I created a Dynamics.exe.manifest file...and behold.  GP launched just fine and rendered "normally", without the funky font sizes.

On a roll, I made a manifest file for SugarSync, and was pleased to see that it fixed SugarSync's scaling as well!  SugarSync already had a manifest file, so I just edited it to add the additional settings from the SSMS 2014 sample.

I don't understand all of the settings in the manifest file, but it seems to be fairly generic and flexible, as I've now used the technique for three different applications.

If you want to give this a try, you do so at your own risk.  If you don't know how to edit the registry or fix things if your computer goes haywire doing this, you should "consult an expert" to assist you.

Save this text to a file called "Scaling.reg".

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\SideBySide]
"PreferExternalManifest"=dword:00000001

Once you have created the "reg" file, double click on it to import it into the Windows registry on the machine where Dynamics GP is installed.

Then save this text to a file called Dynamics.exe.manifest.

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0" xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
  <dependency><dependentAssembly>
    <assemblyIdentity type="win32" name="Microsoft.Windows.Common-Controls" version="6.0.0.0" processorArchitecture="*" publicKeyToken="6595b64144ccf1df" language="*"/>
  </dependentAssembly></dependency>
  <dependency><dependentAssembly>
    <assemblyIdentity type="win32" name="Microsoft.VC90.CRT" version="9.0.21022.8" processorArchitecture="amd64" publicKeyToken="1fc8b3b9a1e18e3b"/>
  </dependentAssembly></dependency>
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security><requestedPrivileges>
      <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
    </requestedPrivileges></security>
  </trustInfo>
  <asmv3:application>
    <asmv3:windowsSettings xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
      <ms_windowsSettings:dpiAware xmlns:ms_windowsSettings="http://schemas.microsoft.com/SMI/2005/WindowsSettings">false</ms_windowsSettings:dpiAware>
    </asmv3:windowsSettings>
  </asmv3:application>
</assembly>

Then copy your new Dynamics.exe.manifest file to the GP application directory where the Dynamics.exe file is located.

Give it a try and let me know if it works for you.  I'm curious if it fixes some, most, or all of the situations where GP is apparently displayed using small fonts.

UPDATE: It appears that GP 2013 may handle scaling differently than GP 2015.  With GP 2013, I'm seeing a completely different mess of font sizes, and the GP window fonts are extremely small.  I used the Dynamics.exe.manifest file and it now displays properly.


After adding the manifest file:

Huge difference.

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Google+ and Twitter

Copying Purchase Orders with Project Accounting Information

There are so many metaphors I could use right now, like...

When you are so thirsty, and you take a big old swig out of an old water bottle only to find it contains nothing but backwash...

When you wanted a specific thing for Christmas, and you unwrap your last present and think you got it (because of the box) only to find you got socks and underwear...

Such it is sometimes with features in Dynamics GP that don't apply to you if you use one of the "specialty" modules like Project Accounting, Human Resources, Manufacturing, etc.

This week I had a client who was so excited to discover that there is a copy feature for purchase orders (Actions >> Copy PO Lines to Current PO, Create and Copy New PO). 

They were SO excited. Until they tried to actually use it.  Because, see, they use project accounting.  So this is what they got...wah-wah....

If you agree that this should be changed, help me out by voting for the product suggestion on MS Connect! Help a gal out on this one!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, September 23, 2016

Year End Is Coming! I know, I know...

So this week has been a whirlwind.  I started the week in Fargo at Reimagine and then headed over to Indianapolis for the inaugural BKD Women's Empowerment Conference.  I have to say that it was one of the most inspiring and energizing weeks I have had in a long time.  Starting out at Reimagine, it is always fun to reconnect with old friends and share in all of the great things we are doing collectively to move clients forward with their software.  And then at #BKDWomenLead, I got to spend 2 days with the women of our firm learning how to build momentum towards our own goals.  All in all a great week, and I am reminded how fun my job is!

And then I get the email from Terry Heley, reminding me that year end is indeed coming.  I always feel like September rounds the corner into the downward slide that is year end.  So here's to being prepared!  Here are a few "heads up" tips as you look ahead:

  • Remember! The only supported versions for year end updates (1099s, W-2s, tax changes, etc.) are GP 2013, GP 2015, and GP 2016.  That's it.  If you are on GP 2010 or earlier, you need to get upgraded now.
  • As we learned at Reimagine earlier this week, GP 2016 R2 will drop in the 4th quarter.  So the year end update for clients already on GP 2016 will move them to GP 2016 R2 (so the update and year end are handled in one pass).

One of the most well-received pieces of news out of Reimagine earlier this week was the move back to an annual release cadence (instead of the more recent six-month cadence).  So that means after GP 2016 R2 we won't see GP 2017 until late in the new year.

Start your engines, the fun times are a'coming!

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Wednesday, September 21, 2016

Dynamics GP GPConnNet.dll is not compatible with SQL Server high availability setup or DNS aliases

By Steve Endow

A client recently upgraded to Dynamics GP 2016.  As part of their GP upgrade, the client set up new SQL Servers in a dual-server high availability (HA) cluster.  The ODBC DSN is set up to use a DNS name that points to the SQL Server cluster.  The ODBC DSN looks something like this:

The server name in the DSN is not a physical machine.  It's a DNS entry that points to an IP address for the SQL cluster.  The SQL cluster decides which of the two SQL instances / machines should receive the requests.

Using these DSN settings, you can set up GP just fine.  When you create users based on this DSN and fully qualified server name, GP works great and users can log in.

But this configuration can cause problems with applications or utilities that are not designed to work with the SQL Server HA setup or DNS aliases.  One such incompatible utility is the GPConnNet.dll library.

GPConnNet is used by .NET developers who create Dynamics GP AddIns that require a connection to the GP SQL Server.  It allows the developer to request a connection to SQL without knowing the server, username or password.  You simply ask GPConnNet to provide you with an active connection to the SQL Server.

Well, in an environment using SQL Server HA, GPConnNet is unable to connect to the SQL Server instance.  Rather than looking at the ODBC DSN server setting for the GP client, GPConnNet uses some other technique to determine the SQL Server instance name.  As a result, it gets the physical SQL instance name, which in a SQL HA environment, is different than the fully qualified cluster name.
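You can see the name mismatch from SQL Server itself.  These built-in functions report the physical instance and machine name, which in an HA setup will differ from the DNS alias in the ODBC DSN:

```sql
-- Run this on the instance behind the alias: both values reflect the
-- physical server, not the DNS name the GP clients connect through.
SELECT @@SERVERNAME AS InstanceName,
       SERVERPROPERTY('MachineName') AS MachineName
```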

This dialog shows that GPConnNet returned an error number of 131074, which indicates that the connection was successful, but that the login failed.  You can see that the Data Source value ends in "DB01", which is the name of one of the physical SQL instances, and not the cluster DNS name.

As a result, when GPConnNet tries to connect to the physical SQL instance, the login fails.  Because the Data Source name is used, the user password is encrypted differently, and an incorrect password is then sent to the SQL Server.

GPConnNet could likely be modified to fix this issue, but I'm pretty sure that isn't going to happen.

There might be a way to configure SQL Server HA to work around this issue with GPConnNet, but I'm skeptical about that, and even if there was, it would require this customer to completely re-implement their HA setup and GP configuration.

I offered the customer the option of customizing my software to use a configuration file to store SQL credentials to work around this issue.  But the client informed me that they have encountered several other issues with the HA setup, and as a result of this additional issue, they are abandoning SQL HA for their GP environment.  They will be switching over to a standard single SQL Server machine and will be eliminating the use of a DSN entry to point to the SQL cluster.

So while SQL HA for Dynamics GP might be a sensible goal, be aware that other software may not work properly in such an environment.


Tuesday, September 20, 2016

Bug in eConnect taPMManualCheck for Credit Card and Cash Payables Manual Payments

By Steve Endow

I'm working on an eConnect integration that imports Payables Manual Payments.  The integration itself is working fine, but after adding support for the Credit Card payment method, I noticed an oddity.

If I manually enter a Payables Manual Payment for a Credit Card Payment Method, it looks like this.

Notice the Amount fields in the lower right corner.  The payment amount is for $111.11, and that amount is displayed in both the Unapplied and Total fields on the left.

However, this is what an eConnect imported Credit Card payment looks like.

While testing, I noticed that the Total field is zero for the imported Credit Card payment.

If I query the PM10400 records for both of the Credit Card payment transactions, I see that the CHEKAMNT field is zero for the eConnect imported payment, and has a value for the payment entered manually in GP.
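A spot check along these lines shows the difference (the payment numbers here are hypothetical; substitute the two payments you are comparing):

```sql
-- The manually entered payment should show CHEKAMNT populated, while
-- the eConnect imported payment shows CHEKAMNT = 0.
SELECT PMNTNMBR, CHEKAMNT
FROM PM10400
WHERE PMNTNMBR IN ('PYMNT00000000051', 'PYMNT00000000052')
```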

The eConnect taPMManualCheck node only has one amount field, which is DOCAMNT, so I don't think I'm missing an amount field value in my submission to eConnect.

After reviewing the SQL for the taPMManualCheck stored procedure, I think I found the bug. To verify my suspicion, I imported a new payment with Payment Method of Cash.  Sure enough, the imported Cash payment has the same issue: the Total field is zero.

And the record in PM10400 has the same issue--the CHEKAMNT field is zero.

Looking at the Insert statement that is being called by the taPMManualCheck stored procedure, you can see that it is inserting the value using a parameter called @CHEKAMNT.

The fact that the field is called CHEKAMNT and the parameter is called @CHEKAMNT should be a clue as to the genesis of the problem.

It would seem that the Payables Manual Payment Entry window was originally designed to record only payables "checks", and was subsequently modified to support Credit Card, Cash, and more recently EFT.

eConnect was updated to support Credit Card and Cash (it does not support EFT), but when updating the taPMManualCheck procedure for those two new payment methods, this "Total" amount / CHEKAMNT bug was introduced.

For some reason, two new parameters were added to the stored procedure to differentiate the Cash Amount and Credit Card Amount values.

But although these parameters are set based on the payment type, only the @CHEKAMNT value is used for the insert to PM10400.  So it looks like the developer forgot to make sure that the appropriate amount parameter was used for the actual Insert.
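To sketch the pattern (this is a heavily simplified illustration, not the actual Microsoft procedure source, and the variable names for the payment type and the two new amount parameters are my guesses):

```sql
-- The payment-type-specific amount variables get populated...
IF @PaymentType = 1 SELECT @CHEKAMNT = @I_vDOCAMNT  -- Check
IF @PaymentType = 2 SELECT @CASHAMNT = @I_vDOCAMNT  -- Cash
IF @PaymentType = 3 SELECT @CRCRDAMT = @I_vDOCAMNT  -- Credit Card

-- ...but the INSERT only ever reads @CHEKAMNT, which is still zero
-- for Cash and Credit Card payments.
INSERT INTO PM10400 (PMNTNMBR, CHEKAMNT /* , ... */)
VALUES (@I_vPMNTNMBR, @CHEKAMNT /* , ... */)
```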

So how do we work around this issue?

One easy approach is to use the taPMManualCheckPost procedure to fix the problem.  This script in the Post proc appears to work well based on a quick test.

However, if you have ever used the eConnect Pre or Post procedures, you likely know that the big downside is that they typically get wiped out when a GP service pack is installed or GP is upgraded.  Nearly everyone (including myself) forgets that they have the custom Pre / Post scripts, and it takes a few weeks after an update or upgrade before they realize they have bad data in GP due to the missing Pre / Post proc.  So for that reason, I am not a huge fan of the Pre and Post scripts, despite their convenience.

So for my application, instead of using the Post script, I'm going to add the SQL update to my .NET code, so that my application performs the update of the CHEKAMNT field after the Manual Payment is successfully imported.  That way the customer doesn't need to install a custom stored procedure and remember to update it after any GP upgrades.
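The update itself is simple.  Something along these lines, executed right after eConnect reports a successful import, should do it (a hedged sketch: it assumes DOCAMNT on PM10400 holds the payment total, and @PaymentNumber is supplied by the integration):

```sql
-- Fix for Cash / Credit Card manual payments: copy the document amount
-- into the check amount field that taPMManualCheck leaves at zero.
UPDATE PM10400
SET CHEKAMNT = DOCAMNT
WHERE PMNTNMBR = @PaymentNumber
  AND CHEKAMNT = 0
```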

And with that long winded explanation of another eConnect bug, may all of your future integrations bring happiness and joy to your Dynamics GP users.


Limited Users? Light Users? Self Service Users? SMH?

I had a bit of a whirlwind trip up to Fargo this week for Reimagine 2016.  It was a quick trip, mostly due to client commitments and our inaugural BKD Women’s Empowerment Summit later this week in Indy.  But while I was up at the mothership, I had the privilege of presenting a session on delineating the functionality of Limited Users and Self Service users in the Perpetual Licensing Model.  There definitely seemed to be a fair amount of hesitation in the room in terms of everyone feeling confident about the licensing types (beyond the full license) and accompanying functionality. 

I realize that this seems like a sales topic, but in my completely anecdotal experience—the majority of opportunities to sell Limited and Self Service licenses come up during or after implementation (not during the original sales cycle).  So consultants should be aware of these differences.  And clients should understand too, so they can avoid over-buying full licenses (which is very common) when a Limited or Self Service user would be more appropriate (and cost less).

So here are a few high points…

First, let’s throw out old terminology.  Light Users, Business Portal Users, Employee Users, HRM Self Service Users—all old terms from Business Ready and Module Based Licensing.  The three terms we have today in Perpetual Licensing are Full, Limited, and Self Service.  And you want to think of them in that order…

Full is what we are used to: input and output, and access to all of the functionality in GP.  This is concurrent licensing in a traditional (non-subscription) licensing environment.

Limited is a step down, focused primarily on output only: inquiries, SmartLists, and reports.  This license type can also do limited input via the self service functionality (see the Self Service user below; the Limited user license includes this functionality).  This is also a concurrent user license (a point of confusion for many) in a traditional licensing environment.

Self Service is the most limited of the types, with access only granted to the input/output functions of the self service areas: Payroll/HR (Benefits, Direct Deposit, Skills and Training, Employee Profile, W4 Withholding, Benefit Self Service), Project Time and Expense, and Purchase Requisitions.   These licenses are NAMED licenses.

Limited and Self Service users are set up in the same windows as a full user in Dynamics GP.  And Dynamics GP comes with predefined security roles for each (Limited roles and ESS roles for Employee Self Service).  For more info on the security roles, check out this KB article and blog post from Pam M.

Pricing flows accordingly, with Full users costing the most and Self Service the least.

For transitions from prior licensing models, things flow pretty naturally, with a few exceptions.  Most BP licensing (Employee Users, HRM Self Service, etc.) will flow to Self Service users.  And the previously named Light user flows to Limited user (with the exception of customers who bought Light User - Limited Upgrade and Light User - Starter Pack licenses under prior licensing models).

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Friday, September 9, 2016

Another One To Share: Reconciling Payroll Summary

A quick share to end the day: if you can successfully reconcile your 941 manually by taking gross wages, subtracting tax sheltered deductions, and adding taxable benefits, but your payroll summary does not match...this script is for you!

This will display the wages as calculated for the payroll summary, and then also display (based on the transaction history) the tax sheltered deductions (divided into two categories: sheltered from all taxes, or just fed and state) and the taxable benefits (also divided into two categories: taxable by all taxes, or just fed and state).  This will allow you to manually calculate (by employee) what you would expect the federal and FICA wages to be.  If the manual calculation differs from the amounts displayed for the payroll summary, then you know you need to make an adjustment to correct it.  Most often these discrepancies can be traced to tax flags being changed after transactions were processed.
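As a tiny worked example of the math being checked, using made-up numbers: $10,000 gross wages, a $1,200 deduction sheltered from federal tax, and a $300 benefit taxable by federal tax should yield $9,100 in expected federal wages:

```sql
SELECT 10000.00 AS GrossWages,
       1200.00 AS ShelteredDeductions,
       300.00 AS TaxableBenefits,
       10000.00 - 1200.00 + 300.00 AS ExpectedFedWages  -- = 9100.00
```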

Happy hunting!

--Summarizes deductions by employee for a period of time that are sheltered from all taxes
--NOTE: the joins to the employee deduction/benefit cards and the tax shelter flag
--filters below are my reconstruction (without them, the paired CTEs would be
--identical)--verify the flag logic against your own setup
With CTE_DedTax as (Select UPR30300.EMPLOYID, SUM(UPRTRXAM) as TotalDedTaxSheltered
from UPR30300 inner join UPR00500 on UPR30300.EMPLOYID = UPR00500.EMPLOYID and UPR30300.PAYROLCD = UPR00500.DEDUCTON
where CHEKDATE >= '4/1/2016' and CHEKDATE <= '6/30/2016' and PYRLRTYP = 2
and UPR00500.SFRFEDTX = 1 and UPR00500.SHFRFICA = 1 and UPR00500.SHFRSTTX = 1
group by UPR30300.EMPLOYID),

--Summarizes deductions by employee for a period of time that are sheltered just from federal and state
CTE_DedFS as (Select UPR30300.EMPLOYID, SUM(UPRTRXAM) as TotalDedFedStateSheltered
from UPR30300 inner join UPR00500 on UPR30300.EMPLOYID = UPR00500.EMPLOYID and UPR30300.PAYROLCD = UPR00500.DEDUCTON
where CHEKDATE >= '4/1/2016' and CHEKDATE <= '6/30/2016' and PYRLRTYP = 2
and UPR00500.SFRFEDTX = 1 and UPR00500.SHFRFICA = 0 and UPR00500.SHFRSTTX = 1
group by UPR30300.EMPLOYID),

--Summarizes benefits by employee for a period of time that are taxable by all taxes
CTE_BenTax as (Select UPR30300.EMPLOYID, SUM(UPRTRXAM) as TotalBenTaxable
from UPR30300 inner join UPR00600 on UPR30300.EMPLOYID = UPR00600.EMPLOYID and UPR30300.PAYROLCD = UPR00600.BENEFIT
where CHEKDATE >= '4/1/2016' and CHEKDATE <= '6/30/2016' and PYRLRTYP = 3
and UPR00600.SBJTFDTX = 1 and UPR00600.SBJTSSEC = 1 and UPR00600.SBJTMCAR = 1 and UPR00600.SBJTSTTX = 1
group by UPR30300.EMPLOYID),

--Summarizes benefits by employee for a period of time that are taxable by just federal and state
CTE_BenFS as (Select UPR30300.EMPLOYID, SUM(UPRTRXAM) as TotalBenFedStateTaxable
from UPR30300 inner join UPR00600 on UPR30300.EMPLOYID = UPR00600.EMPLOYID and UPR30300.PAYROLCD = UPR00600.BENEFIT
where CHEKDATE >= '4/1/2016' and CHEKDATE <= '6/30/2016' and PYRLRTYP = 3
and UPR00600.SBJTFDTX = 1 and UPR00600.SBJTSSEC = 0 and UPR00600.SBJTMCAR = 0 and UPR00600.SBJTSTTX = 1
group by UPR30300.EMPLOYID),

--Summarizes wage information by employee as reported on the payroll summary report
CTE_PRSummary as (Select SUM(FDWGPYRN) as FedWages, SUM(FICASSWP) as FICASSWages,
SUM(FICAMWGP) as FICAMWages, SUM(GRWGPRN) as GrossWages, EMPLOYID
from UPR30100 where YEAR1 = '2016' and CHEKDATE >= '4/1/2016' and CHEKDATE <= '6/30/2016' group by EMPLOYID)

--Pulls all information together for comparison
select CTE_PRSummary.EMPLOYID, CTE_PRSummary.FedWages, CTE_PRSummary.FICASSWages, CTE_PRSummary.FICAMWages,
CTE_PRSummary.GrossWages,
isnull(CTE_DedTax.TotalDedTaxSheltered, 0) as TotalDedTax,
isnull(CTE_DedFS.TotalDedFedStateSheltered, 0) as TotalDedFS,
isnull(CTE_BenTax.TotalBenTaxable, 0) as TotalBenTax,
isnull(CTE_BenFS.TotalBenFedStateTaxable, 0) as TotalBenFS
from CTE_PRSummary
left outer join CTE_DedTax on CTE_PRSummary.EMPLOYID = CTE_DedTax.EMPLOYID
left outer join CTE_DedFS on CTE_PRSummary.EMPLOYID = CTE_DedFS.EMPLOYID
left outer join CTE_BenTax on CTE_PRSummary.EMPLOYID = CTE_BenTax.EMPLOYID
left outer join CTE_BenFS on CTE_PRSummary.EMPLOYID = CTE_BenFS.EMPLOYID

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Some Things Never Change: Reconciling Payroll

Building on my post from a few weeks ago, here are a few more scripts.  For 15 years now, I have been doing this.  Doing the GP thing.  And for 15 years, I have been reconciling payroll taxes.  My job and career have changed tremendously in 15 years, but just when I start to think I might never have to sit down with a 941, Payroll Summary, and a bunch of Check Registers again....

So I thought I would share a few scripts I use in the course of reconciling.  These are my quick check, Captain Obvious sorts of things.  They are usually most beneficial during the first year after a go live, when setup and settings may have been corrected or adjusted along the way.

Hope these are helpful!

--Compares deduction tax flags by employee (Cards > Payroll > Deduction) to the deduction tax flags in setup (Setup > Payroll > Deduction)
--Displays a list of exceptions where these settings differ
--This may identify potential reconciliation issues by highlighting items that may individually impact wages differently than expected
--NOTE: the WHERE clause comparing the flags is my addition; it assumes the setup table UPR40900 uses the same flag column names

select UPR00500.EMPLOYID as EmployeeID,
 UPR00500.DEDUCTON as Deduction,
 UPR00500.SFRFEDTX as ShelteredFed,
 UPR00500.SHFRFICA as ShelteredFICA,
 UPR00500.SHFRSTTX as ShelteredState
from UPR00500 inner join UPR40900 on UPR00500.DEDUCTON = UPR40900.DEDUCTON
where UPR00500.SFRFEDTX <> UPR40900.SFRFEDTX
 or UPR00500.SHFRFICA <> UPR40900.SHFRFICA
 or UPR00500.SHFRSTTX <> UPR40900.SHFRSTTX




--Same logic as the deduction script above, only for benefits (setup table UPR40800)

select UPR00600.EMPLOYID as EmployeeID,
 UPR00600.BENEFIT as Benefit,
 UPR00600.SBJTFDTX as SubjectFed,
 UPR00600.SBJTSSEC as SubjectFICA,
 UPR00600.SBJTMCAR as SubjectMed,
 UPR00600.SBJTSTTX as SubjectState
from UPR00600 inner join UPR40800 on UPR00600.BENEFIT = UPR40800.BENEFIT
where UPR00600.SBJTFDTX <> UPR40800.SBJTFDTX
 or UPR00600.SBJTSSEC <> UPR40800.SBJTSSEC
 or UPR00600.SBJTMCAR <> UPR40800.SBJTMCAR
 or UPR00600.SBJTSTTX <> UPR40800.SBJTSTTX

Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Thursday, September 8, 2016

Updating a .NET development environment for a new version of Dynamics GP

By Steve Endow

I have developed many .NET AddIns for Dynamics GP.  They are great because they are relatively simple to develop, very simple to deploy, and they are also easy to update for new versions of Dynamics GP.

However, the big catch is preparing my development environment for a new version of GP.

I've been working on updating one of my development virtual machines to GP 2016 for the last TWO DAYS.  It hasn't been two full days of solid work, but rather a cycle of spending 15 minutes starting an install, then waiting 30+ minutes for it to complete.  I'll go work on something else, then come back whenever I remember and start the next item.

In this particular case, I have a virtual machine dedicated to Project Accounting development.  PA is a unique beast, so I keep a separate VM just for PA related projects.  The dedicated VM makes it easier for me to deal with PA related config and development on this one machine, but it also means that I have to maintain and update this VM separately.

So to upgrade my PA customization to GP 2016 on this machine, I have to:

1. Install a new instance of SQL Server
2. Install GP 2016
3. Create new GP 2016 test company
4. Get PA configured in GP 2016 the way I need it
5. Install Visual Studio 2015 Update 1 from an ISO (since I only had VS 2013), which takes forever
6. Install Visual Studio 2015 Update 3 separately (6GB file!?!?!), which also takes forever
7. Create a new Git branch for the GP 2016 version of my customization
8. Finally work on updating my customization to GP 2016
9. Test the new version on GP 2016
10. Prepare a new release for GP 2016

I started this process on Tuesday.  I'm on Step 6, and it's now Thursday afternoon.

Actually updating the Visual Studio solution to work with a new version of GP is usually simple and relatively quick, but all of the other tasks to prepare for that code upgrade can be very, very time consuming.

May the Patience Be With You.


Saturday, September 3, 2016

Backing up terabytes of data off site

By Steve Endow

A few years ago I was backing up my files, photos, music, code, and massive Hyper-V VHDs to an old Windows file server.  That server finally started dying, so I bought a Synology DS1813+ with 12TB of usable storage (RAID 10) and transitioned all of my photos, music, video, system backups, and VHD backups to the new NAS.

TONS of space.  That'll take me a long time to fill up.  Famous last words.

In about a year, I had filled 80%.  Admittedly, I lack a coherent backup and archive retention strategy, so I could probably have freed up some space, but in using the Synology, I realized two things.

1. I have a pretty massive amount of data stored centrally on my NAS
2. I don't have a backup for my NAS

When I first set up the Synology, I had tried doing backups from the Synology to two external USB drives, rotating them weekly with two other external drives.  But swapping out drives every week gets really old.  And those 2TB drives filled up quickly, requiring yet another layer of storage to manage.

I wanted a better "archive" solution for the data on my NAS.

The obvious solution was... to buy another bigger NAS!

What better way to backup a NAS than to back it up to another NAS?  The logic is genius.

But there were a few real benefits.  Not long after I purchased the DS1813+, Synology released the DS1815+.  One improvement was a significant increase in CPU performance.  While you don't typically shop for a NAS based on CPU performance, it really does matter.  RAID array builds, consistency checks, heavy traffic, multiple tasks, or any backup file encryption can peg the CPU, so MOAR CPU power is a good thing.

As an aside, here is the resource utilization of both while the 1815 is backing up data to the 1813.



So I got the DS1815+ set up with 21TB of usable storage, which is still very spacious, even after transferring all of my data off the 1813+.  I then wiped my old DS1813+ and repartitioned it to provide 18TB of usable storage.  I installed Synology HyperBackup and HyperBackup Vault on the two units and set up a job to back up data from the 1815 to the 1813.  That all went pretty smoothly.

The plan was then to set up the DS1813+ off site with a friend or family member at least 20 miles away, and configure the backup job to run over the internet.  The reason I went with this approach is that cloud backup solutions at that time were still moderately expensive for large amounts of storage.  I priced out the cloud backup providers that were supported by the Synology backup software, and if you needed to back up terabytes of data, the fees easily justified getting a second NAS.

Then summer came around, I was on vacation for a month, and things were put on hold.

Not too long after I got back from vacation, Synology released a new version of DSM that added support for Backblaze B2.  B2 is an online data storage service that is incredibly inexpensive--basically $5 per month per TB.  So I could selectively back up 5TB of data for just $25 a month.  And that data would be stored in a Backblaze data center, not at a friend's house or business.  Yes, there are some storage options that can compete with B2 on price, like Amazon Glacier, but I didn't want to deal with the caveats that come with that type of storage--even if they might be more economical long term.

For some reason Synology only supports Backblaze through its Cloud Sync app, not through HyperBackup.

So I signed up with Backblaze B2, configured the Synology Cloud Sync app, and slowly added additional directories to the B2 backup job.  At the moment, I have about 1TB backed up to Backblaze, with all files encrypted by the Synology prior to upload.

I received an email from Backblaze for my monthly charges of a whopping $3.85.  Next month, if my usage is still 1TB, it should be about $5.

I still have the DS1813+, so I may go ahead and set it up off site and share it with friends, allowing them to do off site backup, but for the moment, it looks like backing up to Backblaze should work for me as an inexpensive off site solution for several TB of data.


Thursday, September 1, 2016

Saving Time and Reducing Repetitive Typing in Visual Studio

By Steve Endow

Writing code in Visual Studio often involves a lot of repetitive typing.  Things like variable declarations, method stubs, and properties are often 95% the same, except for a variable / property / method name.

There are some tools in Visual Studio that can help save time, like code snippets that I have discussed previously, but code snippets are a bit of a hassle to define.  And there are many simple things that you have to type repeatedly, or that need variable values, that just aren't well supported by Visual Studio automation or intellisense.

From what I've read, macros were removed from Visual Studio, so I was unable to find any native features that could help me easily automate simple text statements.

[UPDATE: Tim Wappat pointed out that there is a VS extension that apparently adds macro support back to Visual Studio.  This might be a good alternative for your use case, but I'm guessing it's overkill for my needs.]

One simple example would be my code comments.

I like to use a format of the date, then my name, then the comment.  Since I usually add a comment with nearly every change I make, I am often adding dozens of comment lines throughout my code.

Sure, it isn't terribly difficult to type:

//9/1/2016: S. Endow:

But after you've typed it thousands of times over a few years, it becomes obvious that there isn't much value in typing it all out.  So I finally got sick of typing it and looked into apps that provide shortcut keys to insert text values.  I wanted something that could automatically type out the comment line and insert the current date.

There are HUNDREDS of such apps out there, but I found that they each tend to focus on a particular feature set, and many of them didn't do exactly what I wanted.  So I had to sift through dozens to find those that allowed me to insert text when I pressed a keyboard shortcut.  Kind of like pre-defined clipboard entries that I could map to custom keystrokes.

The three that I have tried so far are:

1. QuickTextPaste (free)
2. PhraseExpress ($49 - $219 per copy)
3. FastKeys ($9.99 per copy)

I used QuickTextPaste for quite a while.  While it can work fairly well, it is very rough, with bizarre menus, a finicky interface, and some frustrating quirks that I had to tolerate.  It's great that it is free, but I was willing to pay for something better.

I then tried PhraseExpress.  It works fine, but it has a bug or two, and the price was far too high for me: I want to use it on my desktop, laptop, and multiple virtual servers, and PhraseExpress requires a license for every machine where it is installed.

But today, after searching for the 20th time for similar apps, I tried FastKeys.  At only $9.99, it's an amazing bargain, and at that price, I'm happy to buy 5 or 6 licenses to use on my various machines.

So I installed it on one of my Windows Server 2012 R2 virtual servers and tested it with Notepad.  It worked great, and the features are pretty phenomenal, especially for the price.

But when I tried to use it in Visual Studio 2015, it didn't work.  It wasn't detecting the keyboard shortcuts.  I then tried VS 2013 on the server, but that didn't work either.  This is an issue I experienced with QuickTextPaste on this same server, which prompted me to try a new app.

I then tried it on a Windows Server 2008 R2 and it did work in VS 2013.  I then tried it on a Windows 10 machine with VS 2015, and it worked there, to my surprise.

On the Server 2012 R2 machine, I tried turning UAC down to the lowest level, but that didn't help.

So there was something about Server 2012 R2 that was preventing it from working in Visual Studio 2013 and 2015.  I posted a note on the FastKeys support forum, and within a few hours received a response suggesting that I try using Run As Admin with FastKeys.  Duh.  I should have thought of that, but it didn't occur to me, since I had tried turning "off" UAC.

Sure enough, that worked.  When FastKeys is Run As Admin, it does detect the keyboard shortcuts used in Visual Studio, and inserts the desired text.

But there is one more catch.  Windows Server 2012 R2 does not allow a startup app to be run as administrator.  When I set the EXE to run as admin, FastKeys did not launch when I logged into the server.

After some searching, I found a thread recommending the use of Task Scheduler to launch the app at login.  Task Scheduler can launch an app with elevated privileges, bypassing the UAC prompt entirely.
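As a sketch of that Task Scheduler workaround, the logon task can also be registered from an elevated command prompt instead of clicking through the Task Scheduler UI.  The task name and install path below are my own assumptions--adjust them for your machine:

```bat
REM Hypothetical example: register FastKeys to launch elevated at logon.
REM /rl highest runs the task with the highest privileges, which is what
REM lets the app start with admin rights without a UAC prompt.
schtasks /create /tn "Launch FastKeys" ^
    /tr "\"C:\Program Files\FastKeys\FastKeys.exe\"" ^
    /sc onlogon /rl highest /f
```

The command must itself be run from an elevated prompt; once the task exists, the app starts with admin rights at every login.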

With that crazy maze of indirection set up to bypass the UAC restrictions, FastKeys finally launched with admin rights when I logged in, and it worked in Visual Studio.

Mission.  Accomplished.

You can potentially use the text expander feature instead of keyboard shortcuts, but you have to be careful that none of your text shortcuts conflict with the thousands of potential Visual Studio IntelliSense options--which is not easy.  One FastKeys example uses a comma prefix, so typing ",ccc" triggers an action.  A prefix like that is probably what you'll need if you use the text expander with Visual Studio.  But for now I'll stick with keyboard shortcuts, as I don't have too many items I need to insert.

FastKeys has far more features than I need or will ever use, like a custom Start Menu, Auto Complete, Gestures, and options to launch applications and do complex text substitution.

I may eventually use some of those fancier features, but for now, I'm just happy that I'm able to automatically insert comment lines into my code in Visual Studio.

As I mentioned, there are a ton of apps like this, so regardless of which one you choose, I would recommend trying one to save time typing.

May your coding be fast and efficient.
