tag:blogger.com,1999:blog-6691994129222744759.comments2024-03-18T02:11:32.263-07:00Dynamics GP LandChristina Phillipshttp://www.blogger.com/profile/03332221198245457747noreply@blogger.comBlogger983125tag:blogger.com,1999:blog-6691994129222744759.post-24844138897376929402019-04-11T12:20:14.843-07:002019-04-11T12:20:14.843-07:00We actually found that on our server, it was postm...We actually found that on our server, it was postmaster that was generating these messages! Running the registry fix, then restarting postmaster fixed the issue.scottokhttps://www.blogger.com/profile/05282031669239409229noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-70421021634207767292019-04-10T08:51:05.779-07:002019-04-10T08:51:05.779-07:00It is really telling that MSFT still has not fixed...It is really telling that MSFT still has not fixed this install blunder.<br /><br />There is a much simpler and more direct way to fix this than messing with credentials. Use the following PowerShell script in an elevated PS prompt on the machine you are running eConnect from. 
You may still need to reboot but this should add the sources eConnect is complaining about:<br /><br />function CheckCreateEventSource ([string] $aEventSource) {<br /> if (-not [System.Diagnostics.EventLog]::SourceExists($aEventSource)) {<br /> [System.Diagnostics.EventLog]::CreateEventSource($aEventSource, "Application")<br /> Out-String -InputObject "Created EventSource: $aEventSource"<br /> }<br /> else {<br /> Out-String -InputObject "$aEventSource already exists."<br /> }<br />}<br /><br />CheckCreateEventSource "Microsoft.Dynamics.GP.eConnect"<br />CheckCreateEventSource "Microsoft.Dynamics.GP.eConnect12"<br />CheckCreateEventSource "Microsoft.Dynamics.GP.eConnect14"<br />CheckCreateEventSource "Microsoft.Dynamics.GP.eConnect16"<br /><br />Write-Host -NoNewline "Press any key to continue..."<br />$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")Anonymoushttps://www.blogger.com/profile/05587093748437298656noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-83546661842356683042019-04-05T15:47:04.263-07:002019-04-05T15:47:04.263-07:00Can anyone advise on adding the equipment and payr...Can anyone advise on adding the equipment and payroll as mentioned? Trayhttps://www.blogger.com/profile/05684157368576616356noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-57955769210406187722019-04-05T07:37:25.479-07:002019-04-05T07:37:25.479-07:00Steve, I have also asked John about these settings...Steve, I have also asked John about these settings. I keep meaning to do my own tests when I have free time (hahaha). I have this gut feeling that it can improve performance if targeted correctly.<br /><br />Test Scenarios on a DEV Server<br />I am thinking of some of the GP tables accessed during user logon. Small and often accessed. <br />Then some testing on a small handful of commonly hit master tables. 
If anyone has a favorite table they believe might be worth testing, drop me a note.<br />DavidDavidMOhttps://www.blogger.com/profile/00709326848258085444noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-73129180836979947282019-02-08T08:44:18.496-08:002019-02-08T08:44:18.496-08:00Hi Steve,
I was searching for this topic and came ...Hi Steve,<br />I was searching for this topic and came across your post. Needless to say, I have a customer in that situation that has 2 GP instances on 2 separate servers at 2 different versions and wants to merge everything. So far my research has shown that Named Instances don't seem to be the solution, and it will either be a new setup of 2 SQL instances to keep them separated, or a re-implementation for one of the instances, as their support has lapsed and is no longer current on that instance. Beat BUCHERhttps://www.blogger.com/profile/07394259942485626115noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-64995033166209171192019-01-02T08:43:36.088-08:002019-01-02T08:43:36.088-08:00Hi seann,
Have you tested Azure RAID with a B-ser...Hi seann,<br /><br />Have you tested Azure RAID with a B-series VM?<br /><br />The issue I discuss in my post is not about disk performance or disk throughput; it is the artificial IO throttling imposed on B-series VMs, regardless of disk type or configuration.<br /><br />If you have tested Azure RAID with a B-series VM, I would be interested in seeing your performance results comparing a single disk with a RAID array.<br /><br />SteveSteve Endowhttps://www.blogger.com/profile/03950475674093020502noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-24868132319314313992018-12-30T21:37:28.878-08:002018-12-30T21:37:28.878-08:00I don't see any mention of RAID in the article...I don't see any mention of RAID in the article; you should be configuring 2 or more disks in a striped RAID.<br /><br />https://dennymichael.net/2017/02/24/how-to-create-striped-disk-on-azure-aka-raid-0/seannhttps://www.blogger.com/profile/01704756690738303677noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-38317448087380073992018-12-19T15:30:56.676-08:002018-12-19T15:30:56.676-08:00You can try the alternative solution below:
https://w...You can try the alternative solution below:<br /><br />https://www.eonesolutions.com/help-article/smartconnecteconnect-not-create-default-distributions-payables-manual-checks/Edward Chinhttps://www.blogger.com/profile/10385635192048182733noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-78243154012536137562018-12-17T03:03:58.164-08:002018-12-17T03:03:58.164-08:00Hi
Has anyone had any success with the AutoMapper...Hi<br /><br />Has anyone had any success with the AutoMapper approach when it comes to complex types? It works fine for simple DTOs but doesn't seem to work when you have nested objects.<br /><br />Take the following representation of a person:<br />First Name: "Pete "<br />Surname: "Littlewood "<br /><br />The JSON contains a name object and a display name property that is simply a concatenation of the two fields. The above AutoMapper configuration will only trim the display name property.<br /><br />{<br /> "name": {<br /> "first": "Pete ",<br /> "surname": "Littlewood "<br /> },<br /> "displayname": "Pete Littlewood"<br />}<br /><br />Is there a simple way to hit all string properties regardless of DTO complexity?<br />petelittlewoodhttps://www.blogger.com/profile/09031358318466605874noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-40653483401142482242018-11-30T13:23:43.042-08:002018-11-30T13:23:43.042-08:00Hi Chad,
I just set up a code loop that gets 1,0...Hi Chad, <br /><br />I just set up a code loop that gets 1,000 JE numbers using the eConnect GetNextGLJournalEntryNumber method.<br /><br />While testing, I was able to get one SQL Exception when I opened the GP JE window and tried to get a new JE number. But I'm unable to reproduce that error again.<br /><br />I just ran 3 instances of my app, and each instance retrieved 1,000 JE numbers simultaneously without error. As far as I can tell, the eConnect GetNextGLJournalEntryNumber method is pretty good at handling volume.<br /><br />If you are seeing "The pipe is being closed", that sounds more like the eConnect idle timeout.<br /><br />SteveSteve Endowhttps://www.blogger.com/profile/03950475674093020502noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-1743705180915581652018-11-30T12:32:56.735-08:002018-11-30T12:32:56.735-08:00Hi Riaan,
The taRMApply procedure will sometimes ...Hi Riaan,<br /><br />The taRMApply procedure will sometimes create GL JE adjustments, depending on how the payment application is performed. <br /><br />Are you specifying an amount for Discount Taken or Write Off Amount in your RM Apply? I believe those can result in a JE being created as part of the apply process.<br /><br />SteveSteve Endowhttps://www.blogger.com/profile/03950475674093020502noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-61214130436433342722018-11-30T11:51:00.107-08:002018-11-30T11:51:00.107-08:00Hi Chad,
I have not seen any other issues with Ge...Hi Chad,<br /><br />I have not seen any other issues with GetNextGLJournalEntryNumber (or any other get next number methods) with the .NET version of eConnect.<br /><br />Does the error happen in the middle of an integration? Or does it happen on the first transaction that the integration tries to process?<br /><br />SteveSteve Endowhttps://www.blogger.com/profile/03950475674093020502noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-45868736853165597712018-11-28T14:01:14.725-08:002018-11-28T14:01:14.725-08:00Have you diagnosed performance issues or errors ti...Have you diagnosed performance issues or errors tied to the GetNextGLJournalEntryNumber method with GP 2015? I've got a case of a "The pipe is being closed" error with this method in the stack trace.Anonymoushttps://www.blogger.com/profile/15819866213431806972noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-2078833754455909712018-11-27T23:58:10.632-08:002018-11-27T23:58:10.632-08:00Hi Steve,
I’m currently experiencing problems wit...Hi Steve,<br /><br />I’m currently experiencing problems with the Dynamics GP eConnect procedure taRMApply(). When passing values where APFRDCTY = 8 or 9 it processes, but when I pass APFRDCTY = 7, it throws an error code 680 which states that “A null was found in at least one input parameter for GL Trx Line Insert”. I’m not using the taGLTransactionLineInsert procedure so I don’t understand why I get this error. <br /><br />Please could you assist?<br />Anonymoushttps://www.blogger.com/profile/05880991219608193197noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-21858270572293339742018-11-27T05:28:56.741-08:002018-11-27T05:28:56.741-08:00Hi Steve
I am having a problem with my eConnect ...Hi Steve <br /><br />I am having a problem with my eConnect taRMApply. When processing APFRDCTY = 8 or 9, it works through GP, but when APFRDCTY = 7<br /><br />it throws error code 680 where the error is <br />taGLTransactionLineInsert, which says:<br />A null was found in at least one input parameter for GL Trx Line Insert. Why is this procedure taGLTransactionLineInsert throwing an error when I'm using taRMApply?<br /><br />Thanks Chawa Ngomahttps://www.blogger.com/profile/13471197063931414433noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-56162385626131520442018-11-15T07:14:27.963-08:002018-11-15T07:14:27.963-08:00Hi Christina
Thank you for the tip!
I am trying ou...Hi Christina<br />Thank you for the tip!<br />I am trying out the extended pricing with price sheets in different currencies (USD and SGD). There is no marking for Convert Functional Price in SOP.<br /><br />I was trying to do an invoice in SOP for Aaron Fitz in Fabrikam and selected SGD.<br />I noted that the price was calculated using the SGD price list converted to USD.<br /><br />When it is marked, it does not have any difference in behavior because the invoice header is SGD...<br />Both scenarios give the wrong price.<br />I would appreciate it if you could provide some guidance.<br /><br />Thank you so much<br /><br />Anniehttps://www.blogger.com/profile/13852697494935537261noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-67265085636391379672018-11-08T04:30:15.974-08:002018-11-08T04:30:15.974-08:00SET NOCOUNT ON
CREATE TABLE ##COMPANIES
([DYNAM...SET NOCOUNT ON <br /><br />CREATE TABLE ##COMPANIES <br />([DYNAMICS DB MASTER COMPANY ID] INT,<br />[COMPANY DATABASE COMPANY ID] INT NULL,<br />[COMPANY CODE] VARCHAR(50),<br />[COMPANY NAME] VARCHAR(50)<br />)<br /><br />INSERT ##COMPANIES<br />SELECT [CMPANYID] 'DYNAMICS DB MASTER COMPANY ID'<br /> , NULL<br /> ,[INTERID] 'COMPANY CODE' <br /> ,[CMPNYNAM] 'COMPANY NAME'<br />FROM [DYNAMICS].[dbo].[SY01500]<br /><br />CREATE TABLE ##CCODES (ID INT IDENTITY(1,1), [COMPANY CODE] VARCHAR(50))<br /><br />INSERT ##CCODES <br />SELECT LTRIM(RTRIM([COMPANY CODE])) FROM ##COMPANIES<br /><br />DECLARE @NUM_CMPS INT<br />SELECT @NUM_CMPS = COUNT(1) FROM ##CCODES<br /><br />DECLARE @COUNTER INT<br />SELECT @COUNTER = 1<br /><br />DECLARE @CURRENTCOMPANYCODE VARCHAR(50)<br />DECLARE @GETSQLSTATEMENT VARCHAR(250)<br />DECLARE @CMPNYDBCOMPANYID INT<br /><br /><br />WHILE @COUNTER <= @NUM_CMPS<br />BEGIN <br /><br />SELECT @CURRENTCOMPANYCODE = [COMPANY CODE] FROM ##CCODES WHERE ID=@COUNTER<br />SELECT @GETSQLSTATEMENT = 'SELECT CMPANYID CMPANYDBCMPANYID INTO ##CURRENTCOMPANY FROM ['+ @CURRENTCOMPANYCODE + '].dbo.[SY00100]'<br />EXEC (@GETSQLSTATEMENT)<br />SELECT @CMPNYDBCOMPANYID = CMPANYDBCMPANYID FROM ##CURRENTCOMPANY<br />UPDATE ##COMPANIES SET [COMPANY DATABASE COMPANY ID] = @CMPNYDBCOMPANYID WHERE [COMPANY CODE] = @CURRENTCOMPANYCODE<br />DROP TABLE ##CURRENTCOMPANY <br />SELECT @COUNTER = @COUNTER + 1 <br />END<br /><br />SELECT * FROM ##COMPANIES WHERE [DYNAMICS DB MASTER COMPANY ID] <> [COMPANY DATABASE COMPANY ID]<br /><br />DROP TABLE ##COMPANIES<br />DROP TABLE ##CCODESDusselfoodhttps://www.blogger.com/profile/02541014488598243064noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-26917613016872479712018-11-06T17:00:35.398-08:002018-11-06T17:00:35.398-08:00@MarkoLarko, I feel your pain!
I would contact Sy...@MarkoLarko, I feel your pain! <br /><br />I would contact Synology support. They should be able to assess the situation and help you determine next steps. Steve Endowhttps://www.blogger.com/profile/03950475674093020502noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-35960861639630435152018-11-06T07:54:04.505-08:002018-11-06T07:54:04.505-08:00All these tales of woe!
What an interesting Blog. ...All these tales of woe!<br />What an interesting Blog. Firstly it was a joy to read because you were able to encapsulate all the mental and emotional trauma that a failing Synology (NAS) has on one....especially when you realise that your backup strategy is unsustainable. However moving swiftly on ...<br />I have a DS 1515+ with an expansion bay with 10 x 4TB drives with one Hot Standby. (That is a completely filled Synology = 5 + 5) in all 27.9 TB, of which I am using 19.3 TB.<br /><br />Three days ago I had a warning saying that my Synology was crashed! Crashed in RED!<br />I could feel the adrenaline surge and it was not a good feeling.<br />(90,000 photos, 48,000 music tracks and 1,000 movies).<br /><br />I immediately went online and realised that the network access was available and I immediately copied my Photos to another external drive. Followed this with the music library. Went to bed and mysteriously nearly found myself wanting to kneel beside my inviting bed but the feeling passed very quickly. Didn't sleep well that night (2 nights ago!)<br />In the morning the external 'HDD' which I was copying the Photos to was 'CLICKING' in an ominous joking sort of repetitive way. F***!<br /><br />The Synology was still active but I could only see the File Structure of my NAS - none of the contents were available or visible!<br /><br />I decided that I needed to shut down the NAS as there was no alternative. The NAS was as good as dead.<br /><br />So with a little trepidation and feeling gung-ho at the same time because 'shit had happened' I pressed the power button on the 1515+. When I rebooted a few minutes later the drives started flashing except for the two 'failed' ones. (They were Bay 5 and Bay 5 on the expansion box)<br /><br />I noticed that the drives' LEDs were no longer Orange/Amber. They were Green - but were not flashing. The rest of the drives were going mad. 
I logged on and 'Wooow' I had my Synology back and the error message was 'Repairing'<br /><br />The Green lights on Bay 5 and Bay 5 Ex are now flickering away as if they were part of the recovery process. This I thought was pretty strange as to all intents and purposes they should be isolated from the building process. Maybe I am wrong? <br />Do I assume from the lights on the drives that the rebuild process still has access to the data? I originally had 9 HDDs with one hot spare (HS). The HS seems to have picked up the slack but what of the other two drives? One is saying 'System Partition Failed' and the other is saying 'Not Initialised' with a health status of 'Warning' with 1 Bad Sector.<br /><br />How you can get a Warning for a Sector when the disc is supposedly 'Not Initialised' I have no idea.<br /><br />So I have ordered a spare Red 4TB drive (Not next day delivery as this machine is only 50% rebuilt after 35 hours!) and by the time it arrives everything will hopefully be back to 'normal'<br /><br />So my question to all you readers is what do I do next?<br /><br />I will have 2 HDDs which are suspect and a spare on the way. Do I replace (Hot Swap) the Not Initialised HDD with the new HDD or what? <br /><br />I am confused as to whether the HDDs have failed or are just a bit 'muddled' and need to be reformatted / cleaned ready for re-insertion? 
The real problem is that I can find no definitive advice on the best way of proceeding.<br /><br />I have a day or two to contemplate this but if anybody has any bright ideas for a frustrated Cornish Grandfather then I will be all ears.MarkLarkohttps://www.blogger.com/profile/11703426743464451578noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-19820198117847062182018-10-27T23:47:40.655-07:002018-10-27T23:47:40.655-07:00"This lovely article" is no longer there..."This lovely article" is no longer there; the link is already broken.CWM Userhttps://www.blogger.com/profile/00315254939924201715noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-84514885422354768992018-10-27T08:07:23.410-07:002018-10-27T08:07:23.410-07:00I am trying to use table import to import weekly p...I am trying to use table import to import weekly payroll hours including jobs. We use Wennsoft for job costs. Can someone show me how to build the tables?Anonymoushttps://www.blogger.com/profile/17617070193339781643noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-74281857838215007982018-10-24T23:33:03.153-07:002018-10-24T23:33:03.153-07:00Yes it works
Yes it works<br />Anonymoushttps://www.blogger.com/profile/07364591286606164071noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-12453659543858061072018-10-23T09:10:08.557-07:002018-10-23T09:10:08.557-07:00Hi Alex,
Thanks for the comment. A few other Dex ...Hi Alex,<br /><br />Thanks for the comment. A few other Dex folks have explained to me that the POP OpenWindow is available under Functions, so I have tested that and got it working.<br /><br />Thanks for the reminder that I had noted that issue on this post. I've updated the post with the pointer to use Functions rather than Procedures.<br /><br />Thanks!<br /><br />SteveSteve Endowhttps://www.blogger.com/profile/03950475674093020502noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-70977397842516390832018-10-23T08:42:54.768-07:002018-10-23T08:42:54.768-07:00Testing with GP 2018 as we're preparing to do ...Testing with GP 2018 as we're preparing to do the upgrade. The slowness (even with the January Hotfix) was unacceptable and this looks like a good fix. Also, the flexibility of the search window adds functionality beyond what the built-in forms do, and our users are excited about that.Anonymoushttps://www.blogger.com/profile/16533344342567971906noreply@blogger.comtag:blogger.com,1999:blog-6691994129222744759.post-42631980537387784282018-10-23T08:17:26.974-07:002018-10-23T08:17:26.974-07:00Hi Steve. I enjoyed your post on opening a Dynamic...Hi Steve. I enjoyed your post on opening a Dynamics GP window from VSTools. Just to say that the problem with opening the PopInquiryInvoiceEntry window is that OpenWindow is a function on that window, not a procedure. So if you look in PopInquiryInvoiceEntry.Functions you will find OpenWindow. Hope that helps!Anonymousnoreply@blogger.com