Once again doing battle with Dex ranges.
Kept getting the dreaded "error get/change table...a record was already locked" during a loop on a table with a range where clause. After the change next on the second record in the range, I got the error when a subsequent change next in the loop tried to read the third record.
The problem was that I did nothing with the second record (by design). Apparently every record read with a change statement must be saved or removed, or its lock sticks around.
When I added a save command for every record in the loop, the error went away.
I'm a former Microsoft Dynamics consultant specializing in SQL Server integrations, reports and customizations.
Wednesday, November 20, 2019
Tuesday, November 19, 2019
Dynamics GP Payables Apply malfunction
We had a payables payment in GP go bad on us. When it was put in as a Manual Payment, some documents were applied. This was a partial application, and amounts were still outstanding.
However, some of these applied documents got disconnected and were left behind. They showed on the Apply Payables Documents window as being selected with the manual payment, but we could not make any further selections. When we tried, GP kicked the following error:
"This invoice already is partially applied to this credit document. You cannot apply this document until the other apply amount is posted."
hmmmm....
Found some references to the GP apply tables. Specifically PM10201, the Payment Apply To Work File, and PM20100, the Apply To OPEN OPEN Temporary File.
For some reason these tables have a LOT of junk in them. I'm not even sure where to start cleaning them out.
But for purposes of resolving the current problem, I found a bad payment referenced in PM10201 for the invoices that were showing as applied. I deleted those records and removed the related records from PM20100. When I ran CheckLinks, the apply documents moved to history and all appears back to normal.
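A sketch of that cleanup in T-SQL. The payment number is made up, and the VCHRNMBR column name is my assumption based on the standard PM apply tables; inspect the rows and verify the names against your GP version (and take a backup) before deleting anything.

```sql
-- Inspect the suspect apply records first (VCHRNMBR and the payment number
-- below are assumptions; verify against your own tables before any DELETE).
SELECT * FROM PM10201 WHERE VCHRNMBR = 'PYMT00012345';
SELECT * FROM PM20100 WHERE VCHRNMBR = 'PYMT00012345';

-- Then remove them and let CheckLinks tidy up the rest.
DELETE FROM PM10201 WHERE VCHRNMBR = 'PYMT00012345';
DELETE FROM PM20100 WHERE VCHRNMBR = 'PYMT00012345';
```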
Some day there will be a reckoning with those two tables. It's just a mess.
Tuesday, August 27, 2019
Dynamics GP Attachments - 2 strange animals
We have been jumping into document attachment in GP. Along the way we encountered some odd issues. Part of this oddness comes from the fact that there are actually 2 separate document attachment processes in GP - one for the main GP dictionary and one for the HR dictionary.
As we have been doing most attachments in the HR/PR area, I've quickly realized that this one is a bit under-documented. For example, there are plenty of citations of the main dictionary attachment tables, which range from CO00101 to CO00105. However, I can find few references to the HR attachment tables.
By using GP's Table Finder (on any GP window click Tools > Table Finder) you can find all kinds of juicy info about what you are looking at.
Notice that the box in the bottom left portion of the window shows the tables involved in this window. Also notice the tables for this window (HR Attachments) are HR tables. Furthermore, the "Product" is indicated to be Human Resources (HR). Normally I use the TableList.xls document in the GP SDK to find tables, but these are not listed there. Oddly, these tables are all in the DYNAMICS database.
HR_AttchRef (HRREF003)
HR_AttachMSTR (HRATT001)
HR_AttachStatus (HRSTA002)
HRAttachmentItems
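Since these aren't in TableList.xls, a quick sanity check is to query the DYNAMICS database directly for the physical names from the list above:

```sql
-- Confirm the HR attachment tables live in the DYNAMICS (system) database
SELECT name
FROM DYNAMICS.sys.tables
WHERE name IN ('HRREF003', 'HRATT001', 'HRSTA002', 'HRAttachmentItems');
```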
So the Product being HR is a big clue that there are two separate processes. This is due to the dictionary architecture of GP and the fact that HR is still a separate dictionary. While I understand this was the easy way to do it, I wonder how hard it would have been to stick with just one attachment process in the main dictionary.
This has interesting implications. For example, in GP under Administration > Setup > Company, the Document Attach Setup window has some attachment-related options. However, these likely apply only to the main dictionary attachments. We've not seen any of those options affect attachments in the HR area.
Probably the worst feature of Doc Attach in GP is removal. It is virtually impossible to permanently delete a single attachment. When you delete from the Attach window, the attachment moves to the Delete tab. From there it can only be removed using the Delete Attachment Files utility, which is different for the main dictionary and HR. Further, that utility only allows removal with a date cutoff.
So it's likely my next enhancement to GP will be to allow removal of individual attachments.
Friday, August 23, 2019
Using Hashbytes in ETL
We use Solver's BI360 for our internal data warehousing. It works fairly well for us and is a nice hybrid solution to DW. This is not a review.
As background, I was trained by Solver techs back in 2016 to configure the ETL that moves data into the BI360 DW. They had pre-built SQL Server Integration Services (SSIS) ETL packages for our ERP solution (Dynamics GP), and we modified them to work in our environment.
One key element of the ETL was the use of the SQL BINARY_CHECKSUM (BC) function to detect changes in data. It is not practical to compare every column in the DW to all the columns in the source data so it makes sense to hash them and compare the hashes instead.
As a Google search will quickly show, BC is not wholly reliable in detecting certain types of changes. For example, I received an email from a user asking why an order line was not updated in our DW reports. The order line was originally entered as quantity 1 at $2,900 each and was changed to quantity 10 at $290 each. This change did not propagate to the DW, and I traced it to the BC function. BC clearly did not see the difference between these two values.
This was SERIOUSLY not acceptable so I turned again to Google to find alternatives. While there may be others I settled on the HASHBYTES (HB) function. It identified the difference above that BC did not.
It is common to use the CONCAT function with HB to build the hash value. So I listed the columns inside CONCAT and used that as the input parameter to HB. Further exploration of the use of HB revealed that blank column values might not be detected in HB. For this reason it was recommended to use a field separator value - I chose | for this purpose.
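Here's the shape of the hash I ended up with (the table and column list are illustrative; use your own). CONCAT implicitly converts each column to a string, and the | separators keep blank or shifted values from producing the same concatenated input:

```sql
-- Hash the business columns with '|' separators; compare the source hash to
-- the target hash to decide whether a row changed.
SELECT SOPNUMBE,
       HASHBYTES('SHA2_256',
           CONCAT(QUANTITY, '|', UNITPRCE, '|', XTNDPRCE)) AS RowHash
FROM dbo.SourceOrderLines;  -- illustrative source; list every column you compare
```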
There was a side benefit to using this construct. I suppose I could have tried to use CONCAT inside BC and seen if the results were different, but I didn't. CONCAT has the benefit of letting me easily compare the source and target strings. This is important in determining whether I've built the comparison correctly, and it was very difficult to track down with how I was using BC.
So now when my updates are not completing, I can grab the source and target CONCAT strings and compare them to find where I've messed up. This has worked great so far. All my ETL is now converted to HB, and BC is no more.
Monday, June 24, 2019
eConnect error: Unable to update the existing SOP Header record
I was receiving said error from my Dex enhancement, which uses eConnect to update SOP transaction data from my custom window. The above error, along with the 2218 error code, indicates ABSOLUTELY NOTHING!
Here is the trick to solving this. Change the transaction from an update to an insert by way of the UpdateExisting flag.
My original transaction was an Update, and eConnect simply replied "No". Changing this to an Insert yielded a message to the effect that the salesperson was inactive. Activating the salesperson allowed the update to proceed just fine.
Interestingly this only stopped the SOP update via eConnect. When changing the data inside SOP Transaction Entry it saved just fine.
Why does an inactive salesperson stop an eConnect update of SOP but not an update via GP?
*** Update 1/29/2020
We found another situation where this can happen. In our new case the above fix DID NOT show us what the problem was. Only by changing various eConnect inputs did I find the cause.
Apparently there was a problem with sales tax on the order. I'm still trying to home in on the exact problem, but here's the gist.
There is a flag in eConnect called CREATETAXES. My enhancement sets this value to 0 unless the user changes a value related to sales tax. In this case no sales tax amounts were being updated, so the value passed remained 0. However, for some reason (to be determined) GP REQUIRED this flag to be set for this order. When I manually overrode this flag, the error ceased and the sales tax on my order was adjusted up by $.02.
Checking with my order entry person, I found she had manually updated the sales tax via the Sales Transaction Entry window to override the sales tax amount (reducing it by $.02) prior to getting this error.
Regardless, after my manual override we could not recreate the error, so it remains a mystery, but at least we have a solution.
I will keep working to identify exactly what caused this error, but this instance showed that it can occur for reasons that can't easily be discerned, and only hard work and elbow grease will track down the cause.
Tuesday, May 14, 2019
Dexterity "get" statement is lonely
If you've read previous blog entries here you'll know I've had problems with ranges. I'm sticking with the "range where" construct for the foreseeable future because of those problems.
I have a window where the user provides an ID field (a string) to adjust some cost data. When they provide the ID and process the first result, I clear the window so they can enter the next ID. Dex promptly kicked an error 18 - Missing.
Turns out what was missing was not the data, but a "first" in my "get" clause. Because the range included multiple results, "get" cannot be used on its own. Once I provided "first", all was well.
Wednesday, May 8, 2019
Stored Procedure no results in Dex - results from SQL command line
I was developing code in Dexterity that involves the use of a stored procedure. Plenty of examples of this exist, but I was banging my head trying to figure out why the code worked and provided results when I ran it from a SQL query window, but returned no results to Dexterity.
I should point out that no error was ever returned, by Dex or by SQL!
Here was the problem in a nutshell: I was not providing values for all fields in the Dex table I was filling from the stored procedure. Once I filled all table fields in the stored procedure, it worked in Dex.
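The fix, sketched (the proc name, destination columns, and source table here are stand-ins for my custom ones): the SELECT feeding the Dex table has to supply a value for every column the Dex table defines, even ones you don't care about.

```sql
-- Hypothetical fill proc: every column of the destination Dex table gets an
-- explicit value; leaving any out silently returned nothing to Dexterity.
CREATE PROCEDURE dbo.csp_FillCostData
AS
BEGIN
    SET NOCOUNT ON;
    SELECT ITEMNMBR,
           ITEMDESC,
           STNDCOST,
           CURRCOST,
           NoteIndex = 0.0   -- even unused columns need a value
    FROM dbo.IV00101;
END
```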
Lesson for future reference.
Monday, May 6, 2019
Dynamics GP Payroll Integration to Payables
We just upgraded to GP 2018 R2. Fairly uneventful, except for the nooks and crannies (always gotta watch for those). One area that blindsided us was Payroll Integration to Payables (PIP).
In our previous GP version (we came from GP 2015 R2), PIP worked for computer checks (of course), but also for manual checks. As we processed payroll in the new version it became increasingly clear that PIP no longer worked for manual checks. WTF?
There is not a lot about PIP online, but I found a reference in the GP docs about manual checks not being supported. I don't know if that's new and they took the feature away or what, but it sure doesn't sit right.
Well, being the happy Dex developer I am this just means more work for me. So off to school we go trying to figure out how to make PIP work for manual checks.
Don't get me wrong, PIP is still an external dictionary and would be no fun to hack. I'm going to roll my own code to make this happen. And that starts with a whole lot of reverse engineering.
The first issue is: where's the data? Well, the GP SDK won't help as PIP is in a separate dictionary. Opening that DIC file in Dex shows the following tables:
PIP00400 contains the vendors and the next PIP number that will be assigned to PIP transactions.
PIP40100 stores the setup information from the Payroll Vendor Setup window.
I am focusing on PIP40100 now, as it contains a reference to the payroll history in UPR30300 that will contain the manual checks I need to post. In that table is a column entitled APR_PR_VendorRecordType, which I have deduced contains the following values:
1=Fed Tax
2=State Tax
3=Local Tax
4=Company Deduct
5=Employee Deduct
6=Company Benefit
7=Employee Benefit
Those will point me to the PYRLRTYP values in UPR30300. It looks like data that posts to payables is summarized by check date. I'll have more information as I get deeper into the process but wanted to seed this in case someone else was curious.
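A hypothetical query using the record-type values above. I'm assuming PIP40100 keys the vendor by the standard VENDORID column; verify the actual column names against the table before relying on this.

```sql
-- Which vendor is set up to receive the Federal Tax liability?
-- (VENDORID is an assumption; APR_PR_VendorRecordType is from the table itself.)
SELECT VENDORID, APR_PR_VendorRecordType
FROM dbo.PIP40100
WHERE APR_PR_VendorRecordType = 1;  -- 1 = Fed Tax per the list above
```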
Thursday, March 21, 2019
Using eConnect from Dexterity - Part 3
I forgot I promised a sequel for this series regarding transactions.
I've done many of these and most include quite a bit of logic. Some have intense logic sequences, depending on the data source. I'll explain.
Transactions usually involve some sort of "Document Number". This could be a JE #, a PO #, a SO #, or some other similar thing. What's important is this is usually a unique identifier and cannot be repeated.
I mentioned in Part 2 that there are "next number" connection points in eConnect, and these might not be documented in the SDK (but can be found as "ta" stored procedures in a GP company database.) You can use those to obtain a next number, which is typically the first step required in adding a transaction (after the batch, that is.)
The next steps will depend on the fact that most transactions in GP involve detail lines and related headers. Sometimes, as certain modules require, there may be multiple types of details required. The obvious example here is if you want to supply your own GL distribution instead of the defaults from GP.
This points out another nice feature of eConnect. It will default supporting details for your transaction just like GP would if a user was doing the entry. So if you want GP to default the GL distribution you typically don't need to take any extra steps. Contrast that with going direct to tables and eConnect wins every time!
But if you do want to override the defaults and provide your own data, you typically need to flag the header to NOT default the details you provide (or may have already provided). I have found these flags clearly indicated in the SDK for the header, if applicable.
The sequence following getting the next number will involve creating all the supporting detail records first, then creating the header. This typically involves WHILE loops and keys that support the ordering of transactions so you can insert the details, and when some key field changes, insert the header and get the next number.
I will admit sometimes the logic required to get the steps right in Dex can be involved. And it's important to step into it and proceed with suitable test data.
And furthermore, I would suggest testing your inserts from Query Analyzer in SQL Server. (You are using a test server, right?) This will tell you that your handling of the eConnect calls works before you start with Dexterity.
There's another reason I like using Dexterity for this: Because I don't like to use cursors in SQL Server. Actually I despise them, so doing the work programmatically in Dex is better for my soul! I have also found that this process happens FAST so user wait times are negligible.
And there is no way you can start on this if you don't understand what the UI requires, and what the back-end involves. So you will need to obtain plenty of sample transactions to process, run these through the GP UI, and observe the results in the tables when saved before you even start coding. Then you will be able to approach your integration with the understanding of where you are heading.
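For example, a simple pair of snapshot queries to run while keying a sample order through the GP UI (the SOP work header and line tables are shown; substitute the tables for your module):

```sql
-- Observe what the UI wrote before trying to reproduce it with eConnect
SELECT SOPNUMBE, SOPTYPE, DOCDATE, DOCAMNT
FROM SOP10100                     -- SOP work headers
ORDER BY SOPNUMBE;

SELECT SOPNUMBE, LNITMSEQ, ITEMNMBR, QUANTITY
FROM SOP10200                     -- SOP work lines
ORDER BY SOPNUMBE, LNITMSEQ;
```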
I hope I've laid the groundwork for a simple transaction example, which I'll do as the next installment in this series.
And again your comments will encourage me to keep up with this so let's hear it!
Tuesday, March 19, 2019
Dexterity custom windows not showing up in Quick Links
I have been rolling out new Dex windows to simplify Dynamics GP for our users. In doing so I would like to allow users to add my custom windows to Quick Links on their Home Page.
I tried this with a user and could not find a recent custom window I added. This is done through the "Modify" (pencil) icon at the top of the Quick Links window on the Home Page. That lists your current Links and has an Add button/drop list that you can use to add new windows to the Quick Links list via the Add Command window.
Looking deeper into this, I could find only some of my custom windows in the Add Command list. In addition, some of these were located in places that were not correct (for example, an AP Transaction window under GL Utilities).
This made no sense at all. Time to pop the hood!
Digging even deeper I found, via the Tools > Table Finder window, that the likely culprit is the Dynamics SY07110 (syMenuMstr) table. Script debugging and DexSQL logging around the Quick Links Add function showed this table being hit extensively.
Looking at SY07110 it's clear this table has pointers to dictionaries and forms. I found the ID of my dictionary and noticed only 11 entries in this table. Believe me I've got a few more than 11 windows in my enhancement!
Googling this table I found nothing definitive but several references to removing records from that table and allowing GP to add them again. WTH!
I did this in my Dev box, removed all the entries in SY07110 corresponding to my dictionary ID, and VOILA! Problem solved!
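What the fix amounted to, roughly. The dictionary-ID column name and the ID value here are my best recollection, not gospel; check the actual schema (sp_help 'SY07110') and back the table up before running anything like this, and do it in a dev box first.

```sql
-- Back up first, then remove this dictionary's menu records so GP rebuilds them
SELECT * INTO dbo.SY07110_backup FROM DYNAMICS.dbo.SY07110;

DELETE FROM DYNAMICS.dbo.SY07110
WHERE CmdDictID = 3830;  -- hypothetical custom dictionary/product ID
```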
Now all my windows are listed in the proper place when I try to add to Quick Links.
YAY!
Hoping this helps someone else down the line.
Monday, March 11, 2019
Insert line in Dexterity scrolling window
I'm still warming up to Dexterity scrolling windows. Combined with table buffers the two have given me fits for a while. While filling and deploying scrolling windows seems to work okay, I've had to resort to using RANGE WHERE to load the scrolling window because of problems I've had with this feature.
In attempting to develop a replacement SOP Order Entry window I ran into the requirement for numbering and ordering the data in the scrolling window. This turns out to be a lot harder than it looks!
I always rely on the sample code provided in the manuals. In this case the Programmer's Guide Volume 2 explains the use of scrolling windows and provides sample scripts for reference. However I simply could not get the sample scripts to work.
Back against the wall I had to roll my own. Turns out it's not so hard if you know what you're doing (and I'm getting there.)
Here is the code from the PG v2 p104:
local currency temp;
if not empty('Sequence Number' of table Invoice_Data) then
	{Store the sequence number of the current record.}
	temp = 'Sequence Number' of table Invoice_Data;
	{Read the sequence number of the previous record.}
	get prev table Invoice_Data;
	if err() <> EOF then
		temp = temp + 'Sequence Number' of table Invoice_Data;
	end if;
	temp = temp / 2;
	clear table Invoice_Data;
	{Save the new record in the table.}
	'Invoice Number' of table Invoice_Data = 'Invoice Number';
	'Sequence Number' of table Invoice_Data = temp;
	save table Invoice_Data;
end if;
The odd statement to me is the line that reads get prev. I know what this statement does in the sample code - it backs up to the previous line to get that line's sequence number, then adds it to the sequence of the line we are to insert before and divides the sum by 2. That should put us right in the middle of the two numbers for the new line, right?
Except it does not. Seriously?! I certainly could be missing something but this flat-out does not work. I get error 16, end-of-file. WTF?!
Again, scrolling windows and table buffers are not my forte, so I cooked up a Plan B that seems to work well.
Somewhere in that user guide it says that form-level procedures use their own table buffers. This is key as it avoids conflicts with the scrolling window table buffer. So I can do things in form scripts that avoid messing with my scrolling window table, which can earn you funky results in the window (and I HAVE seen FUNKY results.)
I found that if I use a form-level function and pass it the key of the current scrolling line (which includes the sequence number I'm trying to split) I can return the split sequence number to the Insert script so the new line gets placed appropriately.
In pseudo code here's how the form function will work:
- Set the appropriate range on the scrolling window (for the key provided.)
- Read the sequence numbers in order, saving each one as I move to the next row.
- When I reach the sequence number provided, stop and average it with the saved number.
The only trick is if the first row's sequence is the one passed in. In that case simply halve it and pass that back. Here is my version of this code (works so far):
function returns currency nextsequence;
in string ordernumber;
in currency currentsequence;
local currency calcseq;

range clear table SOPLineTemp;
clear table SOPLineTemp;
'SOP Number' of table SOPLineTemp = ordernumber;
'User ID' of table SOPLineTemp = 'User ID' of globals;
range start table SOPLineTemp by SOPLineTemp_Sequence;
fill table SOPLineTemp;
'SOP Number' of table SOPLineTemp = ordernumber;
'User ID' of table SOPLineTemp = 'User ID' of globals;
range end table SOPLineTemp by SOPLineTemp_Sequence;

get first table SOPLineTemp by SOPLineTemp_Sequence;
calcseq = 'Line Item Sequence' of table SOPLineTemp;
if calcseq = currentsequence then
	calcseq = calcseq / 2;
end if;
while err() = OKAY and 'Line Item Sequence' of table SOPLineTemp <= currentsequence do
	if 'Line Item Sequence' of table SOPLineTemp = currentsequence then
		calcseq = ('Line Item Sequence' of table SOPLineTemp + calcseq) / 2;
	else
		calcseq = 'Line Item Sequence' of table SOPLineTemp;
	end if;
	get next table SOPLineTemp by SOPLineTemp_Sequence;
end while;
nextsequence = calcseq;

Notice I'm not using the GP SOP Detail Item table. I don't want to be subject to orphan detail lines from this window, so I do the work to bring the existing detail in, then write it back and update SOP Detail. Quite a bit of extra work, but it provides true abstraction and allows the user to clear their changes without the dreaded "Save or Delete" do-or-die option (that SOP users know all too well.)
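To sanity-check the split logic, here is a simplified Python sketch of what the form function is computing (hypothetical function and names, not the Dexterity code itself): walk the sequence numbers in order, remember the previous one, and when you hit the target either halve it (first row) or average it with its predecessor.

```python
def next_sequence(sequences, current):
    """Return a sequence number that sorts just ahead of `current`.

    Walk the line sequence numbers in order, remembering the previous
    one; when `current` is found, halve it if it is the first row,
    otherwise average it with its predecessor.
    """
    prev = None
    for seq in sorted(sequences):
        if seq == current:
            return current / 2 if prev is None else (prev + current) / 2
        prev = seq
    raise ValueError("sequence not found in range")

# GP spaces SOP line sequences 16384 apart, so splits stay clean for a while.
print(next_sequence([16384, 32768, 49152], 32768))  # midpoint of 16384 and 32768
print(next_sequence([16384, 32768, 49152], 16384))  # first row: halved
```

Repeated splits between the same two lines will eventually produce fractional sequences, but the currency type gives plenty of headroom for normal order entry.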
I would also like to take a minute to point out SOME of the MAJOR flaws in SOP entry that my window avoids:
- Zero master numbers and note ID's.
- Disparate tax, salesperson and ship-to codes (none of which differ EVER in our orders.)
- Data scattered across 5 different windows (I have placed all our fields on a single window.)
Also I make good use of eConnect to handle ALL my SOP inserts and updates. And we have a number of custom fields that reside in a side-car table (and rely on the master number so avoiding the zero master number is a BIG PLUS!)
One last thing. In the Line Insert script I had to use a local variable to contain the result of this function. I then assign the local variable to the row being inserted. For whatever reason this had to be done or it simply would not work.
I hope this helps your coding efforts. If you poke any holes in this please let me know in the comments. This is going to user testing tomorrow, but I think I've beat on it pretty hard so far.
Tuesday, January 8, 2019
Dynamics GP 2015R2 to 2018R2 upgrade - Upgrade!
This is part 3 of my journey to upgrade GP.
Let the fun begin! It is time to do the base upgrade. We all know how that starts right?
GP Utilities.
As SA, of course.
As this was an existing Dynamics installation upgrade I just had to click a few Next buttons right?
Not so fast!
I didn't even get past the Dynamics database upgrade when I received the nasty message:
"The stored procedure SynchronizeTableData() of form duSQLAccountSynch : 27Pass Through SQL returned the following results: DBMS: 2601, Microsoft Dynamics GP: 0."
Turns out I'm not alone with that one. Google will return mucho resultso, but the only one you need is here. This one goes back quite a few versions, but I don't recall actually seeing it before. It's apparently not common, but if you modify the right SmartList you get bit (Payroll > Employee?).
The problem lies in the Dynamics table ASIEXP86. Long story short: back up that table, clear it, then restart the upgrade. Do that after restoring the Dynamics database, of course.
That got me past it and into the company upgrades. As we have 7 companies, I watched for a while and made sure our primary company database upgraded, then disconnected from the server to let it run to completion.
As that was past quitting time I jumped back in this morning to see how things went. Oddly when I logged in there was no GP Utilities session waiting for me to click the Finish button. WTF right?
A check of the Windows logs pointed me to the fellow in IT that pulled a restart on me. Claims he didn't see anyone logged on. False!
Ah well, bygones be bygones right? It's been a while since I had to restart an upgrade and since 3 companies had already completed I just had to clear the Dynamics..DULCK table and restart Utilities to finish the job.
All went well until the last company finished then the GP Utilities window showed blank. Checking the Task Manager revealed an odd pattern of CPU utilization by the session, indicating that the process was still running - but the blank window was odd.
As I left to come and start this blog I was interrupted regarding a report issue. As I was rendering assistance I noticed a MAJOR clue - I had folders in SSRS for all my companies. AHA!!!
I had failed to clear the SSRS setting in the Dynamics database after I had restored it following the first error. That was in my script but I'd overlooked it on the restore.
Returning to my GP Utilities session revealed the nasty GP Utilities window telling me my "Business Intelligence reports were not deployed." - Buzz off!
I've got some cleanup to do in my SSRS folders. We don't bother with GP's standard SSRS reports and this is something that's plagued me for years (none of my clients in my former life as a consultant bothered with them either.) GP insists on loading these "BI" reports if there is a reference to an SSRS instance. I think it's time for a suggestion to make this an optional step instead of a default process.
Of course the window is asking me to Retry or Cancel. I don't want to Retry but Cancel just allows it to keep going - trying to load SSRS reports I don't want. So it's "Hammer Time" on the program - End Task!
I hate to do that but they leave me no choice.
I was then able to start GP and get my first hands-on with GP 2018. I see that I've got a new step in my upgrade process right off-the-bat: get rid of "Intelligent Cloud Insights". Sorry but I don't care!
Another buzz-off!
Dynamics GP 2015R2 to 2018R2 upgrade - Installation
This is the second blog post related to our upgrade of Dynamics GP.
So once IT handed off the server I logged in and immediately installed my required components from ninite.com. (If you haven't used this service why not?) My typical load is:
7-Zip
Notepad++
PDF reader - our company standard is FoxIt, though I've used PDFCreator in the past.
Classic Shell
Apparently IT did not load Chrome so I'll be doing that also.
After that comes the SQL installation. Just a basic Engine install, but we need the Agent also. We do not update domain service accounts with each install, so I stuck with those. We do allocate a separate drive for backups so I configured for that, as well as the other standard choices for GP (mixed mode, DOCI, etc.)
Then comes SQL Management Studio. That is now published separately by Microsoft, so I downloaded the latest version.
And on to the standard GP 2018R2 installer. We follow the best practice of installing the base version first, then updating to the latest SP, and including all required features. And we do this without any 3rd parties or customizations.
Finally I copied over our GP company databases and restored and readied them for the upgrade. The two important steps there are to set all databases to Simple Recovery, and bring all to SQL 2016 compatibility.
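Those two restore-prep steps are one ALTER DATABASE statement apiece per database. Here's a small Python sketch (hypothetical helper and database names) that generates the T-SQL; compatibility level 130 corresponds to SQL Server 2016.

```python
def restore_prep_sql(db_names, compat_level=130):
    """Generate the post-restore T-SQL for each GP database:
    simple recovery model plus the SQL Server 2016 (130)
    compatibility level."""
    stmts = []
    for db in db_names:
        stmts.append(f"ALTER DATABASE [{db}] SET RECOVERY SIMPLE;")
        stmts.append(f"ALTER DATABASE [{db}] SET COMPATIBILITY_LEVEL = {compat_level};")
    return stmts

# "TWO" here stands in for a company database name.
for stmt in restore_prep_sql(["DYNAMICS", "TWO"]):
    print(stmt)
```

Run the generated statements against each restored database before launching GP Utilities; remember to flip recovery back to Full (and restart log backups) once the upgrade is done.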
Next up is the upgrade itself.
Dynamics GP 2015R2 to 2018R2 upgrade - Background
It's a new year and time for a GP upgrade!
We are payroll users here so we have to keep the code base on the latest version. Plus I needed something meaty to do. So I got the boss's OK and petitioned IT for the new server, and it's off to the races!
Well, we have a lot of moving parts, so maybe not so fast as a race. More like a plodding timeline.
So the path is from GP 2015R2 on Server and SQL 2012 to 2018R2 on Server and SQL 2016. That should keep us for a few years.
Our last upgrade was in October of 2015, from GP 2010 to 2015 R2. It went fairly well, but we had some difficulties with significant changes to our customization at the same time. This time there will be no dramatic changes, so all should proceed smoothly (I hope.)
IMPORTANT: No upgrade should be started without a detailed plan. I'm lacking the prior plan - flat-out don't recall what I did with it. So a new plan is on the table, or in Excel I suppose. And as I created a tab for every major element of the upgrade (OS, GP, 3rd parties, etc.) I've already got 10 tabs going!
Step 1 of any upgrade or update is create the plan!
Step 2 was to gather all the components of the new version. This includes all the GP 2018 R2 software, as well as documentation, third party software, and registration keys.
And yes we pay maintenance on all our software so there's that.
All the subsequent steps will be detailed in future blog posts.
Whooppeee!