Fixed Assets - Upgraded Projection Reports Using SSRS

A quick walkthrough of how to use SSRS to deploy improved Fixed Asset Projection reports in Dynamics GP 2018. The report versions include a month-over-month projection as well as a summary/detail projection listing by G/L account.


SmartList Builder - Develop & Deploy Reports

eOne Solutions' SmartList Builder is the counterpart to the SmartList Designer product that Microsoft now offers with the out of the box Dynamics install.  SmartList Designer has been around a little longer, so it was the focus of our earlier post.  Everything done in this particular video can also be accomplished using SmartList Designer in a nearly identical sequence.

Please check out our video post on YouTube and <SUBSCRIBE> to our channel for notifications on future posts.

Also, if you have feedback on our videos or have a topic in mind that you'd like for us to cover, please let us know in the comments below!

Automate Unit Account Values Based on Sales Quantities

Reporting on unit account values is a typical KPI request.  Say we're looking at a profit and loss report by product, and the business wants to see the units sold behind that margin.

In a perfect world (and using some tools not named Management Reporter), if you're capturing Product A quantities in the Sales Order Processing (SOP) module in GP, you'd be able to reference the sum of those sales directly instead of using a unit account.  However, we don't live in a perfect world while using Management Reporter, so unit accounts are required.

So in GP, we'd set up a unit account for something like "Units Sold - Product A".

In Management Reporter, we'd then have a row definition that looks something like this:

Sales Product A

- Expenses Product A

= Margin

/ Units Sold - Product A

= Margin/Unit
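For example, if Sales Product A is $500,000, Expenses Product A is $400,000, and Units Sold - Product A is 20,000, those rows resolve to a Margin of $100,000 and a Margin/Unit of $5.00.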

Typically, in order for this report to stay up to date, the accounting team would need to make a routine journal entry to the unit account within GP.  This process leaves room for manual errors and timing issues, and it risks management losing confidence in their reports.

Using the method below, we completely streamline this process by automating the unit account journal entry within GP.

Part 1: Associate a unit account with an inventory item in GP.

Note: For our example deliverable, one item may be used at multiple locations (segment 1 of the COA).  Therefore, we just assign the natural account to the item and let the integration define the remaining COA segments.

Approach: We accomplish this using eOne Solutions' Extender product.

Part 2: Setup integration that captures posted activity in GP & creates journal entry in GP.

Note: For our example deliverable, we pull the prior day's activity and create the journal entry first thing in the morning.

Note: The integration script defines the COA segments based on the SITE of the sale on the SOP invoice and the natural account that we associated with the item.

Approach: We accomplish this using eOne Solutions' SmartConnect product.

Map/Integration Setup

ODBC Connection to Company Database - Selecting Prior Day Sales
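For reference, the ODBC connection above is ultimately just a SQL query against the GP company database.  A minimal sketch of the kind of prior-day sales query involved is below; it assumes the posted SOP history tables (SOP30200 headers, SOP30300 lines) and that SOPTYPE 3 = invoice and SOPTYPE 4 = return, so adjust tables, columns, and date logic to your own environment and unit-of-measure setup.

-- Sketch: prior day's posted SOP quantities, summarized by site and item
SELECT  l.LOCNCODE  AS SiteID,        -- site drives COA segment 1 in the journal entry
        l.ITEMNMBR  AS ItemNumber,    -- item is tied to its natural (unit) account via Extender
        SUM(CASE WHEN h.SOPTYPE = 3 THEN  l.QUANTITY    -- invoices add
                 WHEN h.SOPTYPE = 4 THEN -l.QUANTITY    -- returns subtract
                 ELSE 0 END) AS UnitsSold
FROM    SOP30300 l                    -- posted SOP lines
JOIN    SOP30200 h                    -- posted SOP headers
        ON h.SOPNUMBE = l.SOPNUMBE AND h.SOPTYPE = l.SOPTYPE
WHERE   h.SOPTYPE IN (3, 4)
  AND   h.DOCDATE = CAST(DATEADD(day, -1, GETDATE()) AS date)   -- prior day's activity
  -- if you sell in multiple units of measure, convert quantities to the base U of M here
GROUP BY l.LOCNCODE, l.ITEMNMBR

Depending on how you reconcile, you may prefer to filter on the GL posting date rather than the document date so the journal entry ties to the same period as your reconciling report.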

Part 3: Enable the map/integration schedule.  This is out-of-the-box functionality in SmartConnect.

Part 4: Create reconciling reports using SmartList/SmartView.  These reports can be used at month-end to reconcile financials.

Note: This report is best built from a Sales Line Item report, filtered on the posting date, and should ensure that returns are shown as negatives.

Part 5: Notify the accounting team that all they need to do moving forward is post their batches and reconcile once at month-end!

Integrated transactions in GP would look similar to...

Note: All prior day activity is rolled into one journal line.  If desired, it could instead be separated into a line for each sales transaction.

A few other notes to consider in this process...

- Returns must be accounted for

- When deploying, it is recommended to have the accounting team reconcile more frequently at first.  As they gain an understanding of the process and confirm that the numbers tie out, they can reconcile less frequently.

- Failed integrations should auto-notify an integration PowerUser, who can re-run the integration if/when needed

This solution is easy to implement and maintain, and it quickly accomplishes the goals of saving time, keeping data accurate, and providing a more robust GP system.

Thinking about implementing for your business and looking for additional details?  Already implemented and have feedback?  Let us know in the comments below!

Copying Quick Links from One User to Another

Microsoft has come a decent way toward letting each user customize their home login page.  For a long time now, the first time a user connects they are prompted to select a "Role" that determines how their out-of-the-box homepage will look and feel. 

With later versions of GP came the ability to copy a user's homepage settings from one user to another (along with every other setting from the original user).

However, Microsoft still hasn't made it simple to take an existing user's quick links and apply them to another user.  From the interface, you'd have to copy ALL user settings from one user to another using the "Users Maintenance" window, or log in as the target user and set their quick links, or provide them with instructions on setting the links themselves. 

NO MORE! 

Using the script below, you can define a copy-from user (TestUser1) and a copy-to user (TestUser2) and either replace the existing quick links or append to them.

--Declare Variables
DECLARE @CopyFromUserID CHAR(15), @CopyToUserID CHAR(15), @Count INT, @Delete CHAR(1)

--Provide Values for Variables
--This is the section that MUST BE UPDATED
--@Delete = 'Y' means delete any existing quick links for the copy-to user; 'N' = append only
SELECT @CopyFromUserID = 'TestUser1', @CopyToUserID = 'TestUser2', @Delete = 'N'

--This statement deletes the copy-to user's existing quick links if @Delete = 'Y'
DELETE FROM dbo.SY08140 WHERE USERID = @CopyToUserID AND @Delete = 'Y'

--Start copy from one user to another
--Stage the copy-from user's quick links that the copy-to user doesn't already have
--(only if the copy-to user actually exists in SY01400)
SELECT * INTO #TempQuickLinks
FROM dbo.SY08140
WHERE USERID = @CopyFromUserID
  AND DSPLNAME NOT IN (SELECT DSPLNAME FROM dbo.SY08140 WHERE USERID = @CopyToUserID)
  AND @CopyToUserID IN (SELECT USERID FROM dbo.SY01400)

SET @Count = 1

WHILE @Count <= (SELECT COUNT(SEQNUMBR) FROM #TempQuickLinks)
BEGIN
    --Assign the next available sequence number for the copy-to user
    UPDATE #TempQuickLinks
    SET SEQNUMBR = ISNULL((SELECT MAX(SEQNUMBR) + 1 FROM dbo.SY08140 WHERE USERID = @CopyToUserID), 1)

    --Insert one staged quick link that the copy-to user doesn't have yet
    INSERT INTO dbo.SY08140 (USERID, SEQNUMBR, TYPEID, CmdID, CmdFormID, CmdDictID, DSPLNAME,
        ScbTargetStringOne, ScbTargetStringTwo, ScbTargetStringThree,
        ScbTargetLongOne, ScbTargetLongTwo, ScbTargetLongThree, ScbTargetLongFour, ScbTargetLongFive)
    SELECT TOP 1 @CopyToUserID, SEQNUMBR, TYPEID, CmdID, CmdFormID, CmdDictID, DSPLNAME,
        ScbTargetStringOne, ScbTargetStringTwo, ScbTargetStringThree,
        ScbTargetLongOne, ScbTargetLongTwo, ScbTargetLongThree, ScbTargetLongFour, ScbTargetLongFive
    FROM #TempQuickLinks
    WHERE DSPLNAME NOT IN (SELECT DSPLNAME FROM dbo.SY08140 WHERE USERID = @CopyToUserID)

    SET @Count = @Count + 1
END

DROP TABLE #TempQuickLinks
--End copy from one user to another

--View ending results
SELECT * FROM dbo.SY08140 WHERE USERID = @CopyToUserID

Note: The script has only been used in limited GP 2015 R2 environments.  As always, it is recommended to run the script in a development environment before deploying to production.
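If you'd like to preview exactly which quick links would be copied before running the full script, a simple read-only check against the same table looks like this:

-- Preview: quick links TestUser1 has that TestUser2 does not (nothing is modified)
SELECT f.SEQNUMBR, f.DSPLNAME
FROM dbo.SY08140 f
WHERE f.USERID = 'TestUser1'
  AND f.DSPLNAME NOT IN (SELECT DSPLNAME FROM dbo.SY08140 WHERE USERID = 'TestUser2')
ORDER BY f.SEQNUMBR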

This script might be useful when...
- Setting up a new user and wanting to set up their quick links based on another user in a similar role
- Setting up a new module and wanting to add quick links for all users based on your model user (say, when setting up Fixed Assets and wanting to add an external link to the new GAAP financial lease requirements)
- Setting up a new SmartView favorite for certain users and also wanting to add the quick link to their homepage

Have feedback, or have you expanded the use of this script?  Let us know!

Company Address IDs - Import

Did you know that within Purchase Order Processing, the list of available ship-to addresses is pulled from the Company Address ID setup within GP?  During a POP/Inventory/SOP implementation, it's not uncommon for companies to have hundreds of potential ship-to addresses for purchases.  Below we've covered our favorite way to get this list of address IDs into the system using .csv/.txt files and SQL.

Step 1.)  Create a spreadsheet that lists all address lines.  Our example contains the following columns:

Company ID, Address ID, Name, Contact, Address 1, City, State, Zip Code

Step 2.) Save the spreadsheet as a tab-delimited text file

Step 3.) Open the text file and replace any " characters with blanks.  " characters will appear when a field contains a comma.

Step 4.) Import into SQL using SQL Server Management Studio

>>Right-Click Database

    >>Select Tasks

        >>Select Import Data

           >>Select Flat File as source

              >>Enter Staging Table Name in SQL Destination (Use name of CoAddressID_Template in our example below)
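Step 4 uses the SSMS import wizard, but if you prefer to script the load, a rough equivalent is sketched below; the staging table definition, column widths, and file path are example assumptions only:

-- Alternative to the import wizard: create the staging table and load the
-- tab-delimited file directly (path and column sizes are examples)
CREATE TABLE dbo.CoAddressID_Template (
    [Company ID] VARCHAR(10),
    [Address ID] VARCHAR(50),
    [Name]       VARCHAR(100),
    [Contact]    VARCHAR(100),
    [Address 1]  VARCHAR(100),
    [City]       VARCHAR(50),
    [State]      VARCHAR(50),
    [Zip Code]   VARCHAR(20)
);

BULK INSERT dbo.CoAddressID_Template
FROM 'C:\Temp\CoAddressID.txt'       -- hypothetical path to the tab-delimited file
WITH (
    FIELDTERMINATOR = '\t',          -- tab-delimited
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2              -- skip the header row
);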

Step 5.) Create a backup of the table that we're about to insert into (select * into sy00600_backup{date} from sy00600).  This step is important in case we need to roll back our insert and don't want to refresh the entire database.

Step 6.) Ensure that no address IDs in your staging table already exist within GP.

select * from sy00600 where locatnid in (select [Address ID] from coAddressid_template)

Step 7.) Run SQL Insert statement to move data from staging table (CoAddressID_template) to SY00600 table.

declare @cmpanyid numeric(2,0);

--Look up the company ID from the DYNAMICS database
set @cmpanyid = (select cmpanyid from dynamics..sy01500 where cmpnynam = '{ENTER YOUR CO NAME}');

--Insert staged rows into the Company Address Master (replace gpent with your company database)
insert into gpent..sy00600 (CMPANYID, LOCATNID, LOCATNNM, ADRSCODE, ADRCNTCT, ADDRESS1, ADDRESS2, ADDRESS3,
    CITY, COUNTY, STATE, ZIPCODE, COUNTRY, PHONE1, PHONE2, PHONE3, FAXNUMBR, CHANGEBY_I, CHANGEDATE_I, CCode)
select @cmpanyid, left([Address ID],15), left([Name],30), '', left([Contact],60), left([Address 1],60), '', '',
    left([City],25), '', left([State],30), left([ZIP Code],10), '', '', '', '', '', '', '1900-01-01 00:00:00.000', ''
from coAddressid_template

That should do it.  You can add additional columns to your source file if you please.  Just be sure to update your insert script to account for the new columns, and remember character limitations, e.g., left([Address 1],60).

Our first preference is to always use the Microsoft eConnect procedures or integration tools like Integration Manager when available.  However, when all else fails, a carefully executed SQL script can achieve the same outcome.  At the end of the day, it's all data in a database.

Let us know if this works for you or if you'd like to see similar processes using SQL!

Re-Enable New Account Segment Notification in Dynamics GP

If you've ever integrated new accounts via Integration Manager, you've probably run into the notification that prompts users when a Chart of Accounts segment does not exist, which prevents you from adding accounts easily.  The easiest and quickest solution for this problem is to select 'Yes, add segment' and 'Do not display this message again'.  No harm, no foul.

The downside is that the notification window exists for a reason, and that reason is to ensure that users are not accidentally adding unnecessary COA segments.  Once you've selected 'Do not display this message again', GP listens and will no longer notify that user.

However, via SQL, we can reverse these effects and force GP to forget what we told it earlier.

Step 1. View users that currently have disabled this notification window

- Select * from {CompID}..SY01401 where coDefaultType = 13

Step 2. Create database backup or table backup

- Select * into SY01401_backup_MMddYY from SY01401

Step 3. Run delete statement

- This delete statement removes the records that suppress the notification window from appearing

- Delete from {CompID}..SY01401 where coDefaultType = 13

- To filter for one specific user, add the filter "and UserID = {User}"

Step 4. Run select to confirm records have been removed

- Select * from {CompID}..SY01401 where coDefaultType = 13

Step 5. Add a non-existent COA segment in GP & ensure the prompt window appears
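For convenience, steps 2 through 4 can be rolled into a single script.  This is only a sketch: it assumes the TWO sample company database and an example backup table name, so substitute your own values.

USE TWO;  -- replace with your company database

-- Step 2: back up the table before deleting anything
SELECT * INTO SY01401_backup_MMddYY FROM dbo.SY01401;

-- Step 3: remove the records that suppress the new-segment notification
DELETE FROM dbo.SY01401 WHERE coDefaultType = 13;
-- (add: AND USERID = 'jsmith' to target a single, example user)

-- Step 4: confirm nothing remains
SELECT * FROM dbo.SY01401 WHERE coDefaultType = 13;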

It seems minor, but notifications like these are valuable in keeping account structures clean and consistent in an environment.

Simple solution to a simple problem?  Let us know in the comments below.

Automatically Running the Receivables Aging Routine via SQL - Dynamics GP

The manual aging routine can be a controversial topic in Dynamics GP.  For some businesses, seeing aged values at a certain point in time (say end of week, end of month, end of year) is all that's needed for receivables management.  For other businesses, there may be a demand for more up-to-date receivable balances, requiring the aging routine to be run more frequently. 

With a system like Microsoft Dynamics GP, where the customer base is so diverse, having this sort of flexibility is key and makes sense from a development perspective.  However, it does create a rather tedious process of having to run the aging routine each time the receivables team would like to see up-to-date summary balances by period.

Until Microsoft releases the ability to auto-age on a pre-determined schedule (fingers crossed for GP 2016!), the following script can be used to run the aging process directly through SQL.  The beauty of having a script is that this routine can be scheduled to run on a nightly basis, so it never has to be run manually again!

Please note that this stored procedure is designed to run in GP 2015 R2.  Tables and stored procedures can change between versions, which may require the script to be updated.  If you're looking for another version of the script, please contact us or let us know in the comments below.

/*
Automatic Running of Aging Routine
--Includes Aging of Finance Charges
--Deployed Date yyMMdd
--Runs every x hours, days, months, etc
*/

DECLARE @O_iErrorState int, @I_dAgingDate datetime

select @I_dAgingDate = convert (varchar(10),GetDate(),102)

EXEC dbo.rmAgeCustomer 0,'','þþþþþþþþþþþþþþþ',@I_dAgingDate,127,1,0,'',@O_iErrorState OUT

/*
Automatic Running of Aging Routine
--Excludes Aging of Finance Charges
--Deployed Date yyMMdd
--Runs every x hours, days, months, etc
*/

DECLARE @O_iErrorState int,@I_dAgingDate datetime

select @I_dAgingDate = convert(varchar(10),GetDate(),102)

EXEC dbo.rmAgeCustomer 0,'','þþþþþþþþþþþþþþþ',@I_dAgingDate,127,0,0,'',@O_iErrorState OUT
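As mentioned above, the real payoff is scheduling this to run nightly.  A minimal SQL Server Agent job sketch is below; the job name, 2 AM schedule, and the TWO company database are all assumptions to adjust for your environment (this example runs the version that excludes finance charges):

USE msdb;
GO
-- Nightly job that runs the aging script in the company database (names are examples)
EXEC dbo.sp_add_job         @job_name = N'GP - Nightly RM Aging';

EXEC dbo.sp_add_jobstep     @job_name = N'GP - Nightly RM Aging',
                            @step_name = N'Run rmAgeCustomer',
                            @subsystem = N'TSQL',
                            @database_name = N'TWO',   -- replace with your company database
                            @command = N'
DECLARE @O_iErrorState int, @I_dAgingDate datetime;
SELECT @I_dAgingDate = CONVERT(varchar(10), GETDATE(), 102);
EXEC dbo.rmAgeCustomer 0, '''', ''þþþþþþþþþþþþþþþ'', @I_dAgingDate, 127, 0, 0, '''', @O_iErrorState OUT;';

EXEC dbo.sp_add_schedule    @schedule_name = N'Nightly 2 AM',
                            @freq_type = 4,            -- daily
                            @freq_interval = 1,
                            @active_start_time = 020000;

EXEC dbo.sp_attach_schedule @job_name = N'GP - Nightly RM Aging',
                            @schedule_name = N'Nightly 2 AM';

EXEC dbo.sp_add_jobserver   @job_name = N'GP - Nightly RM Aging';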

After running the script, you can open the aging window in GP to view your last run and also view the customer summary window to see the "Aged as of date".

Quick, efficient, and manageable solutions that make everyone's lives easier are our favorite.  Running the auto-aging routine via SQL whenever up-to-date balances are needed is a no-brainer.  Just be sure this routine is documented so that it doesn't get forgotten in future upgrades or server changes.

Check back soon for information on more simple solutions that we'll be discussing!

GP Security: Identifying Security Roles, Tasks, & Operations

A common request we receive is: "We want to tie down security of our users but have no clue as to who has access to what."  To be frank, reporting on this in GP is not exactly easy to come by.  While there are a number of ISVs that will provide you with tools and analytics for this information, we have a quick, fan-favorite solution via SQL to gather this data.

This information can also be found on the Microsoft Developer Blog here.  

First thing's first: in order for the following script to populate data, the Clear Data procedure must be run against the Security Resource Descriptions table within GP.  This can be done from Microsoft Dynamics GP>>Maintenance>>Clear Data.

In the Clear Data window, select 'Display' in the toolbar and change it to 'Physical', then set Series = 'System' and insert the Security Resource Descriptions table.

After inserting the table and running the Clear Data procedure, the SQL query below will populate users>>roles>>tasks>>operations, which can be used as a starting point to identify and clean up security/internal control concerns.

SELECT DISTINCT
    S.USERID UserID, S.CMPANYID CompanyID,
    C.CMPNYNAM CompanyName, S.SecurityRoleID,
    coalesce(T.SECURITYTASKID,'') SecurityTaskID,
    coalesce(TM.SECURITYTASKNAME,'') SecurityTaskName,
    coalesce(TM.SECURITYTASKDESC,'') SecurityTaskDescription,
    coalesce(R.DICTID,'') DictionaryID,
    coalesce(R.PRODNAME,'') ProductName,
    coalesce(R.TYPESTR,'') ResourceType,
    coalesce(R.DSPLNAME,'') ResourceDisplayName,
    coalesce(R.RESTECHNAME,'') ResourceTechnicalName,
    coalesce(R.Series_Name,'') ResourceSeries
FROM SY10500 S              -- security assignment user role
LEFT OUTER JOIN SY01500 C   -- company master
    ON S.CMPANYID = C.CMPANYID
LEFT OUTER JOIN SY10600 T   -- tasks in roles
    ON S.SECURITYROLEID = T.SECURITYROLEID
LEFT OUTER JOIN SY09000 TM  -- tasks master
    ON T.SECURITYTASKID = TM.SECURITYTASKID
LEFT OUTER JOIN SY10700 O   -- operations in tasks
    ON T.SECURITYTASKID = O.SECURITYTASKID
LEFT OUTER JOIN SY09400 R   -- resource descriptions
    ON R.DICTID = O.DICTID AND O.SECRESTYPE = R.SECRESTYPE
    AND O.SECURITYID = R.SECURITYID

Without the Clear Data procedure ahead of time, the resource descriptions table would be empty and provide no detail on the security operations.
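If you need to narrow the results to a single user or company, a filter can simply be appended to the end of the query; the user ID and company name below are placeholders:

-- Example filters appended to the security query above
WHERE   S.USERID = 'jsmith'             -- hypothetical user ID
    AND C.CMPNYNAM = 'Fabrikam, Inc.'   -- hypothetical company name
ORDER BY S.USERID, S.SecurityRoleID, SecurityTaskID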

Security is never fun and can be hard to wrap your arms around.  We hope this report helps.  Please leave us any questions or feedback below, and be sure to check out our website at ALaCarteGP.com.

Until Next Time!

Management Reporter Performance Issues --> Clearing Report Repository

On multiple occasions we've seen environments with top-notch resources, up-to-date servers, and not an overly abundant amount of data that still seem to experience performance issues while generating reports.

The report itself would take 30+ seconds to generate, but then opening the MR Report Viewer application would drag on and on, sometimes taking upwards of a minute to open and display the generated report.  When the Report Viewer application opens, it calls the reports in the report library (to an extent), which typically isn't a big deal.  However, when you are working in an environment where Management Reporter reports are heavily relied on, the number of reports in the repository builds quickly.  You'll also find that there is no option within Management Reporter or the Report Viewer (as of CU14) to clear the repository reports or to set an amount of time to store historical reports before clearing.  This is where a bit of SQL magic comes in handy.

The following query defines an integer specifying how many days' worth of reports you'd like to keep in your repository.  In our example, we decided that only 10 days' worth were necessary, and if a report older than 10 days were ever needed, we'd simply re-generate it.

Please note that this query may take an extended period of time to run, especially the first time it's used.  We also highly recommend running it in a development environment to confirm the desired outcome before moving into production.

DECLARE @DaysBack integer
SET @DaysBack = 10   -- number of days of generated reports to keep

-- Remove report transaction detail older than @DaysBack days
delete reporting.ReportLineTransaction
where ReportID in (select id from reporting.Report where RepositoryID in
    (select id from reporting.Repository
     where StatusType = 0 and [Type] = 10 and CreateDate <= DATEADD(day, -@DaysBack, getdate())))

delete reporting.ReportLineFinancial
where ReportID in (select id from reporting.Report where RepositoryID in
    (select id from reporting.Repository
     where StatusType = 0 and [Type] = 10 and CreateDate <= DATEADD(day, -@DaysBack, getdate())))

delete reporting.ReportLineAccount
where ReportID in (select id from reporting.Report where RepositoryID in
    (select id from reporting.Repository
     where StatusType = 0 and [Type] = 10 and CreateDate <= DATEADD(day, -@DaysBack, getdate())))

-- Remove the report headers, then the repository entries themselves
delete reporting.Report
where RepositoryID in (select id from reporting.Repository
    where StatusType = 0 and [Type] = 10 and CreateDate <= DATEADD(day, -@DaysBack, getdate()))

delete reporting.Repository
where StatusType = 0 and [Type] = 10 and CreateDate <= DATEADD(day, -@DaysBack, getdate())
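Once the purge completes, a quick check against the same Reporting schema shows how many generated reports remain and how far back they go:

-- Verify what remains in the repository after the purge
SELECT CAST(CreateDate AS date) AS ReportDate, COUNT(*) AS ReportsKept
FROM reporting.Repository
WHERE StatusType = 0 AND [Type] = 10
GROUP BY CAST(CreateDate AS date)
ORDER BY ReportDate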

After running this query, you should find that the next time you generate a report, Report Viewer opens in a fraction of the time it took before.

That's all for today everyone.  Please feel free to leave any comments or questions below and check out ALaCarteGP.com.