Automate Unit Account Values Based on Sales Quantities

Reporting on unit account values is a typical KPI request.  Say we're looking at a profit and loss report by product, and the business wants to see the units sold behind that margin.

In a perfect world (and using some tools that are not named Management Reporter), if you're capturing the Product A quantity in the Sales Order Processing (SOP) module in GP, you'd be able to reference the sum of the sales quantities directly instead of using a unit account.  However... we don't live in a perfect world, and with Management Reporter, unit accounts are required.

So in GP, we'd set up a unit account for something like "Units Sold - Product A".

In Management Reporter, we'd then have a row definition that looked something like...

Sales Product A

- Expenses Product A

= Margin

/ Units Sold - Product A

= Margin/Unit
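For instance, with hypothetical numbers: $100,000 of Product A sales less $60,000 of expenses gives a $40,000 margin, and dividing by 5,000 units sold returns an $8.00 margin per unit.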

Typically, in order for this report to be up to date, the accounting team would need to make a routine journal entry to the unit account within GP.  This process leaves room for manual errors, timing issues & the potential for management to lose confidence in their reports.

Using the method below, we completely streamline this process by automating the unit account journal entry within GP.

Part 1: Associate a unit account with an inventory item in GP.

Note: In our example deliverable, one item may be used at multiple locations (segment 1 of the COA).  Therefore, we assign only the natural account to the item & then let the integration define the remaining COA segments.
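For instance (hypothetical account numbers): an item sold out of site 200 would post to unit account 200-46000, where the 200 comes from the site & the 46000 natural account comes from the assignment on the item.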

Approach: We accomplish this using eOne Solutions' Extender product.

Part 2: Set up an integration that captures posted activity in GP & creates a journal entry in GP.

Note: In our example deliverable, we're pulling the prior day's activity & creating a journal entry first thing in the morning.

Note: The integration script defines the COA segments based on the SITE of the sale on the SOP invoice & then the natural account that we associated with the item.

Approach: We accomplish this using eOne Solutions' SmartConnect product.

Map/Integration Setup

ODBC Connection to Company Database - Selecting Prior Day Sales
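For illustration, here's a minimal sketch of what that ODBC source query might look like.  It reads the standard GP posted SOP history tables (SOP30200 header, SOP30300 lines), but the date logic, void handling & unit-of-measure needs vary by environment, so treat it as a starting point rather than our exact production script.

select
    l.LOCNCODE as SiteID,                        -- drives the non-natural COA segments
    l.ITEMNMBR,                                  -- links to the natural account assigned via Extender
    sum(case when h.SOPTYPE = 4
             then -l.QUANTITY                    -- returns flow through as negatives
             else  l.QUANTITY end) as UnitsSold  -- multiply by QTYBSUOM if base U of M quantities are needed
from SOP30300 l                                  -- posted SOP line history
join SOP30200 h                                  -- posted SOP header history
  on h.SOPNUMBE = l.SOPNUMBE and h.SOPTYPE = l.SOPTYPE
where h.SOPTYPE in (3, 4)                        -- 3 = invoice, 4 = return
  and h.VOIDSTTS = 0                             -- skip voided documents
  and h.GLPOSTDT >= dateadd(day, -1, cast(getdate() as date))
  and h.GLPOSTDT <  cast(getdate() as date)      -- prior day's postings only
group by l.LOCNCODE, l.ITEMNMBR                  -- one journal line per site/item combination

Depending on your setup, kit component lines (CMPNTSEQ) may also need filtering.  SmartConnect then combines the site segment & the item's natural account into the full unit account number on the journal line.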

Part 3: Enable the map/integration schedule.  This is out-of-the-box functionality in SmartConnect.

Part 4: Create reconciling reports using SmartList/SmartView.  These reports can be used at month-end to reconcile financials.

Note: This report is best built from a Sales Line Items report, filtered on the posting date & set up so that returns show as negatives.
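If you'd rather sanity-check the GL side directly in SQL, a quick query like the sketch below can back up the SmartList report.  The unit account number here (100-46000) is hypothetical; GL00105 translates the account index & GL20000 holds the open-year journal detail.

select g.TRXDATE,
       sum(g.DEBITAMT - g.CRDTAMNT) as UnitsPosted  -- unit accounts carry quantities in the amount fields
from GL20000 g                                      -- open-year GL transaction detail
join GL00105 a on a.ACTINDX = g.ACTINDX             -- account index -> account number string
where a.ACTNUMST = '100-46000'                      -- hypothetical unit account
group by g.TRXDATE
order by g.TRXDATE

These daily totals should tie to the Sales Line Items report filtered on the same posting dates.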

Part 5: Notify the accounting team that all they need to do moving forward is post their batches & reconcile once at month-end!

Integrated transactions in GP would look similar to...

Note: All prior day activity is rolled into one line.  The option to create a separate line for each sales transaction is available if desired.

A few other notes to consider in this process...

- Returns must be accounted for

- When deploying, it is recommended to have the accounting team reconcile more frequently.  As they gain an understanding of the process and confirm that the numbers tie out, they can reconcile less frequently.

- Failed integrations should auto-notify an integration PowerUser.  The PowerUser can re-run the integration if/when needed.

This solution is easy to implement & maintain, and it quickly accomplishes the goals of saving time, keeping data accurate & making GP a more robust system.

Thinking about implementing for your business and looking for additional details?  Already implemented and have feedback?  Let us know in the comments below!

Company Address IDs - Import

Did you know that within Purchase Order Processing, the list of available ship-to addresses is pulled from the Company Address ID setup within GP?  During a POP/Inventory/SOP implementation, it's not uncommon for companies to have hundreds of potential ship-to addresses for purchases.  Below we've covered our favorite way to get this list of address IDs into the system using .csv/.txt files and SQL.

Step 1.)  Create a spreadsheet that lists all addresses.  Our example contains the following columns.

Company ID, Address ID, Name, Contact, Address 1, City, State, Zip Code

Step 2.) Save the spreadsheet as a tab-delimited text file

Step 3.) Open the text file & replace any " characters w/ {blanks}.  " characters appear when a field contains a comma, because Excel wraps those fields in quotes.  (Any stragglers can also be stripped in SQL after the import; see the sketch after Step 4.)

Step 4.) Import into SQL using SQL Server Management Studio

>>Right-Click Database

    >>Select Tasks

        >>Select Import Data

           >>Select Flat File as source

              >>Enter the staging table name as the SQL destination (we use CoAddressID_Template in our example below)
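As a fallback to Step 3, any quote characters that survive the import can be stripped in SQL.  A minimal sketch, assuming the staging table & column names above:

update CoAddressID_Template
set [Name]      = replace([Name], '"', ''),       -- strip stray quote characters
    [Contact]   = replace([Contact], '"', ''),
    [Address 1] = replace([Address 1], '"', ''),
    [City]      = replace([City], '"', '');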

Step 5.) Create a backup of the table that we're about to insert into (select * into sy00600_backup{date} from sy00600).  This step is important in case we need to roll back our insert and don't want to have to refresh the entire database.
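If a roll-back is ever needed, you can also remove just the rows this import added rather than restoring the whole table.  A sketch, assuming the staging table from Step 4:

delete from gpent..sy00600   -- add a CMPANYID filter if address IDs overlap across companies
where LOCATNID in (select left([Address ID], 15) from CoAddressID_Template);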

Step 6.) Ensure that no address IDs in your staging table already exist within GP.

select * from sy00600 where locatnid in (select [Address ID] from coAddressid_template)

Step 7.) Run SQL Insert statement to move data from staging table (CoAddressID_template) to SY00600 table.

declare @cmpanyid numeric(2,0);  -- company ID, a small integer in the DYNAMICS database

-- look up the company ID by company name
set @cmpanyid = (select cmpanyid from dynamics..sy01500 where cmpnynam = '{ENTER YOUR CO NAME}');

-- insert the staged addresses into the Company Address Master (gpent = our example company database);
-- left() enforces GP's column length limits & unused columns receive empty strings
insert into gpent..sy00600 (CMPANYID,LOCATNID,LOCATNNM,ADRSCODE,ADRCNTCT,ADDRESS1,ADDRESS2,ADDRESS3,CITY,COUNTY,STATE,ZIPCODE,COUNTRY,PHONE1,PHONE2,PHONE3,FAXNUMBR,CHANGEBY_I,CHANGEDATE_I,CCode)
select @cmpanyid, left([Address ID],15), left([Name],30), '', left([Contact],60), left([Address 1],60), '', '', left([City],25),
       '', left([State],30), left([ZIP Code],10), '', '', '', '', '', '', '1900-01-01 00:00:00.000', ''
from coAddressid_template;
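A quick count check against the Step 5 backup confirms how many rows the insert added (the backup table name is whatever you used in Step 5):

select (select count(*) from gpent..sy00600)
     - (select count(*) from sy00600_backup{date}) as RowsAdded;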

That should do it.  You can add additional columns to your source file if you please.  Just be sure to update your insert script to account for the new columns & remember the character limitations, e.g., left([Address 1],60)!

Our first preference is to always use the Microsoft eConnect procedures or integration tools like Integration Manager when available.  However, when all else fails, a carefully executed SQL script can achieve the same outcome.  At the end of the day, it's all data in a database.

Let us know if this works for you or if you'd like to see similar processes using SQL!