...
The KFS Enterprise Feeder Batch File Upload allows you to upload flat-file transactions from external feeder systems. The uploaded files are processed by the enterprise feeder job, which performs minimal validation by reconciling the file totals. The Scrubber and Poster jobs then process the accepted files to update the GL balances. Unlike the Collector Batch Upload, the Enterprise Feeder Batch Upload is designed for trusted sources: it processes files faster and performs only minimal validation. It is typically used by external feeder systems such as payroll and cashiering systems.
Requirements
- The Accounting office must approve all file feeds to KFS before such files are sent to the production environment. Please contact Accounting and include them in the verification of your data uploads as part of the process.
Overview
The Enterprise Feeder Batch process involves the following four steps:
...
To use SFTP, follow the instructions on the page KFS SFTP Instructions To Upload and Download Files.
Alternatively, use the delivered interface from the Administration -> Enterprise Feed Upload menu and follow the instructions below.
Select the files you have created, enter the File Set Identifier, and click add. The File Set Identifier must be unique for each upload.
When you click add, the Enterprise Feeder Batch Upload process saves the files in the staging directory.
File Set Identifier
Upon successful upload, the staging directory will have three files identified by the File Set Identifier.
| File | Physical Name | Explanation |
| --- | --- | --- |
| Done File | entpBatchFile_UserName_FileSetID.done | 0-byte file created by the Enterprise Feeder Batch Upload process |
| Data File | entpBatchFile_UserName_FileSetID.data | Your data file, uploaded and renamed |
| Reconciliation File | entpBatchFile_UserName_FileSetID.recon | Your reconciliation file, uploaded and renamed |
The files in the staging directory can be viewed from the Administration Tab -> Batch File menu. Search for the files in the staging/enterprise feed directory.
Warning: If your file exceeds the upload size limit, you will see a file upload limit error. In this case, you can FTP the files to the staging directory directly. Make sure to create the done file manually.
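When staging files by hand, the key point is matching the naming convention from the table above and finishing with an empty done file. A minimal sketch is below; the staging path, user name, and File Set Identifier are assumptions for illustration, so substitute the values for your environment.

```shell
# Sketch of staging a file set by hand (STAGING path, USER_ID, and
# FILESET are assumed values -- substitute your environment's own).
STAGING=./staging/gl/enterpriseFeed
USER_ID=jdoe
FILESET=PAYROLL_FEED_01

mkdir -p "$STAGING"

# Example source files stand in for your real data and recon files.
printf 'example GL transaction line\n' > payroll.data
printf 'example reconciliation line\n' > payroll.recon

# Rename to the convention the upload process uses.
cp payroll.data  "$STAGING/entpBatchFile_${USER_ID}_${FILESET}.data"
cp payroll.recon "$STAGING/entpBatchFile_${USER_ID}_${FILESET}.recon"

# The done file must exist and be empty (0 bytes); its presence tells
# the enterpriseFeedJob that the file set is complete and ready.
touch "$STAGING/entpBatchFile_${USER_ID}_${FILESET}.done"
```

If the done file is missing, the file set will simply sit in staging untouched, so always create it last, after the data and recon files have finished transferring.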
Step 3: Execute enterpriseFeedJob
From the Administration Tab -> Schedule menu, execute the enterpriseFeedJob.
To manually execute the job, modify the unscheduled job, and click run.
Verify that the job has run successfully and review the log file. Be aware that “Succeeded” does not always mean your data file has been accepted correctly; it only means the job has completed. You need to verify the log file to determine whether the rows were inserted into the Origin Table.
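A quick way to review the log is to scan it for error lines before trusting the "Succeeded" status. The sketch below writes a sample log line so it can run standalone; the log file name and message text are illustrative assumptions, not actual KFS output.

```shell
# Sketch of a post-run log check. The log name and log lines here are
# illustrative assumptions, not actual KFS log output.
LOG=enterpriseFeedJob.log
printf 'INFO  1200 origin entries inserted\n' > "$LOG"   # sample line for the sketch

# "Succeeded" only means the job finished; errors in the log mean the
# file set may have been rejected even though the job status looks fine.
if grep -qiE 'error|exception' "$LOG"; then
  echo "job logged errors - review before continuing"
else
  echo "no errors logged"
fi
```

In practice you would point `LOG` at the job's actual log file and also confirm the inserted-row count matches your reconciliation totals.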
Staging Directory after the Feed
When the enterpriseFeedJob runs successfully, the done file is removed from the staging directory. A file set without a done file will not be processed the next time you submit the enterpriseFeedJob.
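Because the job deletes done files as it consumes file sets, any done files left in staging after a run point at file sets that were not processed. A small sketch of that check, with the staging path again an assumed value:

```shell
# Any *.done files remaining after a successful run mark unprocessed
# file sets. STAGING is an assumed path -- use your environment's own.
STAGING=./staging/gl/enterpriseFeed
mkdir -p "$STAGING"

leftover=$(find "$STAGING" -name '*.done' | wc -l | tr -d ' ')
echo "unprocessed file sets: $leftover"
```

A nonzero count after the job completes means those uploads were skipped and should be investigated before the next run.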
Step 4: Execute Scrubber and Poster
As with any other batch process, execute the Scrubber and Poster jobs to process the origin transactions and update the GL Balance tables.