...
4. Executing the Scrubber and Poster Jobs: Processes the origin transactions and updates balance tables.
Step 1: Create Feeder Files from the Source System
Generate two flat files from your feeder system: a data file and a reconciliation file. The reconciliation file serves as a batch control file to validate the number of transactions and the total dollars contained in your data file. You should name the data file filename.data and the reconciliation file filename.recon.
...
Note that the Encumbrance Update Code (D=Document, R=Reference) is required for encumbrance transactions. When the encumbrance update code is "R" (disencumbrance), three additional fields referencing the encumbering transaction are required.
...
The standard format of the reconciliation file is:
C tableid rowcount ;
S field1 dollaramount ;
S field2 dollaramount ;
E checksum ;
- A 'C', 'S', or 'E' must be the first character on a line unless the line is entirely whitespace or a comment. The case of these three codes is not significant.
- Semi-colons are required before any possible comments on C, S, or E lines. Any amount of whitespace delimits the elements of C, S, and E lines.
- Rowcount must be a non-negative integer.
- Fieldn is the technical field name(s) in the target database. Case *is* significant, since this must match the database field name(s) exactly.
- Dollaramount may be negative; the check is significant to 4 decimal places: the origin entry transaction amounts are absolute amounts.
- The checksum on line E is the number of C and S lines.
- A C line and a terminating E line are mandatory; S lines are optional.
- There may be more than one C-E block per reconciliation file.
An example of the reconciliation file:
...
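For illustration, a minimal reconciliation file consistent with the rules above might look like this (the table name GL_ORIGIN_ENTRY_T and field name TRN_LDGR_ENTR_AMT are assumptions; substitute the actual names for your target origin entry table):

C GL_ORIGIN_ENTRY_T 25 ; number of rows in the data file
S TRN_LDGR_ENTR_AMT 104325.6700 ; sum of the absolute transaction amounts
E 2 ; checksum: one C line plus one S line

Here the E value is 2 because the block contains one C line and one S line. A sketch of generating such a file in Python, assuming one transaction per data-file record and a hypothetical parse_amount() whose fixed-width offsets you would replace with your actual origin entry layout:

from decimal import Decimal

def parse_amount(record):
    # Hypothetical fixed-width offsets; replace with your origin entry layout.
    return Decimal(record[92:112].strip() or "0")

def write_recon(data_path, recon_path, table, field):
    rows, total = 0, Decimal("0")
    with open(data_path) as data:
        for record in data:
            if not record.strip():
                continue
            rows += 1
            total += abs(parse_amount(record))  # recon totals use absolute amounts
    with open(recon_path, "w") as recon:
        recon.write(f"C {table} {rows} ;\n")
        recon.write(f"S {field} {total:.4f} ;\n")  # significant to 4 decimal places
        recon.write("E 2 ;\n")  # checksum = number of C and S lines above

write_recon("filename.data", "filename.recon",
            "GL_ORIGIN_ENTRY_T", "TRN_LDGR_ENTR_AMT")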
Select the files you have created, enter the File Set Identifier, and click add. The File Set Identifier must be unique for each upload.
When you click add, the Enterprise Feeder Batch Upload Process saves the files in the staging directory.
Warning: Unless you have modified the external.config.directory property or the logs.directory property, you should be able to find your logs in /opt/logs/dev/kuali. This page and the build.properties file should help you locate these settings: https://test.kuali.org/confluence/display/KULDOC/Configuration+Properties+2
...
The files in the staging directory can be viewed from the Administration Tab -> Batch File menu. Search for the files in the staging/enterpriseFeed directory.
Warning: If your files exceed the maximum upload size, you will get a file upload limit error. In this case, you can FTP the files directly to the staging directory. Make sure to create the done file manually.
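If you do FTP the files, the enterpriseFeedJob will not pick up the file set until the done file exists. A minimal sketch of creating it by hand, assuming the done file is an empty marker sharing the data file's base name with a .done extension, and that staging/enterpriseFeed resolves to the path shown (both are assumptions; confirm them for your installation):

from pathlib import Path

# Assumption: staging/enterpriseFeed lives under this root; adjust to your site.
staging = Path("/opt/kuali/staging/enterpriseFeed")

# Assumption: the done file is an empty marker named after the data file.
(staging / "filename.done").touch()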
Step 3: Execute enterpriseFeedJob
From the Administration Tab -> Schedule menu, execute the enterpriseFeedJob.
To execute the job manually, modify the unscheduled job and click run.
Verify that the job has run successfully and review the log file. Be aware that "Succeeded" does not always mean your data file has been accepted correctly; it only means that the job has completed. Check the log file to determine whether the rows were inserted into the Origin Table.
Staging Directory after the Feed
When the enterpriseFeedJob runs successfully, the done file is removed from the Staging directory. A file set without a done file will not be processed the next time you submit the enterpriseFeedJob.
Step 4: Execute Scrubber and Poster
As with any other batch process, execute the Scrubber and Poster Jobs to process the origin transactions and update the GL Balance tables.