Work Files: Searching and Appending Records



Postby Eka1618 » Fri May 20, 2011 10:58 pm


I am working on a project where I need to retain records from an ADABAS file that will eventually be purged from the database. These records will need to be stored into a flat file.

The proposed flat file will need to contain two types of records:
1) General account info
2) Account payment transactions

This is because the current ADABAS file stores ONE account that could have MANY payment transactions. Below is a sample of the layout:


The work file will need to be searched to determine if an account from the AR-INACTIVATION file already exists in it or not. If it does not exist, I will need to append that account's record information from the AR-INACTIVATION file into the work file.

I was thinking of creating some type of user-defined array, READing all of the existing records of the work file into it, and then using the array to search for account records. I thought of creating the process this way because I am not sure it is efficient to repeatedly re-read a work file in a program just to search for a value. However, since the work file will accumulate over time, I am not sure this will scale. Are there limits as to how many records can be stored in an array and in a work file?

There is a possibility that we will be changing the purging process in the near future so that records scheduled for purging from this file will be stored in a new ADABAS file that will never be purged. However, if I can come up with a solution to use for at least the next year, that would be great.

Thank you for your help in advance!


Re: Work Files: Searching and Appending Records

Postby RGZbrog » Sat May 21, 2011 4:26 am

Are there limits as to how many records can be stored in an array and a work file?

To answer your questions:
1. Natural limits your local data area to 1 GB, so, depending on how much memory your program uses now, this table can be quite large.

2. Natural imposes no limit on the number of records read from or written to a WORK file. The operating system imposes limits, but they are large enough not to be a practical concern.

Your idea to read the WORK file once and build an internal table is much better than re-reading the WORK file from the beginning each time you want to check an AR-INACTIVATION record (I/O-intensive), but EXAMINEing a table of ACCOUNT-NUM values for every account is still CPU-intensive. I would implement this solution only if the WORK file is always very small and your program runs infrequently.
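For what it's worth, the table approach would look something like the sketch below. This is illustrative only: the work file numbers, the field names (#W-ACCOUNT-NUM, #AR-ACCOUNT-NUM), the A10 account format, and the 10000-occurrence table size are all assumptions, since the actual layout wasn't posted.

```
* Sketch: load WORK file 1 into a table, then test each
* AR-INACTIVATION account and append unseen ones to WORK file 2.
* All names and sizes here are assumed, not from the real layout.
DEFINE DATA LOCAL
1 #ACCOUNT-TABLE (A10/1:10000)  /* assumed capacity
1 #COUNT (I4)
1 #FOUND (I4)
1 #WORK-REC
  2 #W-ACCOUNT-NUM (A10)
  2 #W-REST (A80)
1 #AR-ACCOUNT-NUM (A10)
END-DEFINE
*
READ WORK FILE 1 #WORK-REC
  ADD 1 TO #COUNT
  MOVE #W-ACCOUNT-NUM TO #ACCOUNT-TABLE (#COUNT)
END-WORK
*
* ... for each AR-INACTIVATION account: ...
EXAMINE #ACCOUNT-TABLE (1:#COUNT) FOR #AR-ACCOUNT-NUM
  GIVING NUMBER #FOUND
IF #FOUND = 0
  WRITE WORK FILE 2 #WORK-REC  /* append the new account record
END-IF
END
```

Note that the EXAMINE is a sequential scan of the table for every account checked, which is exactly the CPU cost mentioned above.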

An alternative is to write the purge records to a second WORK file regardless of whether they already exist in the first WORK file. Then use the sort utility to combine the old and new files, deleting any new records that are duplicates. To do this use

SORT FIELDS= to define the ACCOUNT-NUM field as the key
SUM FIELDS=NONE to eliminate duplicates
EQUALS to ensure that the "old" version of a duplicate is retained instead of the "new" version, if that's important (with EQUALS, SUM FIELDS=NONE keeps the first record of each key group in input order, so concatenate the old file ahead of the new one)
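As a rough sketch, the control statements might look like the following. The key position (1,10) and character format are assumptions, since the record layout wasn't posted; the dataset names are placeholders. The old file is concatenated first so that, with EQUALS, the "old" copy of each duplicate is the one kept.

```
//MERGE    EXEC PGM=SORT
//SORTIN   DD DSN=your.old.workfile,DISP=SHR      <-- placeholder name
//         DD DSN=your.new.purge.recs,DISP=SHR    <-- placeholder name
//SORTOUT  DD DSN=your.merged.workfile,DISP=(NEW,CATLG)
//SYSIN    DD *
  SORT FIELDS=(1,10,CH,A),EQUALS   * assumed: ACCOUNT-NUM in cols 1-10
  SUM FIELDS=NONE                  * drop all but the first of each key
/*
```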

The sort utility is so fast that it won't impact your CPU or elapsed time.
