
SORT WORK SPACE ISSUE.

PostPosted: Fri Apr 24, 2015 12:48 am
by Vineet
Hi All,
I have a file with a record count of 115 million, LRECL=131, RECFM=FB. I want to sort the file to eliminate duplicate records. I am trying with dynamic allocation. Below is my EXEC statement.

//STPRDF11 EXEC PGM=SORT,PARM='FILSZ=E550000000,DYNALLOC=(SYSDA,32)',
// COND=(4,LT)

The job is failing with the error "RESOURCES WERE UNAVAILABLE FOR DYNAMIC ALLOCATION OF WORK DATA SETS - REASON CODE 58". I tried using tape for temporary work storage, but I am not sure how to do it, and I know it is recommended to use DASD, not tape. Please suggest how to sort such a big file.
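One common way around a dynamic-allocation failure (a sketch only; the dataset names, unit name, space figures and the sort key below are assumptions, not taken from the thread) is to drop DYNALLOC and allocate the work files explicitly with SORTWKnn DD statements, spread over several volumes:

```jcl
//STPRDF11 EXEC PGM=SORT,PARM='FILSZ=E550000000',COND=(4,LT)
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=MY.INPUT.FILE
//SORTOUT  DD DSN=MY.OUTPUT.FILE,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(18000,2000),RLSE)
//* explicit work files instead of DYNALLOC, eight of 3000 cylinders each
//SORTWK01 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK02 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK03 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK04 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK05 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK06 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK07 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SORTWK08 DD UNIT=SYSDA,SPACE=(CYL,(3000,500))
//SYSIN    DD *
  SORT FIELDS=(1,131,CH,A)
  SUM FIELDS=NONE
/*
```

Whether those primary allocations can actually be satisfied still depends on the installation's DASD pools, which is why the replies below point at storage support.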

Thanks
Kind Regards

Re: SORT WORK SPACE ISSUE.

PostPosted: Fri Apr 24, 2015 1:28 am
by enrico-sorichetti
the only people who can really help you are the people from your storage support!

look at the storage estimates for DFSORT work files and find out if there is enough DASD space for them.

a very rough estimate is around 20,000 3390 cylinders.

to be on the safe side, allow around 10% more than the input dataset size.

according to a table in the DFSORT manuals, for an FB file with a record size of up to 160 bytes, every 40 MB of input (about 51 cylinders) needs about 56 cylinders of work area. your input is roughly 115 million x 131 bytes, about 15,000 MB, so 15,000 / 40 x 56 gives on the order of 21,000 work cylinders, consistent with the rough estimate above.

Re: SORT WORK SPACE ISSUE.

PostPosted: Fri Apr 24, 2015 2:02 am
by BillyBoyo
Can you show a representative sample of the file, including a couple of duplicates?

Are you SORTing just to be able to use SUM FIELDS=NONE? How did the duplicates occur? Is there anything which can be used as a "key"?
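For illustration only (the key position and length are placeholders until the questions above are answered), removing duplicates on a key in, say, positions 1-10 would look like:

```jcl
* sort ascending on the (assumed) 10-byte key, keep one record per key
  SORT FIELDS=(1,10,CH,A)
  SUM FIELDS=NONE
```

If the whole 131-byte record defines a duplicate, SORT FIELDS=(1,131,CH,A) with SUM FIELDS=NONE keeps one copy of each distinct record.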