Multiple remote job entry writing to the same dataset

Posted: Wed Nov 15, 2023 3:55 am
by MLK2348
I am generating multiple jobs with Natural remote job entry using an input dataset with a list of 300 objects. Each object is used in its own job and data is written to a single dataset on tape.

I want each job to write to the same dataset. Do I need to insert time delays between each job submission to ensure the dataset will close? If I write to a DASD dataset with DISP=(MOD,CATLG), do I have to know the total size to allocate at the time of creation? The amount of data will vary depending on the time of year, so the total size is hard to estimate.

Mel

Re: Multiple remote job entry writing to the same dataset

Posted: Wed Nov 15, 2023 1:50 pm
by willy jensen
DISP=MOD will allocate the dataset with exclusive access, so job 2 will wait for job 1 to finish before it gains access to the dataset. CATLG is not necessary unless the dataset already exists and is uncataloged.
For fluctuating space requirements, you should look at a reasonably small primary allocation and a somewhat larger secondary.
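A sketch of what such a DD statement might look like (the dataset name, unit, and space figures below are placeholders, not values from the thread):

```
//OUTDD    DD DSN=HLQ.OUTPUT.DATA,
//            DISP=(MOD,CATLG),
//            UNIT=SYSDA,
//            SPACE=(CYL,(50,200))
```

The SPACE parameter only takes effect when the dataset is first created; later jobs running with DISP=MOD simply append and pick up secondary extents as needed, which is why a modest primary with a larger secondary handles seasonal variation without having to predict the total size up front.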