And why on earth would such a weird disposition be used anyway -- pass the data set to the next step if this step runs normally but catalog the data set if this step abends?
A few times I've seen this used (in testing) so that the content of a "new" dataset could be examined when the process abends. If the step ran correctly the dataset was passed and disposed of later in the test job. More often it was (new,catlg,catlg) for testing and then (new,catlg,delete) when the testing was pretty well finished.
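A minimal sketch of that testing setup, assuming hypothetical step, program, and dataset names: the abnormal-disposition of CATLG keeps the dataset around for inspection after an abend, while a normal end passes it forward to a later step that disposes of it.

```jcl
//* Illustrative only -- program and dataset names are made up.
//* If STEP1 ends normally, the dataset is PASSed to STEP2;
//* if STEP1 abends, it is cataloged so its contents can be examined.
//STEP1   EXEC PGM=MYPROG
//OUTDD   DD  DSN=TEST.DEBUG.OUTPUT,
//            DISP=(NEW,PASS,CATLG),
//            UNIT=SYSDA,SPACE=(TRK,(5,5)),
//            DCB=(RECFM=FB,LRECL=80)
//* A later step receives the passed dataset and deletes it.
//STEP2   EXEC PGM=NEXTPGM
//INDD    DD  DSN=TEST.DEBUG.OUTPUT,DISP=(OLD,DELETE)
```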
Once upon a time (back in the days of MFT/MVT) I seem to recall minor wars
about the overhead of cataloging versus passing a dataset . . .
My practice for a very long time has been to only pass
datasets with dsn=&&xxxx.tttt.uuu
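As a sketch of that practice (names here are illustrative, not the ones abbreviated above): a temporary dataset named with a leading && exists only for the life of the job, so passing it costs nothing in the catalog and the system cleans it up at job end regardless.

```jcl
//* Illustrative only -- program and temporary dataset names are made up.
//STEP1   EXEC PGM=MYPROG
//WORKDD  DD  DSN=&&WORK,DISP=(NEW,PASS),
//            UNIT=SYSDA,SPACE=(TRK,(5,5))
//* The receiving step picks up the passed temporary dataset
//* and deletes it when done.
//STEP2   EXEC PGM=NEXTPGM
//INDD    DD  DSN=&&WORK,DISP=(OLD,DELETE)
```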
FWIW . . .