Duplicate Dataset


Re: Duplicate Dataset

Postby v1gnesh » Wed Feb 22, 2012 6:55 pm

It's a user dataset and is available only on SMS-managed volumes.
boyo
v1gnesh
 
Posts: 72
Joined: Wed Sep 28, 2011 8:24 pm
Has thanked: 1 time
Been thanked: 0 time

Re: Duplicate Dataset

Postby v1gnesh » Wed Feb 22, 2012 10:44 pm

The problem was recreated with NEW,CATLG,DELETE in place of NEW,CATLG,CATLG. Still got an E37, but the dataset was not deleted. The second time I ran the job (the first step deletes the dataset), it failed in the first step with:

STEP0 OUTFILE2 - ALLOCATION FAILED DUE TO DATA FACILITY SYSTEM
IGD17001I DUPLICATE DATA SET NAME ON VOLUME TST002
FOR DATA SET RCIP00S.JDH.DUMP.PROD
IGD17001I DUPLICATE DATA SET NAME ON VOLUME TST003
FOR DATA SET RCIP00S.JDH.DUMP.PROD
IGD17001I DUPLICATE DATA SET NAME ON VOLUME TST001
FOR DATA SET RCIP00S.JDH.DUMP.PROD


How is it that the abnormal-termination DISP isn't properly deleting the dataset? It remains uncatalogued on the volume. Here's what I found about the JCL ERROR in STEP0 with MOD,DELETE,DELETE:

“If the system cannot find volume information for the data set on the DD statement, in the catalog, or passed with the data set from a previous step, the system assumes that the data set is being created in this job step.”
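That quote suggests why the delete step fails: the dataset is uncatalogued, so unless the DD statement itself supplies volume information, the system assumes a new dataset is being created. A sketch of a delete step that names the volume explicitly (dataset name and volume serials taken from the messages above; UNIT=SYSDA is an assumption, and the step would need to be repeated per volume):

```jcl
//* Delete the uncataloged copy on one specific volume. Without
//* VOL=SER the system cannot locate the dataset and treats the
//* step as a new allocation instead of a delete.
//DELSTEP  EXEC PGM=IEFBR14
//DELDD    DD  DSN=RCIP00S.JDH.DUMP.PROD,
//             DISP=(OLD,DELETE,DELETE),
//             UNIT=SYSDA,VOL=SER=TST001,SPACE=(TRK,0)
```

DISP=OLD is used rather than MOD here because the dataset is known to exist on that volume.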

Re: Duplicate Dataset

Postby expat » Wed Feb 22, 2012 11:17 pm

Which volume does the catalog point to, if anywhere, for this dataset?

Have you done a 3.4 on each of the volumes listed in the error messages to see if the dataset does indeed exist on those volumes?
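The catalog side of that check can also be done in batch with an IDCAMS LISTCAT, for example (a sketch, not anyone's actual job; dataset name taken from the error messages above):

```jcl
//* Show which volume, if any, the catalog points to for the dataset.
//LISTC    EXEC PGM=IDCAMS
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  LISTCAT ENTRIES('RCIP00S.JDH.DUMP.PROD') VOLUME
/*
```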
expat
 
Posts: 382
Joined: Sat Jun 09, 2007 3:21 pm
Has thanked: 0 time
Been thanked: 1 time

Re: Duplicate Dataset

Postby v1gnesh » Wed Feb 22, 2012 11:20 pm

expat wrote:Which volume does the catalog point to, if anywhere, for this dataset?

Have you done a 3.4 on each of the volumes listed in the error messages to see if the dataset does indeed exist on those volumes?


Yes, it does exist. The dataset is not catalogued at all. It's available on all three volumes, uncatalogued.

ABCDE.JDH.DUMP.PROD TST001
ABCDE.JDH.DUMP.PROD TST002
ABCDE.JDH.DUMP.PROD TST003

Re: Duplicate Dataset

Postby NicC » Thu Feb 23, 2012 1:02 am

What is your JCL for the delete step?
The problem I have is that people can explain things quickly but I can only comprehend slowly.
Regards
Nic
NicC
Global moderator
 
Posts: 2690
Joined: Sun Jul 04, 2010 12:13 am
Location: Pushing up the daisys (almost)
Has thanked: 4 times
Been thanked: 105 times

Re: Duplicate Dataset

Postby v1gnesh » Thu Feb 23, 2012 1:36 am

NicC wrote:What is your JCL for the delete step?


It's there in the first post in this thread.

Re: Duplicate Dataset

Postby NicC » Thu Feb 23, 2012 12:28 pm

Whoops!
What I do see is that in the delete step you are using UNIT=DISK and in your create step you are using UNIT=SYSDA. These may, or may not, point to the same set of DASD.

Oh, and try using SPACE=(TRK,0) in your delete step. You are deleting a file, so you do not need any space allocated. It won't affect your problem, but it is neater.
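Putting both suggestions together, a delete step might look like this (a sketch only; the DD name and dataset name are taken from the error messages earlier in the thread, and UNIT=SYSDA is assumed so that it matches the create step):

```jcl
//* Delete step with the same unit as the create step and a
//* zero-track space request, since no space is needed to delete.
//STEP0    EXEC PGM=IEFBR14
//OUTFILE2 DD  DSN=RCIP00S.JDH.DUMP.PROD,
//             DISP=(MOD,DELETE,DELETE),
//             UNIT=SYSDA,SPACE=(TRK,0)
```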

Re: Duplicate Dataset

Postby expat » Thu Feb 23, 2012 12:55 pm

I suggest that you use IDCAMS for deleting datasets; it is far more reliable than IEFBR14 if your site still uses esoteric DASD pooling. As pointed out by Nic, the unit does differ between the delete and create steps.

Also, you will need to use IDCAMS again to delete the NVR from each of the volumes on which the rogue dataset resides. If the volumes were non-SMS-managed, then IEHPROGM might be required instead.
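For an uncatalogued non-VSAM dataset on an SMS-managed volume, the NVR removal can be sketched like this (dataset name and volume serial taken from the error messages above; UNIT=SYSDA is an assumption, and the job would be repeated for TST002 and TST003 — verify the exact syntax for your release before running it):

```jcl
//* Remove the NVR (VVDS record) for the uncataloged dataset on
//* one volume; FILE points IDCAMS at the volume to be cleaned.
//DELNVR   EXEC PGM=IDCAMS
//VOL1     DD  UNIT=SYSDA,VOL=SER=TST001,DISP=OLD
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  DELETE RCIP00S.JDH.DUMP.PROD FILE(VOL1) NVR
/*
```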

Re: Duplicate Dataset

Postby v1gnesh » Thu Feb 23, 2012 11:22 pm

NicC wrote:Whoops!
What I do see is that in the delete step you are using UNIT=DISK and in your create step you are using UNIT=SYSDA. These may, or may not, point to the same set of DASD.

Oh, and try using SPACE=(TRK,0) in your delete step. You are deleting a file, so you do not need any space allocated. It won't affect your problem, but it is neater.


We haven't had this problem before, and the same JCL has been in use for years. About the SPACE parm, I completely agree, but the client doesn't like being corrected. :(

Re: Duplicate Dataset

Postby v1gnesh » Thu Feb 23, 2012 11:26 pm

expat wrote:I suggest that you use IDCAMS for deleting datasets; it is far more reliable than IEFBR14 if your site still uses esoteric DASD pooling. As pointed out by Nic, the unit does differ between the delete and create steps.

Also, you will need to use IDCAMS again to delete the NVR from each of the volumes on which the rogue dataset resides. If the volumes were non-SMS-managed, then IEHPROGM might be required instead.


The volumes are SMS managed. I will ask the client to use IDCAMS to delete the dataset and the NVR entries next time.
Do you think such a leftover NVR entry caused the E37-08, and subsequently the JCL ERROR while deleting the dataset?
