
check a huge dataset

PostPosted: Fri Dec 14, 2018 6:31 pm
by samb01
Hello,

I have a dataset with 9 932 248 and 1 294 125 tracks.
I just want to check the integrity of the dataset.
My dataset is unreadable because of this message:


IEC020I 001-3,XJOB,S30,SYSUT1-0024,2664,PXUMPY,
IEC020I X.DATASET                                                
IEC020I NON-ACCEPTABLE ERROR            
 


So, before doing the copy, I'd like to check the dataset.
I did it with ICETOOL:


//S1   EXEC PGM=ICETOOL                                
//TOOLMSG DD SYSOUT=*                                  
//DFSMSG  DD SYSOUT=*                                  
//IN      DD DSN=X.DATASET,DISP=SHR      
//TOOLIN  DD *                                          
COUNT FROM(IN)                                          
/*                                                      
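
(For comparison, the same end-to-end read can also be expressed as a plain DFSORT copy to a DUMMY output. This is only a sketch, assuming X.DATASET is the same sequential dataset; it still has to read every track, so the I/O cost is the same as the COUNT.)

//*  COPY X.DATASET TO A DUMMY OUTPUT JUST TO FORCE A FULL READ
//S2       EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=X.DATASET
//SORTOUT  DD DUMMY
//SYSIN    DD *
  OPTION COPY
/*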
 


The ICETOOL step abended, which means the dataset is not correct, but it took too much elapsed time:

TOTAL CPU TIME= .75 TOTAL ELAPSED TIME= 22.87

I'd like to know if there is a program that could check a huge dataset quickly (faster than PGM=ICETOOL).

Thanks for your help.

Re: check a huge dataset

PostPosted: Fri Dec 14, 2018 6:57 pm
by enrico-sorichetti
That's not an error that the application program can solve by itself;
let your storage support handle it.

Re: check a huge dataset

PostPosted: Fri Dec 14, 2018 7:30 pm
by Robert Sample
but it took too much elapsed time
Elapsed time is impacted by a large number of factors, such as:
- the size of the data set
- the number of address spaces (batch jobs, TSO users, started tasks and possibly OMVS processes) executing in the LPAR, and their relative priorities compared to yours
- how busy the CPU is
- the WLM (Workload Manager) policy
- contention for the channel and for the device
and so forth. Your site support group can assist in optimizing elapsed time, but in general there's not too much you can do to change it. And if your data set has millions of tracks, it is not going to execute in a short amount of elapsed time no matter what you do: it takes a certain amount of time to read those millions of tracks, and that time is an irreducible minimum! Also note that elapsed time can be drastically impacted by these factors; I've seen jobs that normally run in 3 minutes take as much as 5 hours when submitted while the CPU is running 100% busy.

Re: check a huge dataset

PostPosted: Fri Dec 14, 2018 7:54 pm
by samb01
Hello,

I found a really simple program that can easily check the dataset: SYSGENER with a DUMMY output.
It abends when the dataset is unreadable, and it takes 1 minute of elapsed time:

TOTAL CPU TIME= .32 TOTAL ELAPSED TIME= 1.00



//SYSGEN01 EXEC PGM=SYSGENER                                    
//SYSPRINT DD SYSOUT=*                                          
//SYSUT1   DD DISP=SHR,DSN=X.DATASET            
//SYSUT2   DD DUMMY,DCB=(RECFM=VB,LRECL=27994,BLKSIZE=27998)
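
SYSGENER does not appear to be a standard IBM utility name, so the step above presumably relies on a site-specific program or an alias for the usual sequential copy utility. Assuming it behaves that way, an equivalent sketch with IBM's IEBGENER (keeping the same dummy output DCB) would be:

//*  READ X.DATASET END TO END; SYSUT2 IS DUMMY, SYSIN HAS NO CONTROL CARDS
//CHECK01  EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//SYSUT1   DD DISP=SHR,DSN=X.DATASET
//SYSUT2   DD DUMMY,DCB=(RECFM=VB,LRECL=27994,BLKSIZE=27998)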