SVC Dump




SVC Dump

Postby Antonyraj85 » Mon Jun 14, 2010 10:17 am

Dear Friends,

52 IEA793A NO DUMP DATA SETS AVAILABLE FOR DUMPID=039 BY JOB (DB21MSTR).
USE THE DUMPDS COMMAND OR REPLY D TO DELETE THE DUMP
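For context, the two operator actions the message offers look roughly like this (the dump data set number and volume serial below are illustrative, not from the actual system):

```
R 52,D                     Reply D to delete the dump and free the resource
DUMPDS ADD,DSN=(40)        Add a pre-allocated SYS1.DUMP40 to the pool
DUMPDS ADD,VOL=(VOL001)    Add a volume for dynamically allocated dumps
DUMPDS ALLOC=ACTIVE        Enable automatic dump data set allocation
```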

The above is the console message we got. We have activated the automatic dump allocation command in the COMMNDxx member, but it is trying to allocate more than 9000 MB, so the system is not able to allocate the data set. I could also alter the ACS routine; that is the next option. Is it possible right now, by replying to message 52, to capture the dump content into a normal DASD data set (a manually created data set of 9000 MB)?


Can anyone please reply?

Thanks in advance!
Antonyraj85
 
Posts: 79
Joined: Mon Jun 14, 2010 9:51 am

Re: SVC Dump

Postby Robert Sample » Mon Jun 14, 2010 3:43 pm

Research the MVS Initialization and Tuning Reference and Guide manuals for details about the dump data sets. This message normally shows up on our system when there have been repeated dumps and the dumps are not being cleared quickly enough. And as far as I know (the manuals will tell you for sure), it is not possible to use a manually created "normal" data set for a dump data set -- that's the whole point of the PARMLIB definitions for dump data sets.
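As a sketch of what those PARMLIB definitions typically look like (the name pattern and SMS class here are assumptions for illustration, not the poster's actual setup), a COMMNDxx member that enables dynamic dump allocation might contain:

```
COM='DUMPDS NAME=&SYSNAME..DUMP.D&DATE..T&TIME..S&SEQ.'
COM='DUMPDS ADD,SMS=(DUMPCLAS)'
COM='DUMPDS ALLOC=ACTIVE'
```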
Robert Sample
Global moderator
 
Posts: 3719
Joined: Sat Dec 19, 2009 8:32 pm
Location: Dubuque, Iowa, USA

Re: SVC Dump

Postby Antonyraj85 » Mon Jun 14, 2010 4:52 pm

Dear Robert,

Thanks a lot for your reply. One last question: is there any alternate method to get the dump? The application team needs it.

Thanks in Advance.
Antonyraj85
 

Re: SVC Dump

Postby Robert Sample » Mon Jun 14, 2010 5:17 pm

I would have to do some research -- 9 GB for a dump is extremely large. I know it is possible to use more than one pack for dump data sets, but I'm not sure whether an individual dump can span packs.

Your best bet would be to open an issue with IBM's DB2 team about the dump size and how to capture it.
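One setting worth raising with IBM (an assumption on my part, not something discussed above) is the SDUMP MAXSPACE option, which caps how much storage an SVC dump may capture and so indirectly bounds the size of the dump data set:

```
D D,O                          Display the current dump options
CD SET,SDUMP,MAXSPACE=2500M    Cap SVC dump capture (2500M is illustrative)
```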
Robert Sample
Global moderator
 

Re: SVC Dump

Postby Antonyraj85 » Mon Jun 14, 2010 6:00 pm

Thanks a lot. I will raise this with IBM's DB2 team.
Antonyraj85
 

Re: SVC Dump

Postby Antonyraj85 » Wed Jun 16, 2010 3:34 pm

Hi Robert,

I went through an IBM z/OS Tech Doc regarding SVC dumps, and it mentions:

Do not specify multi-volume dump data sets, as they are not supported by z/OS.

This applies to both SYS1.DUMPxx and dynamically allocated data sets, and can cause SWA corruption in the DUMPSRV address space. A data set would only support a dump of 65535 TRKS; any dump requiring more than this would have a dynamic allocation failure:

IGD17051I ALLOCATION FAILED FOR DATA SET LSE.SVC090.AMMLP1.D020321.T10
4951.S00002 , PRIMARY SPACE EXCEEDS 65535 TRKS
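The 65535-track ceiling explains why a 9000 MB dump cannot fit on a single volume. A rough calculation (assuming 3390 DASD geometry, where a track holds at most 56,664 bytes; usable capacity is lower depending on block size, so treat this as an upper bound):

```python
# Upper bound on a single-volume dump data set, assuming 3390 DASD.
TRACK_BYTES_3390 = 56_664   # maximum bytes per 3390 track
MAX_TRACKS = 65_535         # single-volume track limit cited above

max_dump_bytes = MAX_TRACKS * TRACK_BYTES_3390
needed_bytes = 9000 * 1024**2   # the ~9000 MB dump from the original post

print(f"single-volume ceiling: {max_dump_bytes / 1024**3:.2f} GiB")
print(f"dump needs:            {needed_bytes / 1024**3:.2f} GiB")
print("fits on one volume?   ", needed_bytes <= max_dump_bytes)  # False
```

So even at best the ceiling is about 3.46 GiB per volume, far below the roughly 8.8 GiB this dump requires.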

Thanks
Antonyraj85
 

