
Segmentation Fault with a core dump

Posted: Mon Jul 12, 2004 9:33 am
by vjreddy65
All,
I have a custom operator being used in my job. Iam getting an error "Contents of phantom output file=> RT_SC524/OshExecuter.sh[16]: 65112 Segmentation fault". Soon after this it coredumps. Do anyone has any idea, what it is?? Does it have to do anything with the custom operator Iam using?

Any idea is appreciated.

-vj

Posted: Tue Jul 13, 2004 4:28 pm
by gh_amitava
Hi,

Are you using a BASIC Transformer in your job design? It is not recommended in a PX environment.

Regards
Amitava

Re: Segmentation Fault with a core dump

Posted: Wed Jul 14, 2004 7:50 am
by Eric
This sounds like it is connected with the custom operator. Perhaps there is a row of data that is not formed correctly and the operator does not know how to handle it?
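If that is the case, the usual fix is defensive handling inside the operator. As a rough illustration (plain C++, not the actual DataStage custom operator API; the record format, delimiter and helper names here are made up), the sketch below guards the field count before indexing, so a short or malformed row is rejected instead of triggering the undefined behaviour behind a segmentation fault:

// Illustrative only: a generic C++ sketch (not the DataStage custom operator
// API) showing how an unchecked assumption about row layout can index past
// the end of a field list, while a guarded version rejects the row.
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical helper: split a delimited record into fields.
static std::vector<std::string> splitRecord(const std::string& record, char delim) {
    std::vector<std::string> fields;
    std::stringstream ss(record);
    std::string field;
    while (std::getline(ss, field, delim)) {
        fields.push_back(field);
    }
    return fields;
}

// Guarded per-row processing: verify the field count before indexing,
// so a malformed row is rejected rather than dereferenced blindly.
static bool processRow(const std::string& record, std::string& out) {
    const std::vector<std::string> fields = splitRecord(record, '|');
    if (fields.size() < 3) {                 // expected layout: id|name|amount
        std::cerr << "rejecting malformed row: " << record << '\n';
        return false;
    }
    out = fields[0] + "," + fields[2];       // safe: bounds already checked
    return true;
}

int main() {
    const std::vector<std::string> rows = {
        "1001|widget|9.99",    // well-formed
        "1002|gadget"          // malformed: missing the third field
    };
    for (const std::string& r : rows) {
        std::string result;
        if (processRow(r, result)) {
            std::cout << result << '\n';
        }
    }
    return 0;
}

The same idea applies to pointers returned by lookups or conversions inside the operator: check them before dereferencing, and log the offending row rather than letting the process fall over.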

Contents of phantom output file

Posted: Thu Jul 15, 2004 11:26 am
by gleblanc
I have the same problem: a job that ran successfully yesterday is failing today.

The job has:
Sequential File stage -> link -> Transformer stage -> link -> TDMLoadPXStage

resulting in the following messages:

Contents of phantom output file =>
RT_SC33/OshExecuter.sh[16]: 29445 Memory fault
Contents of phantom output file =>
DataStage Job 33 Phantom 29446
Parallel job reports failure (code 139)

Any idea?
Thanks,
Gilles

Posted: Fri Jul 16, 2004 3:55 am
by leo_t_nice
Hi

How long did this job run for?

Did it fall over immediately, or did it process a number of rows (and if so, how many)? I had a similar problem (though I forget the exact message, it did involve a segmentation violation) when using the TDMLoad stage. In our case (PX 7.01, HP-UX) it was caused by an apparent memory leak. Try running "top" on your server and watch the memory used by your job. We could run until the memory used by the job was around 800 MB, then bang!
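For what it's worth, the failure mode looks roughly like the sketch below (plain C++, purely illustrative and not the TDMLoad stage code; the per-row buffer size is made up): every row allocates working memory that is never released, so resident memory climbs steadily in "top" until the process dies.

// Illustrative only: a tiny C++ sketch of the per-row leak pattern described
// above. Each "row" allocates a buffer that is never freed, so resident
// memory grows monotonically with row count until the process hits a limit.
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::vector<char*> leaked;               // keeps pointers alive, never freed
    const std::size_t rowSize = 64 * 1024;   // hypothetical per-row working buffer

    for (std::size_t row = 0; row < 1000; ++row) {
        char* buffer = new char[rowSize];    // allocated for every row...
        buffer[0] = 'x';
        leaked.push_back(buffer);            // ...but no matching delete[]
        if (row % 100 == 0) {
            std::cout << "rows processed: " << row
                      << ", leaked approximately "
                      << (row * rowSize) / 1024 << " KB\n";
        }
    }
    // In a long-running job this steady growth is what you would see
    // climbing in "top" until the job fails.
    return 0;
}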

Our solution was to use a server shared container with the TDMLoad stage. The performance is pretty good and the job doesn't fail :)

Hope this helps