
Write to ORA_BULK stage

Posted: Thu Jan 26, 2006 4:14 am
by murphydt
Hi, when I write over 1 million records to an Oracle Bulk stage it falls over with the following error codes. It doesn't get to the point where it kicks off sqlldr. Has anybody come across this?

From previous run
DataStage Job 47 Phantom 6582
jobnotify: Unknown error
DataStage Phantom Finished.
[6593] DSD.StageRun CpCustMeasuresTransforms. CpCustMeasuresTransforms.CTransformerStage3 1 0/0 - core dumped.

From previous run
DataStage Job 47 Phantom 6593
Abnormal termination of DataStage.
Fault type is 10. Layer type is BASIC run machine.
Fault occurred in BASIC program ORABULK.RUN at address a9

Posted: Thu Jan 26, 2006 4:46 am
by ArndW
The dump is coming from the Transform stage, not the bulk load stage. If you put a constraint "1=2" in the transform so that no rows are passed to the ORA passive stage does the error still occur?

Posted: Thu Jan 26, 2006 8:09 am
by chulett
What version of DataStage? Are you running this in Manual or Automatic mode?

Posted: Thu Jan 26, 2006 9:34 am
by murphydt
ArndW wrote: The dump is coming from the Transform stage, not the bulk load stage. If you put a constraint "1=2" in the transform so that no rows are passed to the ORA passive stage does the error still occur?


No, it doesn't. It only occurs with large numbers of records; I can pass smaller result sets with no problems.
I can write 4 million records to a sequential file from the same transform with no problems. I have workarounds, but I'd rather know what is causing the problem.

Posted: Thu Jan 26, 2006 9:37 am
by ArndW
As Craig has mentioned, switching modes in Oracle might make a difference or produce a better error message.

Posted: Thu Jan 26, 2006 9:37 am
by murphydt
DataStage version is 7.1.
It is the older ORA_BULK-type stage, so there is no Manual or Automatic option.

Posted: Thu Jan 26, 2006 9:42 am
by chulett
Interesting... the ORABULK stage in 7.0.1 had both modes. :?

Can you go into Manager, find the stage listed under Stage Types and let us know the date and version number on it for grins?

Posted: Thu Jan 26, 2006 9:44 am
by ArndW
At version 7.1 there is an Automatic/Manual option. (I have my 7.1 Designer open at the screen that has it...)

Posted: Thu Jan 26, 2006 9:54 am
by murphydt
ArndW wrote: At version 7.1 there is an Automatic/Manual option. (I have my 7.1 Designer open at the screen that has it...)

Not on the Oracle 7 Loader in the version that I have, there isn't.
In Manager: the stage name is Oracle 7 Loader, the description is Bulk Loader, and the version is 1.0.0.

Posted: Thu Jan 26, 2006 10:22 am
by ArndW
I am using the Oracle 8.1 Bulk Loader, which also comes with the product at version 7.1.
In your version 7 loader, specify a control file and two data files and see if they are generated; you can then run them from UNIX to see if you get a better error message.
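For reference, a minimal sketch of what a hand-runnable SQL*Loader setup looks like. The table, column, and file names below are invented for illustration (they are not from the job in question); the real control file is the one the stage generates.

```shell
#!/bin/sh
# Create a minimal SQL*Loader control file of the general shape the
# bulk loader stage produces. Names here are hypothetical.
cat > cp_cust_measures.ctl <<'EOF'
LOAD DATA
INFILE 'cp_cust_measures.dat'
APPEND
INTO TABLE CP_CUST_MEASURES
FIELDS TERMINATED BY ','
(CUST_ID, MEASURE_VALUE)
EOF

# From UNIX you would then invoke the loader manually (requires an
# Oracle client; shown commented out since credentials are site-specific):
# sqlldr userid=scott/tiger control=cp_cust_measures.ctl log=cp_cust_measures.log
```

Running sqlldr by hand like this gives you the sqlldr log file directly, which is usually a much better error message than the job log.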

Posted: Thu Jan 26, 2006 10:28 am
by murphydt
The data file and control file are created, but it falls over during population of the data file. It isn't an Oracle issue, as it doesn't seem to get to the point where it calls SQLLDR, i.e. the error comes from the transform stage.

Posted: Thu Jan 26, 2006 10:33 am
by ArndW
So you had DS create more than one data file and an explicit control file and it still failed? I am surprised that it failed and that you managed to reproduce it so quickly. It looks like a memory issue caused by getting too much data into one file.

You can debug your core file to get a detailed cause, but that is more work. You need to have dbx or a similar debugger installed; use it with the core file (dumped into your project directory) and the dssh executable (it may be another program, but the core file's header will tell you which one to use).
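A rough sketch of that first step, assuming the core landed in the project directory and that dbx is installed. The path to dssh is an assumption; as noted above, the core's header identifies the real program to pass to the debugger.

```shell
#!/bin/sh
# Check for a core file in the current (project) directory and report
# which executable produced it. Paths below are hypothetical.
CORE=core
if [ -f "$CORE" ]; then
    file "$CORE"     # typically reports something like "... from 'dssh'"
    # Then open it in dbx and print the stack with 'where', e.g.:
    # dbx $DSHOME/bin/dssh "$CORE"
else
    echo "no core file in $(pwd)"
fi
```

Inside dbx, the `where` command prints the call stack at the point of the crash, which narrows down which stage code actually faulted.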

Yet another option is to use the VLIST utility within UniVerse/DataStage to look at the instruction at hex address A9 in the ORABULK.RUN program.

Posted: Thu Jan 26, 2006 10:34 am
by chulett
Are you actually trying to bulk load an Oracle 7.x database? You may need to contact Support on this, I don't recall ever actually having used the stage in question... sorry.

Posted: Thu Jan 26, 2006 10:48 am
by murphydt
I haven't used it to create two data files; I can try this.
I have plenty of workarounds. The reason that stage is used is a historical one that I don't know. It is used a lot elsewhere in the warehouse as well, even though it is a 9i database that gets written to. This problem has only started happening recently. My next approach is to upgrade the stage to the newer version for bulk loading an Oracle 9 database and re-test.
I'll let you know how I get on.
Thanks for your help.