Search found 30 matches

by sgubba
Fri Feb 27, 2015 3:55 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture PL/SQL anonymous block data and write to a flat file
Replies: 1
Views: 1903

Capture PL/SQL anonymous block data and write to a flat file

I have a table where I store queries. My job is to read the table, execute each query, and write the output to a flat file. For that I have a PL/SQL anonymous block that reads the table and executes the query using EXECUTE IMMEDIATE. I am able to run the PL/SQL anonymous block from the Connector stage, but ...
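For context, the pattern described here — read stored queries from a driver table, execute each one dynamically, and write the result set to a flat file — can be sketched outside DataStage like this (Python with sqlite3 standing in for Oracle; all table and column names are invented for illustration):

```python
import csv
import os
import sqlite3
import tempfile

# In-memory database stands in for Oracle; query_store holds the queries to run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE query_store (id INTEGER, sql_text TEXT)")
conn.execute("CREATE TABLE accounts (acct_id INTEGER, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.execute(
    "INSERT INTO query_store VALUES (1, 'SELECT acct_id, balance FROM accounts')"
)

out_path = os.path.join(tempfile.gettempdir(), "query_output.csv")
with open(out_path, "w", newline="") as f:
    writer = csv.writer(f)
    # Read each stored query, then run it dynamically (the EXECUTE IMMEDIATE step).
    for (sql_text,) in conn.execute(
        "SELECT sql_text FROM query_store ORDER BY id"
    ).fetchall():
        cur = conn.execute(sql_text)
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur.fetchall())                  # data rows
```

The flat file then contains one header row plus the rows each stored query returned.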
by sgubba
Sat Jan 12, 2013 1:07 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Division by Zero
Replies: 4
Views: 2213

Thanks Ray

I will certainly handle that division by zero

My question is: will it assign a non-numeric character?

I want to know if this column is the issue, or
whether another column is causing the non-numeric value to be inserted into a numeric column

Regards
by sgubba
Fri Jan 11, 2013 4:37 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Division by Zero
Replies: 4
Views: 2213

Division by Zero

Hi all, in one of my jobs I got this warning message: analyzeAndPrepare: divisor is 0. I suppressed the warning and ran the job. The next job, which is a load job, is failing with "Unable to insert a record into the table" due to ORA-01722: invalid number. Is this error because of the division by 0 warning? Wh...
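As a side note, the usual fix for this warning is to guard the divisor before dividing, which a Transformer expression like IF divisor = 0 THEN default ELSE num/divisor amounts to. A minimal sketch of that logic (the default value of 0.0 is an assumption, not from the post):

```python
def safe_ratio(numerator, denominator, default=0.0):
    """Guarded division: return `default` when the divisor is 0
    instead of triggering a divide-by-zero downstream."""
    if denominator == 0:
        return default
    return numerator / denominator
```

Guarding the division keeps a sentinel value out of the output only if the chosen default is itself a valid number for the target column.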
by sgubba
Tue Sep 14, 2010 3:37 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Null handling error when writing to sequential file
Replies: 0
Views: 859

Null handling error when writing to sequential file

My job calculates max and min values based on a column, so I used a Transformer with that column and added a dummy column with value 1. I then use an Aggregator with the dummy column as the key, and use the max value and min value output functions. I also use allow null output ...
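The dummy-key trick described above — add a constant column, group on it, and take max and min while letting NULLs pass through — behaves like this sketch (column name and sample data are invented; None stands in for NULL):

```python
rows = [{"amount": 5}, {"amount": None}, {"amount": 12}, {"amount": 3}]

# Transformer step: tack a constant dummy key onto every row so the
# Aggregator can treat the whole dataset as one group.
keyed = [{"dummy": 1, **row} for row in rows]

# Aggregator step: group on the dummy key, skip NULLs, then take min/max.
values = [r["amount"] for r in keyed if r["dummy"] == 1 and r["amount"] is not None]
result = {"min_amount": min(values), "max_amount": max(values)}
```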
by sgubba
Wed Jun 23, 2010 2:00 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle Stage error
Replies: 1
Views: 1181

Oracle Stage error

Has anyone encountered this error?


Unable to insert a record into the table due to orchsun4.a root Exp $.
by sgubba
Fri Feb 26, 2010 5:05 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transformer Records getting dropped
Replies: 2
Views: 1894

Transformer Records getting dropped

There is a column, JN_ADDRESS_OUT.RESTRICTED_ACCT_IND. I am using this column in two locations, and these are the transformations: IF ISNULL(JN_ADDRESS_OUT.RESTRICTED_ACCT_IND) THEN SETNULL() ELSE JN_ADDRESS_OUT.RESTRICTED_ACCT_IND and IF NULLTOEMPTY(JN_ADDRESS_OUT.RESTRICTED_ACCT_IND)='Y' THEN '1' ELSE '0' S...
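The two derivations quoted in the snippet translate directly into this minimal sketch, with None standing in for NULL (function names are mine, not from the job):

```python
def passthrough_with_null(ind):
    """IF ISNULL(col) THEN SETNULL() ELSE col — pass the value through,
    keeping NULL as NULL."""
    return None if ind is None else ind

def restricted_flag(ind):
    """IF NULLTOEMPTY(col) = 'Y' THEN '1' ELSE '0' — NULL is first
    converted to an empty string, so a NULL indicator maps to '0'."""
    return "1" if (ind or "") == "Y" else "0"
```

Note that the first expression can emit NULL, so rows get dropped if the output column is not nullable; the second never emits NULL because NULLTOEMPTY removes it before the comparison.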
by sgubba
Sun Feb 21, 2010 5:57 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Accessing Variables in Sequences
Replies: 2
Views: 1471

Accessing Variables in Sequences

I want to check if there is a method to read a value from an Oracle table and pass that value to a sequence. I know I can do this by having a job that reads from the Oracle table and writes to a file, then reading that file using a read-file stage and using that value in the sequence...... To give my scena...
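The file-based workaround the post describes — one job writes the value to a handoff file, and the sequence reads the file back as a parameter — looks like this in outline (the value and file name are placeholders, not from the post):

```python
import os
import tempfile

# Step 1 (the job): fetch a value and write it to a handoff file.
value_from_table = "20100221"  # hypothetical value read from the Oracle table
handoff = os.path.join(tempfile.gettempdir(), "seq_param.txt")
with open(handoff, "w") as f:
    f.write(value_from_table)

# Step 2 (the sequence): read the handoff file and use the value as a parameter.
with open(handoff) as f:
    param = f.read().strip()
```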
by sgubba
Tue Aug 04, 2009 1:52 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading junk characters from Oracle 11g
Replies: 5
Views: 3112

Issues in the sense: I know the data in source and target are the same, but in my Change Capture stage they are not matching, and all the records are coming out as inserts. I tried to view the data in the ODBC stage and it's showing junk data
by sgubba
Tue Aug 04, 2009 9:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading junk characters from Oracle 11g
Replies: 5
Views: 3112

I tried using UTF-8 while loading and reading. The job I am trying to run has a Change Capture stage where one source is Oracle and the other is ODBC. It's kind of a reconciliation job: the first job loads from Oracle 10g (source) to 11g (target), and when I am loading into the target I use ODBC. In my reconcilati...
by sgubba
Sun Jul 26, 2009 8:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading junk characters from Oracle 11g
Replies: 5
Views: 3112

Reading junk characters from Oracle 11g

Hi, I have a problem reading data from an Oracle table which is 11g. I am able to load the table without any problem using the ODBC stage. I use the ASCL_ISO8859-1 character set when I am loading and have no issues. But when I use the same character set to read, the data comes back as junk characters. This is the charac...
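A likely cause of such junk characters is a charset mismatch between the load path and the read path. This sketch shows the general mechanism — UTF-8 bytes decoded as ISO-8859-1 turn into mojibake, and re-decoding with the right charset recovers the text (the sample string is mine, not from the post):

```python
# A UTF-8 encoded string decoded as ISO-8859-1 comes back as mojibake,
# which is one way "junk characters" appear when load/read charsets differ.
original = "café"
stored_bytes = original.encode("utf-8")            # bytes as written to the table
misread = stored_bytes.decode("iso-8859-1")        # read back with the wrong charset
roundtrip = misread.encode("iso-8859-1").decode("utf-8")  # correct charset recovers it
```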
by sgubba
Fri Jun 19, 2009 9:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Stage Speed
Replies: 20
Views: 6187

The drivers we are using support 10g. We don't have 11g drivers.
by sgubba
Fri Jun 19, 2009 9:14 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Stage Speed
Replies: 20
Views: 6187

I am trying to insert into Oracle 11g. That's the reason we are using ODBC.
by sgubba
Fri Jun 19, 2009 9:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Stage Speed
Replies: 20
Views: 6187

Insert Only
by sgubba
Fri Jun 19, 2009 8:15 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Stage Speed
Replies: 20
Views: 6187

Yeah,

The record length is really small. It's extremely slow... I have no idea what to do. Please suggest how to improve the performance.

Shyam
by sgubba
Fri May 29, 2009 12:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Stage Speed
Replies: 20
Views: 6187

It's barely 7-8 columns, which amounts to 100-150 bytes.