Search found 126 matches

by Nagac
Tue Jul 22, 2014 9:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Connector - Design time information error
Replies: 3
Views: 3004

ODBC Connector - Design time information error

Hi, I have a parallel job which uses a schema file and the Run Time Column Propagation functionality. The job design is as below: Sequential File --> Column Import --> Transformer --> ODBC Connector. It loads the data, but it throws warnings for each field in the schema file. Just to let you know, there are no transformat...
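For reference, a minimal schema file of the kind used with a Column Import stage (the field names and types here are illustrative, not taken from the original post):

```
// pipe-delimited record layout consumed by Column Import
record {final_delim=end, delim='|', quote=none}
(
  CUST_ID: int32;
  CUST_NAME: string[max=50];
  BALANCE: decimal[10,2];
)
```

With RCP enabled, the columns imported via such a schema propagate downstream without being declared on every stage, which is why the connector only discovers them at run time.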
by Nagac
Mon Mar 03, 2014 11:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Encrypted File
Replies: 1
Views: 910

Encrypted File

Hi

I have a requirement to load files which have been encrypted by third-party applications; the keys will be provided to the ETL team. Using those keys, DataStage needs to read the data from the file and load it into a table.

Is this possible in DataStage at all?
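The usual pattern is to decrypt the file to a scratch area in a before-job step, then let the job read the decrypted flat file as normal. The cipher and key format depend entirely on the third-party tool; the sketch below assumes an openssl-compatible AES-256-CBC file (an assumption) and drives the `openssl` CLI from Python:

```python
import os
import subprocess
import tempfile

def decrypt_file(enc_path, key_path, out_path):
    """Decrypt an openssl AES-256-CBC file using the supplied key file.

    Assumption: the third party encrypted with an openssl-compatible
    cipher; adjust the cipher name to whatever the supplier actually used.
    """
    subprocess.run(
        ["openssl", "enc", "-d", "-aes-256-cbc", "-pbkdf2",
         "-pass", "file:" + key_path, "-in", enc_path, "-out", out_path],
        check=True,
    )

# Round-trip demo: encrypt a sample pipe-delimited file, then decrypt it.
workdir = tempfile.mkdtemp()
key_path = os.path.join(workdir, "key.txt")
plain_path = os.path.join(workdir, "data.txt")
enc_path = os.path.join(workdir, "data.enc")
out_path = os.path.join(workdir, "data.out")
with open(key_path, "w") as f:
    f.write("s3cret-pass\n")
with open(plain_path, "w") as f:
    f.write("id|name\n1|alice\n")
subprocess.run(
    ["openssl", "enc", "-aes-256-cbc", "-pbkdf2",
     "-pass", "file:" + key_path, "-in", plain_path, "-out", enc_path],
    check=True,
)
decrypt_file(enc_path, key_path, out_path)
```

In DataStage the decrypt step would typically run as a before-job ExecSH command so the downstream stages only ever see the clear flat file; remember to remove the clear file afterwards.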
by Nagac
Tue Aug 14, 2012 1:51 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sequential File Stage
Replies: 7
Views: 3342

Thanks Everyone.
by Nagac
Sat Aug 04, 2012 12:11 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sequential File Stage
Replies: 7
Views: 3342

Sequential File Stage

Hi, will there be any difference in performance between reading the file as a single column with the Sequential File stage and then using a Column Import stage to split it into multiple columns, versus reading the file as multiple columns directly in the Sequential File stage and doing the rest of the transformation from there? Thanks, Naga
by Nagac
Wed Feb 29, 2012 5:03 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Schema File
Replies: 2
Views: 1218

Thanks Kryt0n, I thought there would be a solution in the schema file.
by Nagac
Tue Feb 28, 2012 9:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Schema File
Replies: 2
Views: 1218

Schema File

Hi, I have a common job which extracts data from multiple files with n number of fields. In the job we use a Column Import stage with a schema file to split the record into multiple fields, as we initially read the file as a single field. During testing we send more delimiters (|) than expected (Eg: we have two ...
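For reference, a pipe-delimited schema of the sort described (field names illustrative). A record carrying more delimiters than the schema defines fails the import, and the Column Import stage can route such records down its reject link instead of failing the job:

```
record {final_delim=end, delim='|'}
(
  FIELD_1: string[max=255];
  FIELD_2: string[max=255];
)
```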
by Nagac
Sat Oct 22, 2011 8:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: abort the Job
Replies: 8
Views: 3963

That works! But I want to make the job abort when it fails to update at least one record.
by Nagac
Fri Oct 21, 2011 3:48 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lock data Rows in table
Replies: 3
Views: 2620

Research "transaction isolation level". Thanks Ray, I went through the docs and found the Repeatable Read isolation level, and I have a few queries on it. As it says, it locks the row (whichever has been read); does it allow updates in the same process (i mean reading as reference and ...
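On most databases this read-then-update pattern is handled by locking the rows inside one transaction (e.g. SELECT ... FOR UPDATE on DB2 or Oracle). As a runnable illustration of the transaction shape only, here is a sketch using Python's stdlib sqlite3 module (SQLite locks at database granularity rather than per row, so it demonstrates the pattern, not row-level locking):

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manual transactions
conn.execute("CREATE TABLE work (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO work VALUES (?, ?)", [(1, "NEW"), (2, "NEW")])

# BEGIN IMMEDIATE acquires the write lock up front, so no other writer
# can sneak in between our read and our update.
conn.execute("BEGIN IMMEDIATE")
rows = conn.execute("SELECT id FROM work WHERE status = 'NEW'").fetchall()
for (row_id,) in rows:
    conn.execute("UPDATE work SET status = 'DONE' WHERE id = ?", (row_id,))
conn.execute("COMMIT")

done = conn.execute(
    "SELECT COUNT(*) FROM work WHERE status = 'DONE'"
).fetchone()[0]
```

Whether Repeatable Read alone is enough depends on the database: it stops others changing rows you have read, but an explicit update lock (SELECT ... FOR UPDATE or equivalent) is the usual way to also block concurrent readers that intend to update.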
by Nagac
Tue Oct 18, 2011 3:27 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lock data Rows in table
Replies: 3
Views: 2620

Lock data Rows in table

Hi

I have a requirement to extract data from a table (which is common to many processes) and update the same table, but in the meantime no other process should read or update the same data. We are doing this in a single job.

Could someone advise on how to achieve this?

Thanks
Naga
by Nagac
Mon Oct 10, 2011 4:39 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Default value in column
Replies: 4
Views: 2916

Thanks Ray.
The correct one:

record
(
Number:decimal[5,2] {cycle={init=000.00, incr=0}}
)
by Nagac
Sun Oct 09, 2011 2:18 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Default value in column
Replies: 4
Views: 2916

Try it with no space between the "]" and the "{". Thanks Ray, but I am still getting the same warning message, and the data still increments by 1: Column_Generator_12: Field Number of type decimal[5,2]:: unrecognized generator parameters ignored: . I thought of generating this fie...
by Nagac
Sat Oct 08, 2011 4:26 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Default value in column
Replies: 4
Views: 2916

Default value in column

Hi, I want to create a new column with a default value using the Column Generator stage with a schema file. I tried the schema file below, but the column was not created with the default value I gave in the schema; it increments by 1 starting at 0. record ( Number:decimal[5,2] {default=000.00}; ) Could someone fa...
by Nagac
Fri Sep 23, 2011 7:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Warning message : Previous component with same name
Replies: 1
Views: 1421

Warning message : Previous component with same name

Hi, I have a job which creates multiple datasets; one column name is the same in each dataset, and these columns are mapped from one column. Eg: input column Output Column column_name COLUMN_NAME COLUMN_NAME COLUMNA_NAME. I am getting a warning when I execute it; the error is as below. Tfm_Derivation: When checking o...
by Nagac
Fri Sep 16, 2011 6:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel Routine not compile
Replies: 1
Views: 3712

Parallel Routine not compile

Hi, I created simple C++ code with the help of Google and compiled it successfully on the Unix server. I created a routine in DataStage and called the routine in a Transformer, but the compile was not successful, saying: Output from transformer compilation follows: ##I IIS-DSEE-TFCN-00001 13...
by Nagac
Fri Aug 05, 2011 3:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: C++ routine guide
Replies: 6
Views: 3672

Yeah, we can do that.

But I need to gather a few more link row counts as well, along with some other audit-related information.

That is the reason I am going for a routine.