Search found 30 matches

by manu.dwhds
Thu Jul 11, 2013 6:20 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Write to hashed file is taking a long time (nearly 13 hours)
Replies: 4
Views: 3062

Create file: not selected, hence the options to choose a hashing method are disabled.
Update action: Clear file before writing.
Key column: VarChar(255), not null

Please suggest the best options for writing the data into the hashed file.

Regards,
Mano
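A minimal diagnostic sketch for the hashed-file write above: slow writes to a hashed file are often a sizing problem, and the file's type and sizing can be inspected from the DataStage Administrator command window, assuming the file was created in the project account (the file name below is only a placeholder).

    ANALYZE.FILE MyHashFile

If the report shows the file is poorly sized for the data volume being written, pre-sizing or resizing it before the load is the usual next step.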
by manu.dwhds
Thu Jul 11, 2013 1:28 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Write to hashed file is taking a long time (nearly 13 hours)
Replies: 4
Views: 3062

Write to hashed file is taking a long time (nearly 13 hours)

Dear DS Gurus, I have an issue in production where one of the jobs is taking 13 hours to write data into a hashed file. Job design: extract the data from a Teradata stage, then a Transformer (dropping one column out of 4 columns), then write 3 columns into the hashed file. Write mode: clear file before writing. Tera...
by manu.dwhds
Wed Apr 17, 2013 6:01 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need to find elapsed time from start time and end time
Replies: 2
Views: 1457

Need to find elapsed time from start time and end time

Dear Experts, could you please suggest a Transformer function to find the difference between a start time and an end time as the elapsed time? For example: Column A = 18/04/2013 8:54:09, Column B = 18/04/2013 8:55:35, Completion Time = 00:01:26. I need to derive the completion time in the Transformer. Timely help is mu...
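A minimal sketch of one way to derive this in a parallel Transformer, assuming Column A and Column B arrive as strings in dd/mm/yyyy hh:nn:ss form and the elapsed time is under 24 hours; the link and column names are placeholders, and the format mask may need adjusting (for example, for single-digit hours):

    TimeFromMidnightSeconds(
        SecondsSinceFromTimestamp(
            StringToTimestamp(lnk_in.ColumnB, "%dd/%mm/%yyyy %hh:%nn:%ss"),
            StringToTimestamp(lnk_in.ColumnA, "%dd/%mm/%yyyy %hh:%nn:%ss")))

SecondsSinceFromTimestamp returns the elapsed seconds between the two timestamps, and TimeFromMidnightSeconds turns that number back into an hh:mm:ss time; whether the base timestamp can be passed as a value rather than a string can vary by version, so treat this as a starting point rather than a tested derivation.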
by manu.dwhds
Wed May 02, 2012 4:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: main_program: There are no sort keys in the dataset schema.
Replies: 2
Views: 2715

main_program: There are no sort keys in the dataset schema.

Hi my dear friends, I am facing the below warning in a delta job and am not sure how to remove it. No data issues have been faced because of it, but the customer has asked me to remove the warning. My job design: I am doing the delta with CDC, then based on the change code I am separating inserts and updates; after th...
by manu.dwhds
Tue Jan 31, 2012 6:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Deadlock issue with Teradata Connector stage
Replies: 3
Views: 2226

Deadlock issue with Teradata Connector stage

Dear all, I am facing a deadlock issue with the Teradata Connector stage doing updates (TPump). The details are as follows. My design: SeqFileStage---->XFm----->TConnector (upsert). This is the current production job design with the deadlock issue. After that I split it into 2 jobs: the first job will update the existing tabl...
by manu.dwhds
Mon Aug 08, 2011 2:10 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata connector issue while loading the data
Replies: 5
Views: 5303

Re: Teradata connector issue while loading the data

Hi, please find my Teradata Connector settings while loading the data. Total source count: 17 lakh records in the sequential file. Write mode: Insert. Access method: Bulk. Table action: Append. Bulk access: Load. Error table 1: yes, as a parameter. Error table 2: yes, as a parameter. Log table: yes, as a parameter. Work ta...
by manu.dwhds
Thu Aug 04, 2011 11:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata connector issue while loading the data
Replies: 5
Views: 5303

Please share any ideas on the same.
by manu.dwhds
Wed Aug 03, 2011 3:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata connector issue while loading the data
Replies: 5
Views: 5303

Teradata connector issue while loading the data

Hi, my job design: SeqFile stage--->Transformer-------Teradata Connector, with a second link from the Transformer |----------AggStage---Teradata Connector. Load type: Insert/Load. I am loading the data in production, around 2 lakh records, but it is not loading; after running for a long time it throws the below error. [IIS-CONN-TERA-005...
by manu.dwhds
Thu Apr 07, 2011 7:34 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DRS stage: Inserted value too large for column, row rejected
Replies: 4
Views: 4088

DRS stage: Inserted value too large for column, row rejected

Hi, I am trying to load data into Oracle using the DRS stage in upsert mode (update then insert). While loading, rows get rejected with a warning like "Inserted value too large for column" on IPC_out, but I have compared the columns and the data type sizes and both are in sync. I am still getting the error, and I also tried ...
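One hedged thing to check when the declared lengths already match, as they do here: trailing blanks or padding (or multi-byte characters counted as bytes on the Oracle side) can still push the inserted value past the column length. A minimal server Transformer sketch, with a hypothetical link/column name and a hypothetical target length of 255:

    Output derivation before the DRS stage:   Trim(lnk_in.IPC_value)
    Constraint on a reject-style link to catch oversize rows:   Len(Trim(lnk_in.IPC_value)) > 255

This only illustrates the check; the real column name, declared length and character-set question have to be confirmed against the Oracle table definition.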
by manu.dwhds
Tue Feb 15, 2011 11:10 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading two different delimiters in a single flat file
Replies: 4
Views: 8845

Thanks for the reply.
I have a flat file that contains data where a few of the rows are comma-delimited and a few are pipe-delimited. When I read it using the Sequential File stage it is not read properly. Please let me know how to set the property in the Sequential File stage.
by manu.dwhds
Tue Feb 15, 2011 10:56 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading two different delimiters in a single flat file
Replies: 4
Views: 8845

Reading two different delimiters in a single flat file

Hi,
Could anybody please let me know how to read a single file that contains two different delimiters (e.g. my flat file contains pipe and comma)?
Please suggest which stage we can use to perform this read operation.
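A minimal sketch of one way to handle this in a parallel job, assuming the Sequential File stage reads each record as a single VarChar column (one column defined, with the field delimiter set so the line is not split) and a Transformer then normalises and splits it; RawLine and the link name are placeholders:

    Column1 derivation:  Field(Convert("|", ",", lnk_in.RawLine), ",", 1)
    Column2 derivation:  Field(Convert("|", ",", lnk_in.RawLine), ",", 2)

Here Convert("|", ",", ...) first turns any pipes into commas so that a single Field() split on "," works for both kinds of rows. This assumes no field legitimately contains the other delimiter character; if it can, the rows would first have to be separated by delimiter type before splitting.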
by manu.dwhds
Thu Jun 24, 2010 12:47 am
Forum: General
Topic: Reg: Runtime column propagation with RCP column matching
Replies: 3
Views: 2394

What is your source? Where are these extra two columns (lnk.acc & lnk.amt) sourced from? Can you explain your entire job design in more detail? One source was a flat file and one join stage input was a dataset; these are all RCP enabled, but while loading into the target sequential file, in the transformer I have...
by manu.dwhds
Thu Jun 24, 2010 12:17 am
Forum: General
Topic: Reg: Runtime column propagation with RCP column matching
Replies: 3
Views: 2394

Reg: Runtime column propagation with RCP column matching

Hi, I have an issue with RCP enabled. My design: I have enabled RCP for the job, but there is a validation on two columns, like If lnk.acc = 0 Then lnk.acc Else lnk.amt. These two columns are different for each integration and have to be passed at runtime where RCP is enabled; this job is generic for all ...
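For reference, the validation mentioned in the post written out as a Transformer derivation, using the column names lnk.acc and lnk.amt exactly as given there:

    If lnk.acc = 0 Then lnk.acc Else lnk.amt

One point worth keeping in mind: a column referenced by name in a derivation has to be defined explicitly on the link, so RCP can only propagate the remaining, untouched columns; a Transformer derivation cannot take the column names themselves from a job parameter, which is the tension a generic, RCP-enabled design runs into here.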