Search found 6797 matches

by DSguru2B
Mon Dec 04, 2006 6:47 am
Forum: Site/Forum
Topic: No Limits.
Replies: 11
Views: 5632

Yup. These guys are way out of our league :wink:
Congrats Craig, did it before the holidays, huh!
And as we all know, Ray is the man !!!
Hats off to you guys 8)
by DSguru2B
Fri Dec 01, 2006 1:48 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Help with "timestamp" for OCI-9
Replies: 41
Views: 11900

That's not long-winded.
This is long-winded :wink:

Code: Select all

Left(in.Col,10):' ':Ereplace(Left(Right(in.Col,12),8),".",":")
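For anyone reading along who doesn't speak DataStage BASIC, here is a rough Python equivalent of that one-liner. I'm assuming the input is shaped like YYYY-MM-DD HH.MM.SS.FFF, which is my guess at the format being parsed:

```python
def to_timestamp(col):
    """Rough Python equivalent of the BASIC one-liner above."""
    date_part = col[:10]        # Left(in.Col, 10) -> the date
    time_part = col[-12:][:8]   # Left(Right(in.Col, 12), 8) -> HH.MM.SS
    # Ereplace(..., ".", ":") -> swap the dots for colons
    return date_part + " " + time_part.replace(".", ":")

print(to_timestamp("2006-12-01 13.48.22.000"))  # 2006-12-01 13:48:22
```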
by DSguru2B
Fri Dec 01, 2006 1:33 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: I want to merge 2 inputfiles into one
Replies: 11
Views: 2457

Is that your final output, or will you need this file for further processing? If it's for further processing, add the extra column in the second file and use a Funnel stage to cat both of these files. If this is your final output, then just cat it at the unix level by cat file1 file2 >...
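If you want to prototype the "pad the second file with the extra column, then cat" idea outside DataStage, a quick Python sketch could look like this. The file names, delimiter, and default value are all made up for illustration:

```python
def merge_files(file1, file2, out, default_col="N/A", sep="|"):
    """Concatenate two delimited files, padding file2's rows with an
    extra column so both files share the same layout."""
    with open(out, "w") as dst:
        with open(file1) as f1:
            for line in f1:          # file1 already has all the columns
                dst.write(line)
        with open(file2) as f2:
            for line in f2:          # pad file2 with the missing column
                dst.write(line.rstrip("\n") + sep + default_col + "\n")
```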
by DSguru2B
Fri Dec 01, 2006 1:08 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Help with "timestamp" for OCI-9
Replies: 41
Views: 11900

Maybe that particular column's derivation is equivalent to what you have specified in red. [shrug] :roll:
How are you parsing your third timestamp? Is it via a routine? Make sure you are calling that routine in the third timestamp's derivation.
by DSguru2B
Fri Dec 01, 2006 1:03 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: what is don't check point run
Replies: 2
Views: 813

If you check that option in a restartable job sequence, that particular job activity will execute every time the sequence is restarted after a failure, regardless of whether it completed before. For example, take a job that builds hashed files: you need to refresh that job because new lookup values keep pouring in. You check the option of...
by DSguru2B
Fri Dec 01, 2006 11:37 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: design consideration
Replies: 7
Views: 2028

The size of the hashed file does matter. By default, hashed files have a limit of 2.2 GB. You can break that barrier by making them 64-bit; search this forum for how to. Is the entire SQL query so long that it doesn't work in the user-defined space? Try running the SQL from DataStage and see the outcome. Get yo...
by DSguru2B
Fri Dec 01, 2006 11:26 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ExecDos
Replies: 15
Views: 3995

Do you guys have MKS Toolkit installed, or a third-party tool like Samba that makes unix visible to windows and vice versa? That might have something to do with it :roll:
by DSguru2B
Fri Dec 01, 2006 11:25 am
Forum: General
Topic: How do I concatenate two input Character fields
Replies: 19
Views: 8927

Make sure you have properly defined the LastName variable as char or varchar.
by DSguru2B
Fri Dec 01, 2006 11:19 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Maximum performance available in DS server for a lookup
Replies: 10
Views: 2974

As Arnd pointed out, rows/sec is meaningless and platform dependent. In other words, it all depends upon what's under the hood. If this concerns you and both your source and lookup are in the same database, just pass a SQL command to do the lookup. That will be much faster, provided such a move is ...
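To illustrate the "let the database do the lookup" suggestion, here is a small sketch using Python's sqlite3 standard module. The table and column names are invented; the point is just that one join replaces a row-by-row lookup:

```python
import sqlite3

# Source and lookup (reference) tables living in the same database.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src(id INTEGER, amount INTEGER);
    CREATE TABLE ref(id INTEGER, name TEXT);
    INSERT INTO src VALUES (1, 100), (2, 200);
    INSERT INTO ref VALUES (1, 'alpha'), (2, 'beta');
""")

# A single join does the lookup server-side in one pass.
rows = con.execute("""
    SELECT s.id, s.amount, r.name
    FROM src s JOIN ref r ON r.id = s.id
    ORDER BY s.id
""").fetchall()
print(rows)  # [(1, 100, 'alpha'), (2, 200, 'beta')]
```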
by DSguru2B
Fri Dec 01, 2006 7:00 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Function Row_row_to_buffer failed means
Replies: 2
Views: 676

Load the output to a file, and then in a second job just have a straight load between the file and the bulk loader. See if the error persists.
Is that the only error message you are getting? What messages are there in the message file? See if any additional messages are there.
by DSguru2B
Thu Nov 30, 2006 7:58 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: generating jobs reports
Replies: 60
Views: 24753

Great work Kim. I like your patience and step by step explanation. You can really make a poster feel comfortable. Keep it up :)
by DSguru2B
Thu Nov 30, 2006 6:49 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fixed Width File
Replies: 5
Views: 1236

The simplest way would be to define the target file as fixed-width and specify the fields as char. That should take care of it.
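As a side note, if you want to verify the layout outside DataStage, padding fields to fixed widths is easy to sketch in Python. The field widths here are invented:

```python
def fixed_width(record, widths):
    """Pad each field to its declared width (and clip any overflow),
    like char columns in a fixed-width sequential file."""
    return "".join(str(v).ljust(w)[:w] for v, w in zip(record, widths))

print(fixed_width(("AB", "1", "xyz"), (4, 3, 5)))  # 'AB  1  xyz  '
```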
by DSguru2B
Tue Nov 28, 2006 5:59 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timeout in DS
Replies: 13
Views: 2670

I don't know if the Enterprise SQL Server stage has that, then. I can only confirm that tomorrow, after looking into that stage; if someone else has access to it, they can shed more light on it. Have a simple job before your job that just truncates the table. You can pass that in user-defined SQL.
by DSguru2B
Tue Nov 28, 2006 4:29 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timeout in DS
Replies: 13
Views: 2670

In the DRS stage where you have the generated SQL, look at the tab before it; it will say "Before". You can add a truncate statement there.
by DSguru2B
Tue Nov 28, 2006 4:17 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timeout in DS
Replies: 13
Views: 2670

Delete is a logged activity. That means if you have x rows, you get x entries in the database log, each of which takes its own time, which in turn adds time to your process. Simply do a truncate table in the before-SQL tab and insert without clearing. That will help.
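Here is a tiny Python/sqlite3 sketch of the "clear in before-SQL, then insert without clearing" pattern. SQLite has no TRUNCATE, so an unqualified DELETE stands in for it here (on databases that do support it, the before-SQL would be a TRUNCATE TABLE statement, which skips the per-row logging); the table name is made up:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target(id INTEGER)")
con.execute("INSERT INTO target VALUES (1), (2)")   # stale rows from last run

# The "before SQL" step: clear the table in one statement
# instead of deleting (and logging) row by row.
con.execute("DELETE FROM target")

# Then the job just inserts, with no clearing option on the load.
con.executemany("INSERT INTO target VALUES (?)", [(10,), (20,)])
print(con.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2
```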