Search found 15603 matches

by ArndW
Mon Jul 30, 2007 2:06 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Target Load strategy
Replies: 12
Views: 5160

Satish - I wouldn't say that your approach is wrong, but it might be over-engineered for what the original poster is looking for. We will need to hear back from them to see what they actually wish to do.
by ArndW
Mon Jul 30, 2007 2:04 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Use of Union or Link Collector
Replies: 2
Views: 990

I think it might be faster doing 3 concurrent SELECTs and then joining the 3 streams in a link collector stage. The answer isn't definitive as it depends upon your DB configuration as well as CPU and IO specifics (also whether or not the DB is on the same machine as your DataStage server). Nominally...
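The idea of three concurrent SELECTs feeding one collector can be sketched outside DataStage; a minimal Python analogy, where the hypothetical `fetch_partition` stands in for one SELECT and the final list comprehension plays the role of the link collector stage:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_partition(part):
    # Stand-in for one SELECT; in the real job this is a database stage.
    return [(part, row) for row in range(3)]

# Run the three "SELECTs" concurrently, mirroring three source stages...
with ThreadPoolExecutor(max_workers=3) as pool:
    streams = list(pool.map(fetch_partition, [1, 2, 3]))

# ...then merge the streams into a single output, like a Link Collector.
collected = [row for stream in streams for row in stream]
```

Whether the concurrent version actually wins still depends on the database and I/O characteristics described above; this only illustrates the shape of the job.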
by ArndW
Mon Jul 30, 2007 12:11 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Config Files
Replies: 1
Views: 1174

The APT_CONFIG_FILE and its contents are central to PX. It is documented both in the Parallel Job Programmer's Guide and in the Parallel Job Advanced Programmer's Guide. I know of no environment variable called "APT_ORACLE_NO_OPS"; do you perhaps mean "APT_ORACLE_LOAD_OPTIONS"?
by ArndW
Sun Jul 29, 2007 5:21 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job control process (pid 6268) has failed
Replies: 2
Views: 1560

kasgang - that is not sufficient information; it is like saying "sometimes my car won't start - how do I fix it?". The pid number doesn't mean anything on its own, so we will need additional data about the job and the aborts.
by ArndW
Sat Jul 28, 2007 10:01 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Target Load strategy
Replies: 12
Views: 5160

If you are doing a normal insert (versus a load) into your database, you can just stick with a simple INSERT and not look for errors (which would happen if you attempted to insert into an existing key); this is probably easier than doing a lookup - although it depends upon the relative percenta...
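The "insert and ignore the duplicate-key error" idea can be sketched in Python; here a dict plays the part of the target table, with its keys acting as the primary key (an analogy, not DataStage or database code):

```python
# Hypothetical target table: dict keys stand in for the primary key.
target = {1: "a", 2: "b"}

def insert_ignoring_duplicates(key, value, table):
    """Attempt the insert; treat a duplicate-key failure as a no-op."""
    if key in table:          # the database would raise an error here...
        return False          # ...which we deliberately ignore
    table[key] = value
    return True

inserted = [insert_ignoring_duplicates(k, "new", target) for k in (2, 3, 4)]
```

The lookup-based alternative would check `key in table` in a separate stage first; as noted above, which one is cheaper depends on the proportion of rows that already exist.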
by ArndW
Sat Jul 28, 2007 7:51 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: unable to trim
Replies: 13
Views: 2956

This is the first mention of a varchar RHS column. If you explicitly TRIM that column and output it to a peek stage, are the spaces gone? And then they re-appear when writing to your database? What database are you using?
by ArndW
Sat Jul 28, 2007 3:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Regarding unzip in WinNT
Replies: 24
Views: 8223

Ray - I wouldn't bother trying to suggest using FTP, for some reason he has ignored all previous posts in this thread suggesting that approach.
by ArndW
Sat Jul 28, 2007 3:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Regarding unzip in WinNT
Replies: 24
Views: 8223

If you use a varying length datatype make it bigger, since you aren't guaranteed that gzip/compress will encode a <cr> within 2000 bytes (although it is likely). I used a fixed length field such as CHAR(128) for my compressed FTP stage and ignored the error on the last row read in. You can also use ...
by ArndW
Sat Jul 28, 2007 12:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: unable to trim
Replies: 13
Views: 2956

You cannot TRIM a char field. It has fixed width and will always blank pad to the fixed length. You need to declare a VARCHAR column datatype and then TRIM your original CHAR column into that.
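The blank-padding behaviour can be mimicked in Python; here a CHAR(10) column is simulated with `str.ljust`, which is only an analogy for how a fixed-width field always pads back out to its declared length:

```python
CHAR_WIDTH = 10

def as_char(value, width=CHAR_WIDTH):
    """A CHAR(n)-style value: always blank-padded to the fixed width."""
    return value.ljust(width)

char_value = as_char("ABC")
# The trailing blanks are part of the CHAR value itself, so trimming
# before re-storing into the same CHAR column achieves nothing.

# TRIM into a VARCHAR-style (variable-length) value instead:
varchar_value = char_value.rstrip()
```

This is why the TRIM must land in a VARCHAR column: only a variable-length type can actually hold the shortened value.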
by ArndW
Sat Jul 28, 2007 12:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Regarding unzip in WinNT
Replies: 24
Views: 8223

Are you still attempting to FTP the file using DataStage? What have you defined as your FTP column(s) in the FTP stage?
by ArndW
Fri Jul 27, 2007 9:06 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: unable to trim
Replies: 13
Views: 2956

What is the datatype from your dataset schema? Add a 2nd output link from your transform going into a peek stage with your TRIM() string, add a column of LEN(trimmed-string), and see if the results are as expected.
by ArndW
Fri Jul 27, 2007 8:53 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: unable to trim
Replies: 13
Views: 2956

The trim function will always remove spaces from strings, so either you have characters other than spaces, or you don't actually have strings - note, you cannot remove trailing spaces from a CHAR field, just from VARCHAR. What are your datatypes in the job and the table? Are you doing a TRIM() on CHAR fi...
by ArndW
Fri Jul 27, 2007 7:10 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Build stage doubt
Replies: 1
Views: 570

I use shared containers with RCP that accept all sorts of different column contents and counts in different jobs, acting only on the columns used in that container. Usually I will pass in the names of the columns as a parameter and then create/remove temporary work columns in modify stages in the shared...
by ArndW
Fri Jul 27, 2007 4:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ORACLE Load vs Upsert
Replies: 1
Views: 828

The throughput of both options depends a lot on your database and table configuration and using LOAD vs. a more traditional UPSERT is not always the best solution. With a partitioned database you can get real parallel loading to separate partitions if you code your job correctly and that can signifi...
by ArndW
Fri Jul 27, 2007 4:15 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Key Expression in transformer for Look Up
Replies: 7
Views: 2405

swades - you haven't stated whether your lookup retrieves data from a hashed file or a database. As mentioned before, if it is a database then you can put your OR expression into the SQL lookup. If you are referencing a hashed file then you need to do 2 lookups and then perform the OR logic i...
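The two-lookup-plus-OR pattern can be sketched in Python, with dicts standing in for the hashed files and a plain function standing in for the transformer derivation (names here are illustrative, not from the original post):

```python
# Two hypothetical hashed files, modelled as dicts keyed on the lookup key.
hash_file_a = {"K1": "rowA"}
hash_file_b = {"K2": "rowB"}

def lookup_with_or(key):
    """Perform both reference lookups, then OR the results together."""
    hit_a = hash_file_a.get(key)   # first lookup link
    hit_b = hash_file_b.get(key)   # second lookup link
    # A row "matches" if either lookup found it; prefer the first hit.
    return hit_a if hit_a is not None else hit_b

matched = [lookup_with_or(k) for k in ("K1", "K2", "K3")]
```

In the job itself this means two reference links into the transformer and the OR expressed in the output derivation or constraint.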