Search found 42189 matches

by chulett
Mon Aug 01, 2011 3:45 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: join_Hash_Value,3: Write to dataset on [fd 13] failed????
Replies: 3
Views: 1326

I'm thinking "hashed field" rather than "hashed file" and that they are referencing the partitioning.
by chulett
Mon Aug 01, 2011 2:06 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Extracting Long Month Name from Date
Replies: 24
Views: 12124

Post some examples of what your incoming date values look like.
by chulett
Mon Aug 01, 2011 10:08 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to export multiple jobs via command line
Replies: 1
Views: 1744

One at a time or the entire project, those seem to be our only choices.
by chulett
Mon Aug 01, 2011 9:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: The call to sqlldr failed; the return code = 2;
Replies: 4
Views: 6528

That or make sure you filter out any "duplicates" first.
by chulett
Mon Aug 01, 2011 6:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: The call to sqlldr failed; the return code = 2;
Replies: 4
Views: 6528

You can't use sqlldr to insert duplicate records; as you've found, they must all be "new" for the bulk loader to work properly.
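The "filter out any duplicates first" advice from the earlier reply could be sketched as a pre-load step like the following. This is a generic Python illustration, not DataStage code; the column names and key fields are hypothetical.

```python
def dedupe_rows(rows, key_fields):
    """Keep only the first occurrence of each key value; later duplicates
    are dropped so the bulk loader never sees a repeated key."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

# Hypothetical input rows destined for the sqlldr data file.
rows = [
    {"id": "1", "name": "a"},
    {"id": "1", "name": "b"},  # duplicate key -- would make the load fail
    {"id": "2", "name": "c"},
]
print(dedupe_rows(rows, ["id"]))  # -> [{'id': '1', 'name': 'a'}, {'id': '2', 'name': 'c'}]
```

Only the deduplicated rows would then be written out and handed to sqlldr.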
by chulett
Mon Aug 01, 2011 6:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: string to date
Replies: 8
Views: 2443

Hmmm... DATE datatypes don't have formats, people. :?
by chulett
Mon Aug 01, 2011 6:23 am
Forum: General
Topic: DataStage Client 7.5.3 will support DataStage server 7.5.1
Replies: 2
Views: 1555

Check with your official support provider, but I believe any 7.5.x can talk to any 7.5.x. I used to believe the same thing. 'Talk' is pretty generic and yes, they can talk, but I was in the exact same situation back in the day and had... issues. When I called support, the answer I got surprised me - ...
by chulett
Sun Jul 31, 2011 10:49 pm
Forum: General
Topic: cannot find entries in .odbc.ini file to access oracle DB
Replies: 3
Views: 1752

Entries don't typically need to carry the db/user/password information, as that's provided at runtime to any job that uses them via job parameters. And native stages don't use ODBC.
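A minimal .odbc.ini entry along the lines described above might look like this. Every name, path, and host here is a placeholder; the actual keys depend on the ODBC driver in use, and credentials are deliberately absent because the job supplies them at runtime.

```ini
; Hypothetical .odbc.ini entry -- no UID/PWD keys here; the DataStage job
; passes database, user, and password at runtime via job parameters.
[ORA_SRC]
Driver=/path/to/oracle_odbc_driver.so
Description=Oracle source (illustrative entry)
HostName=dbhost.example.com
PortNumber=1521
SID=ORCL
```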
by chulett
Sat Jul 30, 2011 9:35 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: converting date
Replies: 13
Views: 4848

As noted. :wink:
by chulett
Sat Jul 30, 2011 9:26 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: XML Stage Abnormal termination of stage transformer_name de
Replies: 9
Views: 4305

So, in other words... there wasn't one? The log message should specifically say "From previous run..." and will add additional diagnostic information; otherwise we're all in the dark and pretty much just guessing.
by chulett
Sat Jul 30, 2011 9:17 pm
Forum: General
Topic: Need to pass a parameter across all jobs
Replies: 9
Views: 12116

No, I've used it back on a 7.5.1a system, so I'd say in the 7.5 release.
by chulett
Sat Jul 30, 2011 2:00 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: XML Stage Abnormal termination of stage transformer_name de
Replies: 9
Views: 4305

<Begin Standard Message>

Reset the aborted job and let us know the contents of a "From previous run..." log entry if one shows up when you reset it.

<End Standard Message>
by chulett
Sat Jul 30, 2011 10:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DataStage variables
Replies: 4
Views: 5717

"User variables" would need to be passed in as job parameters. You can't update job parameters, rather you use them in other derivations who's values can change.
by chulett
Sat Jul 30, 2011 10:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: converting date
Replies: 13
Views: 4848

No, it can't work with nulls, so as Ravi posted you need to handle those first.
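The handle-nulls-first pattern described above can be sketched in generic Python; this is an analogue of a string-to-date conversion, not actual DataStage transformer code, and the format string is an assumption.

```python
from datetime import datetime

def to_date(value, fmt="%Y-%m-%d", default=None):
    """Guard against nulls first: a conversion function applied to a
    None/empty value would fail, so substitute a default instead."""
    if value is None or value.strip() == "":
        return default
    return datetime.strptime(value, fmt).date()

print(to_date("2011-08-01"))  # -> 2011-08-01 as a date object
print(to_date(None))          # -> None, conversion never attempted
```

The same idea applies in a DataStage derivation: test for null first and only convert when the value is populated.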