Search found 53125 matches

by ray.wurlod
Fri Jun 15, 2007 3:42 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: if then else issue
Replies: 18
Views: 3947

Trim() will get you to a single space. Convert() can get to "".
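A minimal sketch of the combination in a derivation, assuming an input column In.Col (the column name is hypothetical):

```
* Trim() collapses runs of spaces and strips leading/trailing ones;
* Convert() then removes any remaining space character entirely.
Convert(" ", "", Trim(In.Col))
```

An all-space input comes out of Trim() as a single space, which Convert() reduces to "".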
by ray.wurlod
Fri Jun 15, 2007 3:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: sample stage output problem
Replies: 11
Views: 2628

Wrong. You expect every 10th row.
by ray.wurlod
Fri Jun 15, 2007 3:29 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Trim() Function
Replies: 7
Views: 2383

This really calls for a separate thread, on the server forum. Investigate the Compare() function. Decide your business rule for casing in the output and generate that.
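As a hedged sketch of what Compare() offers (column names are assumptions): it returns -1, 0 or 1 for less-than, equal and greater-than, with a justification argument of "L" or "R".

```
* Case-sensitive comparison of two columns; 0 means equal.
If Compare(In.ColA, In.ColB, "L") = 0 Then "MATCH" Else "DIFFER"
```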
by ray.wurlod
Fri Jun 15, 2007 3:21 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Compile a Job
Replies: 3
Views: 1168

View Generated OSH must be enabled in Administrator.
by ray.wurlod
Fri Jun 15, 2007 3:18 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: 64bit hased file bad performance
Replies: 22
Views: 4848

How much benefit is there to using a Distributed Hashed File in this situation? I've got a similar issue where multiple copies of a particular hashed file are being created by a multi-instance (MI) job that applies Mod() to the input stream and distributes the result across the X hashed files. However, the final job that cons...
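As a hypothetical sketch of the distribution described above, the usual approach is a Mod() on a numeric key in a Transformer derivation or constraint (the column name and the count of 4 are assumptions):

```
* Route each row to one of X hashed files (here X = 4).
* Mod() returns 0..3, so adding 1 yields a file number of 1..4.
Mod(In.KeyCol, 4) + 1
```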
by ray.wurlod
Fri Jun 15, 2007 3:13 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Abnormal termination of stage
Replies: 7
Views: 5764

DS.CHECKER checks for orphaned DataStage repository objects. However, in your case it did not complete correctly, apparently because an earlier run had been interrupted.

You will need to delete all references to DS_JOBS.cleanup (including the VOC entry if any) before proceeding with DS.CHECKER.
by ray.wurlod
Fri Jun 15, 2007 3:08 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Creating index
Replies: 4
Views: 1301

That's not corrupted. That's because you have imported only the executable component, not the design component.

Indexing/reindexing will not help.
by ray.wurlod
Fri Jun 15, 2007 3:06 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: if then else problem
Replies: 4
Views: 1085

... the ever-hasty guruji ...
by ray.wurlod
Fri Jun 15, 2007 3:06 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: An Attempt to deploy a project using a shell script
Replies: 8
Views: 3118

:idea: A company like Peoplesoft would have exactly such a requirement.
by ray.wurlod
Fri Jun 15, 2007 3:03 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Multiple Instance Job
Replies: 4
Views: 1687

As everyone knows, a DataStage server machine is capable of delivering an infinite amount of CPU, memory and disk resources on demand. Therefore you can run as many instances as you wish. In practice, however, you may find my opening assertion to be somewhat short of the mark, so pay attention to th...
by ray.wurlod
Fri Jun 15, 2007 3:00 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: if then else problem
Replies: 7
Views: 1814

Is this in a routine or in a derivation expression? Your answer to that will affect what is the correct answer to your question.
:roll:
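The distinction matters because the two contexts use different forms. A hedged sketch (argument and column names are hypothetical):

```
* In a derivation, If/Then/Else is a single expression that
* yields a value:
If In.Qty > 0 Then "IN STOCK" Else "OUT OF STOCK"

* In a routine, If/Then/Else is a statement; assign the result
* to the Ans variable:
If Arg1 > 0 Then Ans = "IN STOCK" Else Ans = "OUT OF STOCK"
```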
by ray.wurlod
Fri Jun 15, 2007 2:59 pm
Forum: General
Topic: Routines - When must they be recompiled?
Replies: 19
Views: 5740

Welcome aboard. Please give examples. Recompiling a Routine overwrites the old object code, so the only way it could be executed is by a process that was already using it when the Routine was recompiled. You only need to recompile a Routine if its source code is changed. You should only recompile a ...
by ray.wurlod
Fri Jun 15, 2007 2:56 pm
Forum: General
Topic: Caught ORCHESTRATE
Replies: 2
Views: 1117

Or simply inspect the record schema. Open the table definition record in the Repository, choose the Layout tab, and select the Parallel option.
by ray.wurlod
Fri Jun 15, 2007 2:55 pm
Forum: General
Topic: Problem reading EBCDIC file using CFF stage
Replies: 1
Views: 907

Read Chapter 10 of the Parallel Job Developer's Guide, which asserts that OCCURS DEPENDING ON is supported.

To manage the varying data type, you may need to create a custom, hybrid FD with appropriate REDEFINES, rather than trying to shoehorn four different copybooks into one stage.
by ray.wurlod
Fri Jun 15, 2007 2:51 pm
Forum: General
Topic: A Generic/Base job to handle multiple file types
Replies: 1
Views: 751

Use 25 separate jobs appropriate to the 25 different sets of metadata (record schemas). It's more maintainable in the long run.