Search found 53125 matches

by ray.wurlod
Mon Sep 19, 2005 1:11 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job aborting using Link Collector
Replies: 7
Views: 1287

And the executing user is dstage or a member of the oasadmin group?

(Have to ask. Even though the same process ought to create and read from the file, this might not be the case if, for example, you're forcing inter-process row buffering.)
by ray.wurlod
Mon Sep 19, 2005 1:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Remove duplicate stage behaviour
Replies: 9
Views: 5534

There is a sort performed on the input link of the Remove Duplicates stage (unless the data are already sorted appropriately), because sorted input means the stage consumes far less memory. As soon as any of the key columns changes value in sorted data, it is known that that val...
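A toy Python sketch (not DataStage code) of why sorted input needs so little memory: the stage only has to remember the previous key, not every key it has ever seen.

```python
# De-duplicating a stream already sorted on the key: once the key changes,
# earlier key values cannot recur, so one "previous key" is all the state needed.
def dedupe_sorted(rows, key):
    prev = object()  # sentinel that matches no real key
    for row in rows:
        k = key(row)
        if k != prev:
            yield row  # first row of a new key group
        prev = k

rows = [("A", 1), ("A", 2), ("B", 3), ("C", 4), ("C", 5)]
print(list(dedupe_sorted(rows, key=lambda r: r[0])))
# [('A', 1), ('B', 3), ('C', 4)]
```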
by ray.wurlod
Mon Sep 19, 2005 1:02 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Orchestrate script calling issue
Replies: 12
Views: 4539

Doubt it. It looks like recursive calls to osh (one being your job, the other being the invocation from a Wrapper stage) aren't permitted. But knowing that will probably be valuable one day. What happens - even though you'd need an intermediate persistent data set - if you fork a separate, background osh ...
by ray.wurlod
Sun Sep 18, 2005 8:19 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job aborting using Link Collector
Replies: 7
Views: 1287

Somehow Solaris has reported that either the file /tmp/newcisprd.P423LoadCOCDSContCompCorr00..CTransformerStage98-Transform.AcctPerOut or one of the directories in its pathname (here only /tmp) could not be found. This is usually because it hasn't been created in the first place, which is usual...
by ray.wurlod
Sun Sep 18, 2005 7:38 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Load date into SQL Server - datetime datatype)
Replies: 10
Views: 13061

Code:

Oconv(InLink.Transaction_Date, "D-YMD[4,2,2]") : " 00:00:00.000"
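For readers less familiar with DataStage BASIC, a rough Python equivalent of that expression (assuming Transaction_Date holds a date value) might look like this; the Oconv format "D-YMD[4,2,2]" yields a dash-delimited year-month-day string, to which a zero time component is appended for SQL Server's datetime type.

```python
from datetime import date

def to_sqlserver_datetime(d: date) -> str:
    # Analogue of: Oconv(InLink.Transaction_Date, "D-YMD[4,2,2]") : " 00:00:00.000"
    return d.strftime("%Y-%m-%d") + " 00:00:00.000"

print(to_sqlserver_datetime(date(2005, 9, 18)))
# 2005-09-18 00:00:00.000
```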
by ray.wurlod
Sun Sep 18, 2005 7:36 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job aborting using Link Collector
Replies: 7
Views: 1287

Is the file system on which /tmp is mounted full? You can see that the name of the file being used in /tmp is derived from the job name, stage name and link name. Does this create too long a name for your particular UNIX? I am assuming in all of this that you have write permission to ...
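The two checks suggested above (free space on /tmp and the maximum filename length the filesystem permits) can be sketched in Python on a POSIX system; the numbers printed are examples of what to compare your derived filename against.

```python
import os

# Free space on the filesystem holding /tmp, in bytes.
st = os.statvfs("/tmp")
free_bytes = st.f_bavail * st.f_frsize

# Longest single filename component the filesystem allows (often 255).
name_max = os.pathconf("/tmp", "PC_NAME_MAX")

print("free bytes on /tmp:", free_bytes)
print("max filename length:", name_max)
```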
by ray.wurlod
Sun Sep 18, 2005 4:37 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: LongVarChar source and DS job fails to recognize it!!!
Replies: 15
Views: 5510

If you can tolerate a data type mismatch warning, try VarChar with precision up to 65536.
by ray.wurlod
Sun Sep 18, 2005 4:35 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Scheduling a job in intervals of X minutes using director
Replies: 4
Views: 1398

The "scheduler" window in Director isn't a scheduler in its own right - it's merely an interface to the operating system scheduler (cron or at) and is limited by the lowest common set of functionality of those schedulers.
by ray.wurlod
Sun Sep 18, 2005 4:32 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Hash File Calculator - Record Size incl or excl Record Id?
Replies: 1
Views: 817

Welcome aboard! :D You should supply the total logical size of the record (key and data) to HFC. The tool will calculate the physical storage overheads, which are:
- three 32-bit or 64-bit pointers
- one character between key and data
- padding to a whole 32-bit or 64-bit boundary
The key and data sizes yo...
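A sketch of that arithmetic in Python - this is only an illustration of the overheads listed above, not HFC's exact algorithm, and it assumes a 4-byte (32-bit) or 8-byte (64-bit) word size:

```python
def physical_record_size(key_size: int, data_size: int, word: int = 4) -> int:
    """Estimate physical hashed-file record size from logical key/data sizes."""
    size = 3 * word          # three 32-bit or 64-bit pointers
    size += key_size         # the key itself
    size += 1                # one separator character between key and data
    size += data_size        # the data portion
    if size % word:          # pad to a whole 32-bit or 64-bit boundary
        size += word - (size % word)
    return size

print(physical_record_size(key_size=10, data_size=32))   # 56
print(physical_record_size(key_size=10, data_size=85))   # 108
```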
by ray.wurlod
Sun Sep 18, 2005 4:27 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: spilt the record into three records
Replies: 13
Views: 3336

Don't forget that the input will need to be partitioned and sorted on the key column(s) for this technique to work properly.
by ray.wurlod
Sun Sep 18, 2005 4:25 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture job status in Job sequencer
Replies: 4
Views: 1325

You can explicitly handle the result of a job through an additional trigger from the Job Activity. If you don't, you can include an Exception Handler that will fire if any job aborts.
by ray.wurlod
Sat Sep 17, 2005 6:28 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: spilt the record into three records
Replies: 13
Views: 3336

Welcome aboard! :D What you are describing here is called a "horizontal pivot" and can be performed by a Pivot stage (its stage type for parallel jobs is PivotPX). Unfortunately you will not find this stage type in the DataStage manuals or in on-line help. It seems to have been constructed as a Buil...
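The shape of a horizontal pivot can be sketched in Python (this is only an illustration of the transformation, not the PivotPX stage itself; the column names are made up for the example): one input row carrying several value columns becomes one output row per value.

```python
# One row with n pivot columns -> n rows, each keeping the key columns
# and carrying a single value.
def horizontal_pivot(row, key_cols, pivot_cols):
    base = {c: row[c] for c in key_cols}
    for col in pivot_cols:
        yield {**base, "value": row[col]}

row = {"id": 1, "q1": 10, "q2": 20, "q3": 30}
print(list(horizontal_pivot(row, ["id"], ["q1", "q2", "q3"])))
# [{'id': 1, 'value': 10}, {'id': 1, 'value': 20}, {'id': 1, 'value': 30}]
```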
by ray.wurlod
Sat Sep 17, 2005 6:10 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capture job status in Job sequencer
Replies: 4
Views: 1325

The exit status of a job is captured downstream of that Job Activity (that is, you must have a path of links from the Job Activity to where you are proposing to use the job status). It is available in expressions as the "activity variable" JobActivityName.$JobStatus which can be chosen from the expr...
by ray.wurlod
Sat Sep 17, 2005 6:06 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Remove duplicate stage behaviour
Replies: 9
Views: 5534

Clearly what is happening is that the different partitioning algorithm is able to direct duplicate rows to different processing nodes. When you specify "same", you use the (non-partitioned) sequential processing method - I'm making some assumptions about your Sequential File format here of course. W...
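A toy Python sketch (not DataStage internals) of why the choice of partitioning algorithm matters for Remove Duplicates: hash partitioning on the key guarantees all rows with the same key land on one node, while round-robin can scatter duplicates across nodes, where each node then sees them as unique.

```python
# Hash partitioning: same key -> same partition, so duplicates stay together.
def hash_partition(rows, key, nodes):
    parts = [[] for _ in range(nodes)]
    for row in rows:
        parts[hash(key(row)) % nodes].append(row)
    return parts

# Round-robin: rows are dealt out in turn, regardless of key.
def round_robin(rows, nodes):
    parts = [[] for _ in range(nodes)]
    for i, row in enumerate(rows):
        parts[i % nodes].append(row)
    return parts

rows = ["A", "A", "B", "B"]
# Both "A" rows land in the same hash partition (whichever one that is).
print(hash_partition(rows, key=lambda r: r, nodes=2))
# Round-robin splits the duplicate "A"s across the two partitions.
print(round_robin(rows, nodes=2))
# [['A', 'B'], ['A', 'B']]
```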
by ray.wurlod
Fri Sep 16, 2005 4:34 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Sequencer aborting when it calls a routine.
Replies: 7
Views: 1806

Code -4 means that the value you supplied (presumably the return value from the routine) is not appropriate for the parameter type when requesting the run of a job.

For example, "" is not appropriate for the Integer, Date, Time or Pathname parameter types.