Search found 15603 matches

by ArndW
Tue Apr 22, 2008 10:49 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Execution Order
Replies: 5
Views: 1302

You can only control the order by the way you design the job, as DataStage will try to execute everything concurrently. If stages have dependencies then those will determine the execution order, i.e. if you create a dataset and use that as a lookup then that branch will have to complete before the fi...
by ArndW
Tue Apr 22, 2008 7:20 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Interpreting Record format
Replies: 12
Views: 3244

Vino_joe84 wrote:It shows the warning.
You've mentioned this twice now, but I can't find a reference to what the warning actually is.
by ArndW
Tue Apr 22, 2008 7:14 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Capture DS Log Info in columns
Replies: 1
Views: 784

The relevant BASIC calls that you will need to read up on to do this are:

DSAttachJob()
DSGetJobInfo()
DSGetLinkInfo()
DSGetStageInfo()
DSDetachJob()
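
A minimal sketch of how those calls fit together in a server routine (the job, stage and link names below are placeholders, and error handling is omitted for brevity):

Code: Select all

$INCLUDE DSINCLUDE JOBCONTROL.H
* Attach read-only, query job/stage/link details, then detach
JobHandle = DSAttachJob('MyJob', DSJ.ERRNONE)
JobStatus = DSGetJobInfo(JobHandle, DSJ.JOBSTATUS)
RowCount  = DSGetLinkInfo(JobHandle, 'MyStage', 'MyLink', DSJ.LINKROWCOUNT)
StageType = DSGetStageInfo(JobHandle, 'MyStage', DSJ.STAGETYPE)
ErrCode   = DSDetachJob(JobHandle)

From there you can assemble the returned values into a delimited string and write it out to your columns.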
by ArndW
Tue Apr 22, 2008 7:10 am
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: How to pass the log event details to a file?
Replies: 2
Views: 904

You could do it directly with

Code: Select all

* Open (truncating any existing contents), write the summary, then close
OPENSEQ '/path/to/file.txt' TO OutFilePtr THEN WEOFSEQ OutFilePtr ELSE NULL
WRITESEQ SummaryArray TO OutFilePtr ELSE CALL DSLogFatal('Bad write','')
CLOSESEQ OutFilePtr
by ArndW
Tue Apr 22, 2008 7:08 am
Forum: General
Topic: Datastage Transform
Replies: 6
Views: 4060

You have specified "Parallel" but wish to write a function [not a transform] in BASIC - that would be for a Server job. Which is it? The BASIC function would be straightforward, using a DSExecute() call to the UNIX call of your choice to see if the file exists, then parsing the result and returning a...
by ArndW
Mon Apr 21, 2008 1:42 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Warning:Dataset has been truncated
Replies: 3
Views: 744

It would seem that one of the UNIX data files that make up your dataset has been deleted. With the dataset administration tool check the files that should be there and determine if one of them is in fact missing.
by ArndW
Fri Apr 18, 2008 6:55 am
Forum: General
Topic: Dynamically renaming variables
Replies: 8
Views: 2481

I think you might be able to leverage RCP and shared containers to do this. Declare your shared container with no columns and pass the original column name and new name as parameters to this container. Then use a modify stage to rename the column dynamically; repeat this call as often as necessary to...
by ArndW
Sat Apr 12, 2008 11:53 am
Forum: Site/Forum
Topic: How did you learn DataStage?
Replies: 34
Views: 29120

Version 0.0 Beta "Tiger Team" at VMark for me - they threw me at a customer before anyone thought of even planning to write the documentation.
by ArndW
Sat Apr 12, 2008 7:57 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: before job routine & after job routine
Replies: 3
Views: 4136

A before-job routine gets executed before a job starts and an after-job routine gets executed after a job successfully completes. If you have actions that need to be performed at these times then you can code your routines to do this.
by ArndW
Sun Apr 06, 2008 11:11 am
Forum: General
Topic: RE: Helper Subroutines
Replies: 14
Views: 3818

I'm afraid I'm not able to understand the first question, but the 2nd one regarding decompiling programs is easy to answer. Yes, the programs could be decompiled. There is only one UV-based commercial decompiler as far as I know, plus a couple of self-written ones. The compiled pseudo-code for some ...
by ArndW
Sun Apr 06, 2008 8:50 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Running Zipped File in DataStage
Replies: 7
Views: 7181

Craig - actually you can read & write compressed files by using the filter option on sequential files. When reading a gzipped file use the filter "gunzip -c" and when writing use "gzip -c". This can actually result in faster throughput for some jobs - particularly when there is excess CPU capaci...
by ArndW
Sat Apr 05, 2008 10:57 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Premature EOF on node
Replies: 4
Views: 3009

Using the load option you will also have UNIX log files from oracle. There are probably other warnings and errors in addition to the one you posted, and the Director log will tell you which directory was used for the Oracle log files.
by ArndW
Sat Apr 05, 2008 5:24 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 error
Replies: 4
Views: 876

Ramesh - you've marked this as having a "workaround" - does that mean you found the problem?
by ArndW
Sat Apr 05, 2008 5:04 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performamnce of compilation
Replies: 1
Views: 526

Try compiling a job with no transform stage to get a baseline compile time. Those numbers are absolutely atrocious and I couldn't work on a system like that. At my current site we have a big job that cannot be split for technical reasons and takes 10-15 minutes to compile, but that is the worst I...
by ArndW
Sat Apr 05, 2008 4:44 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Premature EOF on node
Replies: 4
Views: 3009

What Oracle load method are you using?