Search found 15603 matches

by ArndW
Mon Mar 30, 2009 12:53 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: File system getting full
Replies: 5
Views: 933

Which file(s), in which location(s), are being created and not deleted, causing the file system(s) to fill up?
by ArndW
Mon Mar 30, 2009 12:46 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Abort After Rows - Write to Sequential File Not working
Replies: 7
Views: 4706

Could you put a transform stage between your "real" transform and the output sequential file? How many rows does the log show went through that transform stage with a setting of "0" and of "1"?
by ArndW
Mon Mar 30, 2009 12:39 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Flat file error
Replies: 13
Views: 3074

Try opening a DOS window and using the "cd {yourpath}" command - does that work? Are you using variables or parameters for your paths, or are they hardcoded? Have you tried using View Data in the Designer?
by ArndW
Mon Mar 30, 2009 12:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transformer
Replies: 11
Views: 1994

The Transformer stage cannot be used for your intended purpose, because there is no way to detect that the last row is being read. As mentioned before, an Aggregator stage will do that.
by ArndW
Fri Mar 27, 2009 7:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: UtilityHashLookup Transform Function
Replies: 5
Views: 1841

Are you certain that the hashed file exists in the new Project? What parameters are you passing to the routine?
by ArndW
Mon Mar 23, 2009 6:26 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Dsx export discrepancies
Replies: 4
Views: 1553

Did you install the same optional components on both systems? It looks like defaulted empty values are being added to one export file but not the other; it shouldn't make a difference in execution.
by ArndW
Tue Mar 17, 2009 11:19 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Clone static hashed file
Replies: 4
Views: 1197

The only problem could be indices, if you have them on the files.
by ArndW
Sun Mar 15, 2009 6:00 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: add_to_heap error when reading from system-cache hashfile
Replies: 1
Views: 768

The CATALOG command puts routines in the system catalog, not hashed files. Or do you mean that you are using the FILE.CACHE commands?

The heap is a temporary space allocation, so the disk can quickly fill up and then empty again, and you might have missed it.
by ArndW
Sun Mar 15, 2009 5:52 am
Forum: General
Topic: Error while saving Parallel jobs & Compiling
Replies: 4
Views: 2508

Have you tried doing a Save-As to another name? If that works, delete the original job and then rename your copy. At V8 it is a bit more difficult to recover from this type of problem than in previous versions, since you also have the repository to deal with and the error could be in either place. Do you ...
by ArndW
Fri Mar 13, 2009 7:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Accessing MSAccess through DataStage
Replies: 3
Views: 1179

Yes, I've been successful using DataDirect drivers to a V7 DataStage on Solaris.
by ArndW
Fri Mar 13, 2009 7:07 pm
Forum: General
Topic: DataStage 8.0 dsjob command error
Replies: 11
Views: 11986

Don't you still need the "-mode NORMAL"?
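For context, a minimal sketch of a dsjob invocation that includes the -mode flag; the project and job names here are hypothetical placeholders, and the command is only echoed since dsjob requires a DataStage installation:

```shell
# Hypothetical project and job names -- substitute your own.
# -mode NORMAL is the standard run mode (RESET and VALIDATE also exist);
# -jobstatus makes dsjob wait for the job and return its exit status.
CMD="dsjob -run -mode NORMAL -jobstatus MyProject MyJob"
echo "$CMD"
```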
by ArndW
Fri Mar 13, 2009 2:06 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How can we check all the node are working or not ?
Replies: 4
Views: 1766

Ernie - I know that there is, but I seem to recall it is different on different versions of Linux as well. Scope/krishna - The "fork failed" error is certainly caused by a maxuproc setting that is too low, but a value of only 100 is far too small for PX to work; even a medium-sized job with several nodes ...
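As a rough sketch of how you might check the limit in question: the AIX commands are shown as comments since they need an AIX host (and root to change the value), while on Linux the shell's per-user process limit is visible directly:

```shell
# On AIX, maxuproc is a kernel attribute on sys0:
#   lsattr -El sys0 -a maxuproc          # view the current value
#   chdev -l sys0 -a maxuproc=2048       # raise it (needs root)
# On Linux, show the current per-user process limit for this shell:
limit=$(ulimit -u)
echo "max user processes: $limit"
```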