Search found 15603 matches

by ArndW
Thu Jul 26, 2007 9:20 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Regarding unzip in WinNT
Replies: 24
Views: 8223

gunzip will correctly decompress the file on Windows if you have it installed. You are most likely seeing an issue because the UNIX line terminator is usually <LF> while the Windows default is <CR><LF>. If that isn't your problem, perhaps you could explain why you expect different numbers of rows.
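The terminator difference can be shown in a few lines; this is an illustrative sketch (the sample rows are made up, not from the original file):

```python
# A row reader sees different results depending on the line terminator.
unix_data = b"row1\nrow2\nrow3\n"           # UNIX: <LF> only
windows_data = b"row1\r\nrow2\r\nrow3\r\n"  # Windows: <CR><LF>

# Splitting on <LF> gives three rows either way, but the Windows rows
# keep a trailing <CR> that a UNIX-style reader treats as data.
unix_rows = unix_data.split(b"\n")[:-1]
windows_rows = windows_data.split(b"\n")[:-1]
print(unix_rows[0])     # b'row1'
print(windows_rows[0])  # b'row1\r'  <- stray carriage return
```

That stray carriage return is a common cause of rows failing to match or counts appearing wrong after moving files between platforms.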
by ArndW
Thu Jul 26, 2007 9:01 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: conversion from bigint to string
Replies: 3
Views: 2318

That is a conversion that will happen implicitly in PX jobs, but you can also convert explicitly using the DecimalToString() function.
by ArndW
Thu Jul 26, 2007 8:51 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Key Expression in transformer for Look Up
Replies: 7
Views: 2405

The "OR" condition is not supported in the key expression. You will need to do both lookups and then do the OR logic in the derivations of the output. Unfortunately conditional lookups don't exist in server jobs. If you are doing lookups onto a relational source you could use user-defined SQL to do ...
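The workaround — perform both lookups separately, then apply the OR in the output derivation — can be sketched in plain Python (the two dicts below are stand-ins for the two reference lookups, not DataStage API):

```python
# Illustrative only: each dict stands in for one reference lookup.
primary = {"A1": "widget"}
secondary = {"B7": "gadget"}

def lookup_with_or(key):
    # Do both lookups independently...
    hit1 = primary.get(key)
    hit2 = secondary.get(key)
    # ...then apply the OR in the derivation: take whichever matched.
    return hit1 if hit1 is not None else hit2

print(lookup_with_or("B7"))  # gadget
print(lookup_with_or("A1"))  # widget
```

In a server job the same pattern means two lookup links into the transformer, with the output derivation choosing whichever link returned a row.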
by ArndW
Thu Jul 26, 2007 8:12 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: reading a zip file
Replies: 1
Views: 644

Use an external command stage with the command "gunzip (unknown) -". That way you don't need to store the expanded file on disk and won't get the disk-full condition (at least not from unzipping; you might still get it when you write your output data).
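The idea — decompress as a stream so the expanded data never lands on disk — can be demonstrated with Python's gzip module (the in-memory payload here stands in for the real input file):

```python
import gzip
import io

# Build a small gzipped payload in memory (stands in for the input file).
payload = b"line1\nline2\n"
compressed = gzip.compress(payload)

# Stream-decompress line by line without ever writing the expanded
# data to disk, mirroring what "gunzip ... -" does via stdout.
lines = []
with gzip.open(io.BytesIO(compressed), "rb") as f:
    for line in f:
        lines.append(line)

print(lines)  # [b'line1\n', b'line2\n']
```

Only the compressed file and the current row ever occupy space, which is why this avoids a disk-full condition during unzipping.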
by ArndW
Thu Jul 26, 2007 7:29 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage EE with windows
Replies: 20
Views: 8558

welkh - In order to post a question you should a) use search to see if it has been asked before b) if it hasn't been asked or answered, decide which forum to post in c) go to that forum and post a meaningful subject line and include all necessary details in your problem description. You should also ...
by ArndW
Thu Jul 26, 2007 5:30 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Controling File size in a Sequence Loop
Replies: 4
Views: 1103

The sequential file stage in server jobs will read the whole file; you cannot have it read just a portion. If you really need to limit yourself to a given number of rows per execution and you wish to minimize passes through your source file, then I would suggest you look into the split command to do thi...
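What the split command does — one pass over the source, emitting fixed-size chunks of rows — can be sketched like this (a simplified stand-in for `split -l`, not the utility itself):

```python
import itertools

def split_lines(lines, rows_per_chunk):
    """One pass over the source, yielding fixed-size chunks of rows,
    the same idea as `split -l rows_per_chunk source`."""
    it = iter(lines)
    while True:
        chunk = list(itertools.islice(it, rows_per_chunk))
        if not chunk:
            return
        yield chunk

chunks = list(split_lines(range(10), 4))
print([len(c) for c in chunks])  # [4, 4, 2]
```

Each job run then processes one chunk file, so no run ever re-reads rows handled by a previous one.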
by ArndW
Thu Jul 26, 2007 5:13 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Looking for some advice on Preload file to memory
Replies: 24
Views: 7071

The whole hashed file is loaded into memory before the first row is read.
by ArndW
Thu Jul 26, 2007 1:00 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: cancelling out from "Find /view data"
Replies: 3
Views: 1202

There is no clean way of breaking out of this. I've been frustrated many a time when I made a metadata error and ended up waiting 10-15 minutes for a "view data" to return the first row to me, because it needs to read to the end of a large file. This would make a good enhancement request.
by ArndW
Thu Jul 26, 2007 12:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Additional sequential warnings
Replies: 12
Views: 4901

I don't know where you can change the value you are looking for. I haven't found it in the environment variables or documented anywhere. If all warnings go to the log file it might slow the job speed from "fast" all the way down to "glacial" or even slower, so it would be best to cater for this type of w...
by ArndW
Thu Jul 26, 2007 12:53 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Handling multiple commands inside FilterCommand in Seq File?
Replies: 15
Views: 5653

The format of a zipped file is binary. Various algorithms, including Lempel-Ziv-Welch, are used that take advantage of the fact that different bytes or groups of bytes occur with varying frequencies within a file and that not all 256 possible byte values are used throughout the file (hu...
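The frequency argument is easy to see with Python's zlib (DEFLATE, a related LZ77/Huffman scheme rather than LZW, but the principle is the same): repetitive input shrinks dramatically, while random bytes barely compress at all.

```python
import os
import zlib

# Highly repetitive input: few distinct byte patterns, so the
# compressor exploits the redundancy and the output is tiny.
repetitive = b"ABCD" * 10_000          # 40,000 bytes

# Random input: all 256 byte values roughly equally likely, so there
# is almost no redundancy to exploit.
random_bytes = os.urandom(40_000)      # 40,000 bytes

print(len(zlib.compress(repetitive)))   # far below 1,000 bytes
print(len(zlib.compress(random_bytes))) # close to 40,000 bytes
```

This is also why feeding an already-zipped (binary, high-entropy) file through a line-oriented filter command tends to mangle it.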
by ArndW
Thu Jul 26, 2007 12:35 am
Forum: General
Topic: Rearding interprocess stage
Replies: 5
Views: 1215

Hello manojbh31 and welcome to DSXChange. In order for anyone here to help on this problem you will need to supply some additional information. I assume that you are using the "Interprocess Stage" in your server job. The number of rows that go into this stage will be the same as the number coming ou...
by ArndW
Wed Jul 25, 2007 9:30 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Archiving DataStage PX DataSets
Replies: 6
Views: 3937

You need to set up your environment correctly in order to do that. The $PXHOME/bin directory must be part of your PATH environment variable, otherwise specify the full path to that program. You will also need to specify an APT_CONFIG_FILE in your environment and running $DSHOME/dsenv should set up a...
by ArndW
Wed Jul 25, 2007 9:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Generate the .csv file with date or timestamp
Replies: 14
Views: 9245

I would use a User Variables Activity stage to create a variable with the full command line for the dsjob command, including using the aforementioned function to generate the date as a parameter to that job. Then use the variable in the subsequent command activity stage.
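The command line the User Variables Activity would build might look like this sketch (the project and job names are hypothetical, and the real string would come from the stage's derivation rather than Python):

```python
from datetime import date

# Hypothetical project/job names for illustration only.
project, job = "MyProject", "ExportCsvJob"

# Build the dsjob command line with today's date passed as a
# job parameter, e.g. to suffix the generated .csv filename.
file_date = date.today().strftime("%Y%m%d")
cmd = f"dsjob -run -param FileDate={file_date} {project} {job}"
print(cmd)
```

The subsequent Execute Command activity then runs the finished string held in the user variable.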
by ArndW
Wed Jul 25, 2007 6:29 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Generate the .csv file with date or timestamp
Replies: 14
Views: 9245

Do you want the timestamp to be part of the filename or part of the file contents?
by ArndW
Wed Jul 25, 2007 7:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Getting dropped
Replies: 9
Views: 1802

That derivation is why you are getting dropped records. You need to handle nulls in that derivation as well as in the one you posted earlier. You need to add an IF IsNull... to that clause.