Search found 15603 matches

by ArndW
Tue Nov 23, 2010 9:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Find Number of years between 2 dates.
Replies: 11
Views: 13750

The function DaysSinceFromDate will give you the number of days between the two dates. Divide by 365 and you should be good to go. Does the one day, 0.00274 of a year, really make a difference if you round to one decimal place?
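In DataStage the expression would be along the lines of DaysSinceFromDate(ToDate, FromDate)/365; as a quick Python sketch of the arithmetic (the function name and dates here are illustrative, not the real DataStage call):

```python
from datetime import date

def years_between(d1: date, d2: date) -> float:
    """Approximate years between two dates by dividing the day count by 365."""
    days = (d2 - d1).days
    return round(days / 365, 1)

# One leap day off only shifts the result by ~0.00274 of a year,
# which disappears when rounding to one decimal place.
print(years_between(date(2000, 1, 1), date(2010, 1, 1)))  # 10.0
```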
by ArndW
Tue Nov 23, 2010 9:40 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to read BODY data by using sequential file
Replies: 3
Views: 1585

Craig's solution is even simpler and more foolproof :P
by ArndW
Tue Nov 23, 2010 8:56 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: GET FILENAME
Replies: 37
Views: 9673

He means that job parameters could also be viewed as having global scope, but since their values cannot be changed within the job at runtime they are not really variables, but more like constants.
by ArndW
Tue Nov 23, 2010 8:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to read BODY data by using sequential file
Replies: 3
Views: 1585

Declare your source file to have just one column, an unterminated VarChar. If INDEX(In.BigColumn,',',1) is greater than zero, the line contains one or more commas and you can parse the columns using FIELD(In.BigColumn,',',1), FIELD(In.BigColumn,',',2), ...
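In Python terms, INDEX and FIELD behave roughly like the helpers below (a sketch of the BASIC semantics, not the real functions; both use 1-based positions, and INDEX returns 0 when the substring is absent):

```python
def index_(s: str, sub: str, occurrence: int) -> int:
    """Model of BASIC INDEX(): 1-based position of the nth occurrence, 0 if absent."""
    pos = 0
    for _ in range(occurrence):
        pos = s.find(sub, pos) + 1  # find() returns -1 when not found -> pos == 0
        if pos == 0:
            return 0
    return pos

def field(s: str, delim: str, n: int) -> str:
    """Model of BASIC FIELD(): the nth delimited field, 1-based; empty if out of range."""
    parts = s.split(delim)
    return parts[n - 1] if n <= len(parts) else ""

line = "a,b,c"
if index_(line, ",", 1):  # non-zero means the line has at least one comma
    print(field(line, ",", 1), field(line, ",", 2))  # a b
```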
by ArndW
Tue Nov 23, 2010 8:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Logic in transformer required
Replies: 4
Views: 1571

While one can store values from previous records in stage variables, one cannot change the output of a previous record based on a computation in the current one. You will need to re-sort your data in descending order on the key column to make this type of processing work.
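A small Python sketch of the idea (hypothetical data; the carried value plays the role of a stage variable): after sorting descending on the key, the record that logically follows arrives first, so a computation that needs the "next" record's value can simply use the value carried over from the row just processed.

```python
rows = [("k1", 10), ("k2", 20), ("k3", 30)]
rows_desc = sorted(rows, key=lambda r: r[0], reverse=True)

out = []
prev_value = None  # plays the role of a stage variable
for key, value in rows_desc:
    # In descending order, prev_value is the value of the record that
    # *follows* this one in ascending key order -- the look-ahead we need.
    lower_than_next = prev_value is not None and value < prev_value
    out.append((key, value, lower_than_next))
    prev_value = value

print(out)  # [('k3', 30, False), ('k2', 20, True), ('k1', 10, True)]
```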
by ArndW
Tue Nov 23, 2010 8:39 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to reference input columns listed in schema file
Replies: 7
Views: 2870

Just declare "dollar_cd" on the input link of the Transformer stage with the appropriate data type and length, and then you can work with that column, assuming it is being delivered via RCP. You can check this by turning on OSH_PRINT_SCHEMAS to see which columns are being pushed through each...
by ArndW
Tue Nov 23, 2010 8:33 am
Forum: General
Topic: Backup Automation
Replies: 3
Views: 1573

I mentioned "istool" because I think one can do category-level exports from the command line, but I can't confirm that right now - it is documented in the PDFs.
by ArndW
Tue Nov 23, 2010 8:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sftp ing the files and deleting files on remote server.
Replies: 17
Views: 11644

I recall reading that sftp doesn't allow remote file actions such as delete, only local ones, so I don't think this can be done - but it would certainly be worth a try. At worst you get an error message; at best, a solved problem!
by ArndW
Tue Nov 23, 2010 8:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Logic in transformer required
Replies: 4
Views: 1571

You have a variable called @INROWNUM that gives you the input row number, or you can use a stage variable's "initial" value to detect the first row. In your case you need values from the previous row, so you will need to store them in stage variables. I am not quite s...
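The previous-row pattern can be sketched in Python (names and data are illustrative; in the real Transformer these would be stage variables, evaluated top to bottom on every row, with the "previous value" variable assigned last):

```python
rows = [5, 7, 7, 9]

results = []
sv_prev = 0  # stage variable with an "initial" value
for rownum, value in enumerate(rows, start=1):  # rownum mirrors @INROWNUM
    is_first = rownum == 1
    changed = (not is_first) and value != sv_prev
    results.append((rownum, value, changed))
    sv_prev = value  # updated last, so the comparison above saw the previous row

print(results)  # [(1, 5, False), (2, 7, True), (3, 7, False), (4, 9, True)]
```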
by ArndW
Tue Nov 23, 2010 6:30 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex flat file reading
Replies: 5
Views: 1791

A filename ending in ".dat" doesn't really mean anything to anyone here - it can be any format. Even simple files can be read with the Complex Flat File stage; just point the stage at the file and define your columns.

What have you tried so far and what were your problems?
by ArndW
Tue Nov 23, 2010 6:04 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Count Of records
Replies: 12
Views: 3093

DataStage stages are designed to execute one row at a time, so your requested approach is difficult to implement without having it executed once per row. The best way has already been suggested: get the count in a job sequence and then pass that value as a parameter to your job. In yo...
by ArndW
Tue Nov 23, 2010 3:26 am
Forum: General
Topic: Backup Automation
Replies: 3
Views: 1573

Hello Balu6 and welcome to DSXChange. As part of the signup process, and before posting to a new forum, one should follow the recommendations, amongst which is this post. The Search function is indeed powerful; entering "backup by category" or "istool" or merely "backup"...
by ArndW
Tue Nov 23, 2010 2:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sftp ing the files and deleting files on remote server.
Replies: 17
Views: 11644

In pull mode you can't execute remote commands, so you will have to "ssh" the delete command some other way in your job - either as an after-job shell call or from the calling job sequence.
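A sketch of what that after-job call could look like (host, user, and path are placeholders; assumes key-based ssh authentication is already set up between the two servers):

```shell
# Run via an after-job ExecSH routine or a command activity in the sequence:
# delete the file on the remote server once the pull has completed.
ssh etluser@remote.example.com "rm -f /data/outbound/source_file.dat"
```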
by ArndW
Tue Nov 23, 2010 2:29 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Return codes for controlled looping in Data Stage 8.1
Replies: 1
Views: 2497

What is not working in the job sequence's check of the return code? That functionality is widely used and unlikely to be buggy; perhaps your shell script / SQL code is not returning a non-zero return value? Since you can read stdout into a variable in your sequence, you might be able to parse that s...
by ArndW
Tue Nov 23, 2010 2:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading Multiple record types in a file using CFF stage
Replies: 4
Views: 2462

I'm currently not at a DataStage client where I can post the exact steps; perhaps someone else can post them, or you could look at the documentation for the CFF stage, where the procedure is described. Just as in a COBOL program, you use one field to determine which record type is being used in that ...