Search found 6797 matches

by DSguru2B
Fri Feb 09, 2007 3:32 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Run Multiple Instances
Replies: 25
Views: 7993

You can have a before-job subroutine ExecSH and pass it a sleep 5. This will ensure that the job starts and waits for 5 seconds before running the main part of the job.
Also, to run multiple instances from the command line, you need to suffix the job name with a dot and the instance name (.x)
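The multi-instance invocation described above can be sketched as follows. The project, job, and instance names here are hypothetical; the ".instanceName" suffix on the job name is the part being illustrated:

```python
# Hypothetical names -- substitute your own project, job, and instance.
project = "MyProject"
job = "LoadCustomers"
instance = "run01"

# A multi-instance job is addressed as "JobName.InstanceName" on the
# dsjob command line; a wrapper script might build the call like this:
cmd = ["dsjob", "-run", "-jobstatus", project, f"{job}.{instance}"]
print(" ".join(cmd))  # -> dsjob -run -jobstatus MyProject LoadCustomers.run01
```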
by DSguru2B
Fri Feb 09, 2007 3:07 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transformer Stage Functions
Replies: 26
Views: 7014

If you are using a server job or a BASIC Transformer, then do this: If (in.Col Matches "...V-0N1N..." OR in.Col Matches "...V0N1N...") then Field(EREPLACE(in.Col, "Valve", "")[INDEX(EREPLACE(in.Col, "Valve", "")...
by DSguru2B
Fri Feb 09, 2007 2:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Calling Environment Variables in Basic Transformer
Replies: 9
Views: 1842

That's ok, buddy. Glad you got it resolved. Just mark it as resolved and have a cup of tea, or coffee :wink:
by DSguru2B
Fri Feb 09, 2007 2:23 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading a Sequential File using Execute Command Stage
Replies: 7
Views: 1410

How are your routine writing skills?
by DSguru2B
Fri Feb 09, 2007 2:22 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Passing Parameters through dsjob function
Replies: 12
Views: 5512

The -local option might be required to pick up environment variables. Granted, I don't use environment variables, hence I cannot comment more on it. The -mode NORMAL option is not required if you just want to run the job. The -mode option is op...
by DSguru2B
Fri Feb 09, 2007 2:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Calling Environment Variables in Basic Transformer
Replies: 9
Views: 1842

gsym wrote:So I declare all the environment variables in the job properties>job parameters and both my start dates and end dates are specified there as environment variables.


Well, if you already have them set up in the job parameters, then why not read them inside the transformer?
by DSguru2B
Fri Feb 09, 2007 1:30 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Calling Environment Variables in Basic Transformer
Replies: 9
Views: 1842

You pass the environment variable as a job parameter.
by DSguru2B
Fri Feb 09, 2007 1:28 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Passing Parameters through dsjob function
Replies: 12
Views: 5512

parimi123 wrote:Did you change working directory to your project directory before executing dsjob
Poorna

I highly doubt that's required, Poorna.
by DSguru2B
Fri Feb 09, 2007 11:04 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: In HashedFile,how to get first row
Replies: 9
Views: 3029

If the entire record is identical then you don't need to worry about which one is first or last.
If only the keys are identical and you want to retain the first one then use the aggregator method as I specified.
End of story.
by DSguru2B
Fri Feb 09, 2007 11:00 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: In HashedFile,how to get first row
Replies: 9
Views: 3029

Group by your key and provide FIRST as the derivation for all the rest of the columns. This will ensure that you get the first row for a duplicate key.
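Outside DataStage, the same group-by-key-keep-FIRST behaviour can be sketched in plain Python (a hypothetical helper, not DataStage code; it relies on rows arriving in order):

```python
def first_per_key(rows, key):
    """Keep only the first row seen for each key value,
    mimicking an aggregator with FIRST derivations on non-key columns."""
    seen = {}
    for row in rows:
        k = row[key]
        if k not in seen:      # later duplicates of the key are ignored
            seen[k] = row
    return list(seen.values())

rows = [
    {"id": 1, "val": "a"},
    {"id": 1, "val": "b"},   # duplicate key -> dropped
    {"id": 2, "val": "c"},
]
print(first_per_key(rows, "id"))
# -> [{'id': 1, 'val': 'a'}, {'id': 2, 'val': 'c'}]
```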
by DSguru2B
Fri Feb 09, 2007 10:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Store User Names, Passwords, DataSource Names
Replies: 20
Views: 6010

You can define them as environment variables in the Administrator and use $PROJDEF in their place inside your stages. This way all you need to do is change the values in the administrator and the jobs will pick up the correct value during runtime.
by DSguru2B
Fri Feb 09, 2007 10:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: teradata enterprise with unknown destination columns
Replies: 21
Views: 6850

What about the dataset? Are all fields defined as notNull there as well?
by DSguru2B
Fri Feb 09, 2007 10:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Difference between a Filter stage and Switch Stage
Replies: 5
Views: 7073

A few major differences:
- The Filter stage can have any number of output links, whereas the Switch stage is limited to a max of 128 links.
- The Filter stage can optionally have a reject link; the Switch stage requires a reject link.
- A Switch stage is like the C switch statement. It goes through all the cases and if...
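The "one matching case, otherwise reject" behaviour of the Switch stage can be sketched in Python (link names are made up; a dict lookup with a default stands in for the C switch):

```python
def route(code):
    """Mimic a Switch stage: each row goes to exactly one matching
    output link; anything unmatched goes to the (required) reject link."""
    cases = {1: "link_A", 2: "link_B"}   # case labels -> output links
    return cases.get(code, "reject")     # default acts as the reject link

print(route(1), route(2), route(42))  # -> link_A link_B reject
```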
by DSguru2B
Fri Feb 09, 2007 10:25 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: In HashedFile,how to get first row
Replies: 9
Views: 3029

Feed the input to an aggregator, grouped by your key and retaining the first value. Output this to a hashed file.
by DSguru2B
Fri Feb 09, 2007 10:09 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Generate numeric duplicate number column
Replies: 6
Views: 1281

You can do this with stage variables. Use hash partitioning to ensure that similar keys end up in one partition. Use the stage variables to compare the present row with the previous row and increment another stage variable which will be used as your counter. Reset it back if there is a change. You need to ta...
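The stage-variable pattern described (compare the current key to the previous one, increment within a group, reset on change) can be sketched in Python. The helper and column names are hypothetical, and the input is assumed already hash-partitioned/sorted by key:

```python
def number_duplicates(rows, key):
    """Assign a running counter per key group, mirroring the
    stage-variable compare/increment/reset pattern.
    Assumes rows are already sorted by the key column."""
    out, prev, counter = [], object(), 0   # object() sentinel never equals a key
    for row in rows:
        counter = counter + 1 if row[key] == prev else 1  # reset on key change
        out.append({**row, "dup_no": counter})
        prev = row[key]
    return out

rows = [{"k": "A"}, {"k": "A"}, {"k": "B"}]
print([r["dup_no"] for r in number_duplicates(rows, "k")])  # -> [1, 2, 1]
```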