Search found 53125 matches

by ray.wurlod
Tue Mar 27, 2007 12:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading Dynamic Vector of SubRecords
Replies: 8
Views: 3740

You are close. Try something like record {final_delim=end, delim=',', quote=none} ( Key:int32 {delim=','}; Col3:string[max=10] {delim=',',link}; rec[]:subrec {reference='Col3'} ( Col1:string[max=10] {delim=':'}; Col2:string[max=10] {deli...
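The snippet above is cut off mid-property, so here is an expanded, hedged sketch of the same record schema (the delimiter on Col2 is assumed to be ':' by symmetry with Col1; everything else is taken from the post):

```
record {final_delim=end, delim=',', quote=none}
(
    Key: int32 {delim=','};
    Col3: string[max=10] {delim=',', link};
    rec[]: subrec {reference='Col3'}
    (
        Col1: string[max=10] {delim=':'};
        Col2: string[max=10] {delim=':'};
    );
)
```

The `link` property marks Col3 as carrying the element count that the variable-length subrecord vector references.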
by ray.wurlod
Tue Mar 27, 2007 12:43 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: IF THEN END IF
Replies: 3
Views: 1535

In an expression there is never an END - this is only used within a Routine to mark the end of a block of statements beginning with THEN or ELSE (or ON ERROR or LOCKED). What exactly are you trying to accomplish? You can certainly nest IF..THEN..ELSE constructs in an expression (to a certain level o...
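As a hedged illustration of the point (link and column names are invented), a nested IF..THEN..ELSE in a Transformer derivation is a single expression with no END anywhere:

```
If InLink.Grade = "A" Then "Excellent" Else If InLink.Grade = "B" Then "Good" Else "Other"
```

The END keyword only appears in Routine code, where THEN/ELSE introduce statement blocks rather than expression values.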
by ray.wurlod
Tue Mar 27, 2007 12:39 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: DRS Stage Problem
Replies: 8
Views: 1559

"hanged" or "hung"? :lol:
by ray.wurlod
Tue Mar 27, 2007 9:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: DSJobStartTime
Replies: 4
Views: 2775

Create a user variables activity ahead of Job 1. Catch the server time in a variable. Make this the last variable in the grid, and its time will be close enough to the start time of Job 1. Pass a reference to this user variable as the parameter value. Do you need a timestamp or just a time? Oconv...
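A hedged sketch of the Oconv idea using standard UniVerse conversion codes (whether this matches the truncated snippet's intent is an assumption): a user variable derivation that captures the server time as a timestamp string could be

```
Oconv(Date(), "D-YMD[4,2,2]") : " " : Oconv(Time(), "MTS")
```

which yields a string of the form YYYY-MM-DD HH:MM:SS; drop the Date() part if only the time is needed.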
by ray.wurlod
Tue Mar 27, 2007 8:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading Dynamic Vector of SubRecords
Replies: 8
Views: 3740

This is very silly.

The link field must be at the same level as the subrecord field, not at the same level as its elements.

You can edit the levels in the record schema on the columns grid; it's probably easier to do it there than in the record schema itself.
by ray.wurlod
Tue Mar 27, 2007 8:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Optimal settings to create a large Hashed file
Replies: 4
Views: 1061

DataStage reports rows flowing when they are flowing. If they are flowing into the cache, fine, you get good rates. But the clock keeps running when the rows are being flushed to disk, even though no more rows are flowing. So the rate appears to diminish. Optimal depends primarily on the combination...
by ray.wurlod
Tue Mar 27, 2007 8:13 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Logic for combination
Replies: 3
Views: 1251

"Fork join" design. Copy stage to generate two copies of the input, then Join stage to re-join them based on key.
by ray.wurlod
Tue Mar 27, 2007 7:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading dataset with different configuration
Replies: 3
Views: 1232

Best practice is always to use the same configuration file with a Data Set as the one with which it was created.
by ray.wurlod
Tue Mar 27, 2007 6:59 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Modify Stage and dataset view problems
Replies: 1
Views: 633

When it "pops up an error message" and invites you to view more, answer Yes and inform yourself about the actual cause of the error.
by ray.wurlod
Tue Mar 27, 2007 6:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading Dynamic Vector of SubRecords
Replies: 8
Views: 3740

I think you may need to define the link field and refer to that link field in the record schema. You can also do this by editing the columns' extended properties in the Columns grid (right click, Edit Row). Since your data do not contain the link value (number of "rows" in vector of subrecords) you ...
by ray.wurlod
Tue Mar 27, 2007 6:54 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: 2GB Limit on Hash File regarding blink error
Replies: 10
Views: 1825

You're wrong. Splitting the data stream will not overcome any 2GB storage limit, unless two separate hashed file stages are used, referring to two separate hashed files. Then there's no guarantee that a key being looked up will be in the correct processing stream.
by ray.wurlod
Tue Mar 27, 2007 6:52 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to Concatenate Data from active stages in server 6.0
Replies: 5
Views: 1248

You can put a sequential file stage or IPC stage between two active stages to overcome the "no active to active" rule.
by ray.wurlod
Tue Mar 27, 2007 6:50 am
Forum: General
Topic: Link Count
Replies: 27
Views: 10932

You cannot. @INROWNUM is a system variable only accessible in a Transformer stage, either upstream or downstream of your Aggregator stage.
by ray.wurlod
Tue Mar 27, 2007 6:48 am
Forum: Information Analyzer (formerly ProfileStage)
Topic: It's checking for analysis server in local mechine
Replies: 2
Views: 2621

Try putting the host IP address into the Host IP Address field rather than the host name field.
by ray.wurlod
Mon Mar 26, 2007 9:08 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading Multiformat Records
Replies: 1
Views: 812

Method 1. Define a four-column record schema, using which you read the file using a Sequential File stage that has a reject link. INS records will pass, DEL records will not, and be sent down the reject link. There place a Column Import stage to re-parse the raw string with a two-column record schem...
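The snippet is truncated, so the exact schemas are assumptions; a hedged sketch of Method 1 with invented column names:

```
-- Pass 1: four-column schema on the Sequential File stage, which has a
-- reject link. Four-field INS records import cleanly; DEL records fail
-- the import and are sent down the reject link as raw strings.
record {delim=','}
(
    RecType: string[3];
    ColA: string[max=20];
    ColB: string[max=20];
    ColC: string[max=20];
)

-- Pass 2: a Column Import stage on the reject link re-parses the raw
-- string with a two-column schema matching the DEL record layout.
record {delim=','}
(
    RecType: string[3];
    KeyValue: string[max=20];
)
```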