Search found 53125 matches

by ray.wurlod
Mon Apr 17, 2006 9:38 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Merge stage warning message
Replies: 6
Views: 6165

De-duplicate the inputs. This is a requirement for the Merge stage where there is more than one update input.

Make sure that the master input does not become exhausted while there are still update inputs to be matched (perhaps by re-assessing which is master).
by ray.wurlod
Mon Apr 17, 2006 9:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Different Input Records to One Record layout
Replies: 4
Views: 1260

Switch, Filter or Transformer stage to split into separate stream for each record type; Modify, Column Generator or Transformer stage to supply missing columns; Funnel stage to bring them all back together.
by ray.wurlod
Mon Apr 17, 2006 9:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Schema file
Replies: 9
Views: 55467

I note that you have also removed the surrounding square brackets.
by ray.wurlod
Mon Apr 17, 2006 9:31 pm
Forum: IBM QualityStage
Topic: getting error when I run a job in QUALITY STAGE DESIGNER
Replies: 9
Views: 4052

Check that you have permission to write into the D:\QProject\POC\Controls folder and access to all the parent folders in its path. A quick test is to get into a DOS shell (via cmd perhaps), CD to that folder and create a file, using the same user ID. It shouldn't matter, but can you try setting thin...
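The quick write-permission test described above can be sketched in Python rather than by hand in a DOS shell; the function tries to create and delete a scratch file in the target folder (the `D:\QProject\POC\Controls` path is the one from the post and is shown only as a commented example):

```python
import os
import tempfile

def can_write(folder):
    """Return True if the current user can create a file in `folder`."""
    try:
        # Create a scratch file in the folder, then clean it up.
        fd, path = tempfile.mkstemp(dir=folder)
        os.close(fd)
        os.remove(path)
        return True
    except OSError:
        # Missing folder or insufficient permission both land here.
        return False

# Example with the folder from the post (adjust to your environment):
# print(can_write(r"D:\QProject\POC\Controls"))
```

Run it under the same user ID that the QualityStage job runs as, otherwise the result tells you nothing about the failing job.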
by ray.wurlod
Mon Apr 17, 2006 3:30 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: insert data into table that is being populated
Replies: 5
Views: 2079

It's unhelpful not having the exact error message. Chances are that you're trying to do too much with too few stages, or even too few jobs, and perhaps getting locking issues in the database. Staging the intermediate data, perhaps into a file as the previous poster suggested, is one way, but you can...
by ray.wurlod
Mon Apr 17, 2006 3:24 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: i enter year through parameter need all days to be displayed
Replies: 4
Views: 996

DataStage is not able to display anything, since it always runs background processes. I assume, therefore, that you want to generate 365 or 366 rows in a server job and have a separate date in each. Your job can be as simple as Transformer ----> SequentialFile. Create a stage variable initialized to ...
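The row-generation logic behind that job (one row per day of a parameterised year) can be sketched outside DataStage in a few lines of Python; this is an illustration of the idea, not the server-job implementation itself:

```python
from datetime import date, timedelta

def days_in_year(year):
    """Yield every date in `year` - 365 rows, or 366 in a leap year."""
    d = date(year, 1, 1)
    while d.year == year:
        yield d
        d += timedelta(days=1)

rows = list(days_in_year(2006))
# 2006 is not a leap year, so this produces 365 dates.
```

In the server job the equivalent is a Transformer constrained to `@OUTROWNUM` rows, with a stage variable holding the running date.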
by ray.wurlod
Mon Apr 17, 2006 3:18 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: architecture of 7.5
Replies: 15
Views: 4065

Still waiting for definition of "perfect answer".
by ray.wurlod
Mon Apr 17, 2006 3:17 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Cannot open executable job file RT_CONFIG623
Replies: 12
Views: 4226

This is unrelated to RT_LOG623 and is unlikely to be corrected by anything in DS.TOOLS.

Search the forum for this particular error message (leave off the particular job number). It has been fully explained in the past. You have a corrupted RT_CONFIG623 hashed file here.
by ray.wurlod
Mon Apr 17, 2006 3:15 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Timestamp
Replies: 4
Views: 2453

DataStage does not have "current timestamp"; it only has current time and current date. Do you want these to move along as rows are processed (in which case use Date() and Time() functions) or to remain constant for the job run (in which case use @DATE and @TIME system variables)? You then have to c...
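The distinction drawn above - a value that moves along per row versus one that stays constant for the whole run - can be illustrated in Python (the variable names here are illustrative; in a server job you would use `Date()`/`Time()` for the per-row case and `@DATE`/`@TIME` for the constant case):

```python
from datetime import datetime

# Per-row: evaluated afresh for each row, so it moves as rows are processed.
moving = [datetime.now() for _ in range(3)]

# Constant for the run: captured once before processing, then reused.
job_start = datetime.now()
constant = [job_start for _ in range(3)]

# Every "row" in the constant list carries the identical timestamp.
assert len(set(constant)) == 1
```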
by ray.wurlod
Mon Apr 17, 2006 3:12 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Passing environment variables into production
Replies: 5
Views: 1292

Project-based environment variables are stored in a text file called DSParams, which is located in the project directory. You can copy DSParams, or relevant lines from it, into DSParams in the production environment (project directory). If you wanted to script this it should be reasonably straightfo...
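Scripting the copy of "relevant lines" from one DSParams to another could look something like the sketch below. DSParams is a plain `name=value` text file, but the variable names (`MYCO_*`) and the merge policy shown here are hypothetical - take a backup and check the file's section structure in your own project before doing this for real:

```python
def merge_params(dev_lines, prod_lines, wanted_prefixes):
    """Append dev `name=value` lines matching a prefix that prod lacks."""
    prod_names = {line.split("=", 1)[0] for line in prod_lines if "=" in line}
    merged = list(prod_lines)
    for line in dev_lines:
        name = line.split("=", 1)[0]
        # Keep production's existing value; only add genuinely new variables.
        if any(name.startswith(p) for p in wanted_prefixes) and name not in prod_names:
            merged.append(line)
    return merged

# Hypothetical example lines, standing in for two DSParams files:
dev = ["MYCO_DB_USER=etl", "MYCO_DB_PWD=secret", "OTHER=1"]
prod = ["MYCO_DB_USER=etl_prod"]
print(merge_params(dev, prod, ["MYCO_"]))
```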
by ray.wurlod
Mon Apr 17, 2006 3:08 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: In Informix DataBase condition for NULL value getting fail
Replies: 12
Views: 2884

Where is the source coming from? Just because you declare a column to be not null in your DataStage job does not enforce non-nullability. For example, if your source is a text file, the Format tab on the Sequential File stage specifies what will be interpreted to be null (by default it is "") and th...
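The point about the Format tab - a text file has no real nulls, only a token (by default the empty string) that gets *interpreted* as null on the way in - can be shown with a small Python sketch; the column names are made up for illustration:

```python
import csv
import io

# Two records; the second has an empty "name" field.
raw = "id,name\n1,Alice\n2,\n"

rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    # Mimic the Sequential File stage's default: "" is interpreted as null.
    rows.append({k: (None if v == "" else v) for k, v in rec.items()})

# rows[1]["name"] is None: the empty field became a null, regardless of
# what the downstream column definition claims about nullability.
```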
by ray.wurlod
Mon Apr 17, 2006 3:03 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading in a binary file
Replies: 9
Views: 3954

Surely "they" produced the file using some kind of metadata or design. Get "them" to supply the file layout(s) to you. "Binary" is OK if the data are fixed-width columns. For example, an integer will occupy four bytes. You can declare it as Char(4) and use Oconv() to turn it into a string of ASCII d...
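The four-bytes-per-integer example above is easy to demonstrate with Python's `struct` module. Note the byte order (big- versus little-endian) is exactly the sort of detail "they" must supply with the file layout - little-endian is assumed here only for the sketch:

```python
import struct

# A fixed-width record: a 32-bit integer followed by a Char(4) field.
# "<" means little-endian - an assumption to confirm with the file's producer.
record = struct.pack("<i", 1234) + b"ABCD"

value, text = struct.unpack("<i4s", record)
# value == 1234, text == b"ABCD"
```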
by ray.wurlod
Mon Apr 17, 2006 2:59 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Mapping in datastage
Replies: 12
Views: 3636

If it's a single mapping file, or even just a few, for multiple data files (each having the same metadata), you could simply create a table definition, or more than one, manually in the Repository and use these as the metadata for various DataStage jobs to process the data. In this way you could avoid...
by ray.wurlod
Mon Apr 17, 2006 2:55 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: performance of hashed file
Replies: 4
Views: 1272

If the size of cache is adequate (no hashed file fails to be cached) then increasing the available size of cache will have no impact whatsoever on the "performance" (whatever that means) of hashed files.
by ray.wurlod
Mon Apr 17, 2006 2:52 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Installing DS 7.5.2 on a laptop to do PARALLEL PROCESSING!!
Replies: 7
Views: 1611

You can run parallel jobs (you need version 7.5x2). Obviously with only one CPU you rapidly run out of processing power. Don't contemplate it for a production environment.