Search found 53125 matches

by ray.wurlod
Tue Jul 18, 2006 1:59 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Passing UTF-8 data thru Named Pipes
Replies: 2
Views: 2109

It's not just a workaround; it's the appropriate technique for guaranteeing that what comes out of the pipe is precisely what goes in. With any other map you run the risk of unmappable characters being encountered. No-one wants to, needs to, or even can access the rows while they're in the pipe, so ...
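A small illustrative Python sketch of the point being made: UTF-8 data round-trips exactly when treated as opaque bytes (the pass-through approach), while forcing a narrower character map risks failing on unmappable characters. The sample string is hypothetical.

```python
# A string containing multi-byte UTF-8 characters (illustrative sample).
text = "Grüß dich"
raw = text.encode("utf-8")

# Treated as opaque bytes, what goes in is byte-for-byte what comes out.
round_tripped = bytes(raw)
assert round_tripped == raw

# Forcing a narrower map (here ASCII) can hit unmappable characters.
try:
    raw.decode("ascii")
    ascii_ok = True
except UnicodeDecodeError:
    ascii_ok = False
```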
by ray.wurlod
Tue Jul 18, 2006 1:56 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Help on tuning the job
Replies: 2
Views: 1255

An IPC stage, or even inter-process row buffering, will only help if the machine has spare CPU capacity. Other than that, is there a particular requirement for using DRS? Have you tried a comparison using ODBC stage? It's a thinner interface to the same ODBC driver used in each case. Do try the benc...
by ray.wurlod
Tue Jul 18, 2006 1:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Concatenating three columns
Replies: 6
Views: 1354

But it IS a different question, because...
by ray.wurlod
Tue Jul 18, 2006 1:42 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Removing duplicates from 20 million records
Replies: 24
Views: 7528

QualityStage can perform single file unduplication (even using fuzzy matching criteria) as well as two-file matches and removal of duplicates therefrom using various strategies.
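For the exact-match case, single-file unduplication amounts to keeping the first row seen per key. A minimal Python sketch of that idea (illustrative only; QualityStage's fuzzy matching strategies are far more sophisticated than this, and the function name is hypothetical):

```python
def unduplicate(rows, key_fields):
    """Exact-match unduplication: keep the first row seen for each key."""
    seen = set()
    kept = []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept
```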
by ray.wurlod
Mon Jul 17, 2006 9:11 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: concat
Replies: 4
Views: 1039

This is more complex than it seems. In case 2 there is a missing value, so two consecutive delimiters are required. This is where some dynamic array functions or a routine will be useful. You need to append empty values until each new item is found, then append that item, with intervening delimiters. Assuming that Fie...
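The "empty values with intervening delimiters" idea can be sketched in Python (illustrative only; the function name is hypothetical, and the DataStage version would use dynamic array functions in a routine):

```python
def assemble(items, delim=","):
    """Build a delimited string where missing positions become empty fields.

    items: dict mapping 1-based position -> value.  A missing position
    yields an empty field, so two delimiters appear side by side.
    """
    width = max(items) if items else 0
    return delim.join(str(items.get(i, "")) for i in range(1, width + 1))
```

With positions 1 and 3 present but 2 missing, the output carries the doubled delimiter described above.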
by ray.wurlod
Mon Jul 17, 2006 7:19 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Transformation Rules/Logic
Replies: 12
Views: 2337

col_code is not the key to the hashed file.

Therefore it can NOT be used for lookup via the Hashed File stage.

You need to create a hashed file in which col_code IS the key column.
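In dictionary terms, a hashed file lookup is an exact match on the key column, which is why the file must be keyed on the column you intend to look up by. A hypothetical Python sketch of the same constraint:

```python
def build_lookup(rows, key_field):
    """Index rows on the field you intend to look up by.

    Like a hashed file, this supports only exact-match lookup on that key;
    looking up by any other column requires building a differently keyed index.
    """
    return {row[key_field]: row for row in rows}
```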
by ray.wurlod
Mon Jul 17, 2006 5:10 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Fixed width columns on the target file.
Replies: 7
Views: 2772

Define additional, intermediate Char(1) columns and set the value of each to ",". Generate imaginative names (like Filler1, Filler2, or Comma1, Comma2 and so on) for these columns. In the Sequential File stage that writes the file, specify fixed-width format. Use the Display field in the Columns gri...
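The effect of interleaving Char(1) filler columns with fixed display widths can be sketched in Python (illustrative only; the function name and widths are hypothetical, and in DataStage this is done declaratively in the Columns grid):

```python
def fixed_width_record(values, widths, filler=","):
    """Pad each value to its display width, with a one-character filler between fields."""
    padded = [str(v).ljust(w)[:w] for v, w in zip(values, widths)]
    return filler.join(padded)
```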
by ray.wurlod
Mon Jul 17, 2006 5:07 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Lookup
Replies: 1
Views: 530

Two lookups. One to a table or hashed file populated from the CFF (since lookups against the CFF directly are not permitted), the other to T2. Set up your reference key expressions (three keys on each lookup) appropriately. The results of each lookup can be constrained - if required - on the output ...
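A three-key reference lookup is, in dictionary terms, an exact match on a composite tuple key. A hypothetical Python sketch (function names are illustrative):

```python
def build_composite_lookup(rows, key_fields):
    """Index rows on a tuple of key fields; all keys must match for a hit."""
    return {tuple(r[f] for f in key_fields): r for r in rows}

def lookup(table, *keys):
    """Exact match on the full composite key; None when no reference row matches."""
    return table.get(tuple(keys))
```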
by ray.wurlod
Mon Jul 17, 2006 5:03 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Transformation Rules/Logic
Replies: 12
Views: 2337

To make it clear WHY they advised thus, a hashed file can only be looked up using an exact match on its key column(s). You are trying to do a "reverse lookup", which is not supported by the Hashed File stage (for the reason above). You could use a UV stage to perform a reverse lookup against your ha...
by ray.wurlod
Mon Jul 17, 2006 4:59 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Revisit : Duplicate
Replies: 6
Views: 1055

Could this be designed with three consecutive Transformer stages, with the intermediate links carrying extra columns containing the results of the upstream lookup(s), and the final decision being made in the third Transformer stage? I haven't given it a great amount of thought but, in your place, I'd...
by ray.wurlod
Mon Jul 17, 2006 4:54 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: hash file size more than 2 GB
Replies: 3
Views: 897

If you insist on hashed files > 2GB as your chosen design, then you will need to create those hashed files with 64-bit internal pointers (or resize them so that they have 64-bit internal pointers). This has been discussed many times; you can search the forum for how to accomplish either task. By the...
by ray.wurlod
Mon Jul 17, 2006 4:51 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Handling of EDI File format
Replies: 3
Views: 1473

Of course it can, if you know the format. You can read each line as a single VarChar, and process (segregate?) the different line types via a Transformer stage. The rest is up to you.
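The "read each line as a single VarChar, then segregate by line type" approach can be sketched in Python. Assumption: the line type is the token before the first `*` separator, as in common EDI segment layouts; the function name is hypothetical:

```python
def segregate(lines):
    """Group raw EDI lines by segment tag (assumed to precede the first '*')."""
    groups = {}
    for line in lines:
        tag = line.split("*", 1)[0]
        groups.setdefault(tag, []).append(line)
    return groups
```

In a server job the equivalent is a Transformer stage whose output-link constraints test the leading token of the VarChar.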
by ray.wurlod
Mon Jul 17, 2006 4:50 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Redbrick ODBC
Replies: 2
Views: 830

Welcome aboard. :D Unlike most ODBC data sources, the Red Brick driver is not located by DataStage in the usual way. You will need to have the environment variables RB_CONFIG and RB_HOME set. You were right to copy the Red Brick libraries to the new DataStage server, but you need to set all needed R...
by ray.wurlod
Mon Jul 17, 2006 4:40 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Can we run a DS Job not sequence from Mainframes
Replies: 10
Views: 2002

There are two choices. The Enterprise MVS Edition generates COBOL and JCL, which are transferred to the mainframe and executed there. These are only "mainframe" jobs, quite different from both server and parallel jobs. The other choice is UNIX Support Services (USS) in which part of the mainframe is...
by ray.wurlod
Mon Jul 17, 2006 4:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Writing correct English in this forum when asking questions!
Replies: 21
Views: 5302

mctny wrote:Thank you guys, I guess I am the only Turkish person here, :)


Halit is also Turkish, though based in Australia.