Search found 53125 matches

by ray.wurlod
Tue Feb 05, 2008 3:53 am
Forum: General
Topic: from BCP to Oracle using DataStage
Replies: 9
Views: 5125

That's OK, it just means that the data browser doesn't know how to display Char(137). DataStage is probably reading it successfully, so you can convert the Char(137) back to whatever you like within the job, perhaps tab (Char(9)), perhaps "". Whatever is appropriate.
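A minimal Transformer-derivation sketch of that conversion (the column name InLink.TheField is illustrative only):

```
* Replace every occurrence of Char(137) with a tab (Char(9)):
Convert(Char(137), Char(9), InLink.TheField)

* Or strip the character entirely by converting it to the empty string:
Convert(Char(137), "", InLink.TheField)
```

Convert() substitutes characters position-for-position from its first argument to its second, so an empty second argument deletes the listed characters.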
by ray.wurlod
Tue Feb 05, 2008 3:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to convert the server jobs into pX jobs
Replies: 2
Views: 672

There are two methods. 1. Analyze the logic of the server jobs and write parallel jobs to implement that logic. You could, of course, hire a competent consultant to do this. 2. Give IBM a lot of money to do the same for you. They call their "service" "conversion tools", but most of it is outsour...
by ray.wurlod
Tue Feb 05, 2008 3:46 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Could not check all operators because of previous error
Replies: 3
Views: 3493

You are not getting only that error; it never appears in isolation. What other warnings or errors have been logged? Please post the exact messages (copy and paste them verbatim; do not transcribe them, and especially do not add your own interpretations).
by ray.wurlod
Tue Feb 05, 2008 3:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Received SIGPIPE signal caused by closing of the socket
Replies: 10
Views: 18483

Suman wrote: The problem in the job is coming because of '&' in the input file name.
What proof do you have for this assertion?
by ray.wurlod
Tue Feb 05, 2008 3:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: date format issue
Replies: 4
Views: 1354

You need a DateToString() function with a StringToDate() function as its argument. Supply appropriate date format strings to the two functions.
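As a sketch, assuming the input column holds dates such as "31/01/2008" and the target is ISO format (the column name and both format strings are illustrative; substitute your actual formats):

```
* Parse the string with one format, then re-emit it with another:
DateToString(StringToDate(InLink.DateCol, "%dd/%mm/%yyyy"), "%yyyy-%mm-%dd")
```

The inner StringToDate() turns the string into a date value; the outer DateToString() renders that value in the desired format.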
by ray.wurlod
Tue Feb 05, 2008 3:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Partitioning
Replies: 1
Views: 768

You seem to be confusing data partitions (~= processing nodes) with output links from a Filter or Transformer stage. Whatever you were to specify with DML expressions (in a Filter stage) will happen on every processing node - the data are already partitioned, and are being directed onto multiple lin...
by ray.wurlod
Tue Feb 05, 2008 3:36 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dataset Column width
Replies: 3
Views: 840

What did you change so that it works?
by ray.wurlod
Tue Feb 05, 2008 3:34 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Cannot preserve the partitioning of the parallel data set
Replies: 10
Views: 4594

Wherever you set ORACLE_HOME is a good place also to set NLS_LANG. Since you set ORACLE_HOME in the dsenv script for DataStage processes, that would be a good place for NLS_LANG too. Or you could set up NLS_LANG as an environment variable in the Administrator client, and therefore use different valu...
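A hedged sketch of the dsenv additions described above (the ORACLE_HOME path and the NLS_LANG value are examples only; use your site's actual Oracle installation path and character set):

```shell
# Hypothetical lines for the DataStage dsenv script.
# Values shown are placeholders, not recommendations.
ORACLE_HOME=/u01/app/oracle/product/10.2.0
export ORACLE_HOME
NLS_LANG=AMERICAN_AMERICA.UTF8
export NLS_LANG
```

Setting the variable in dsenv makes it global to DataStage processes; setting it per project in the Administrator client lets different projects use different values.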
by ray.wurlod
Tue Feb 05, 2008 3:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Date conversion
Replies: 4
Views: 928

That's exactly what I mean to say. You could even leave it as Timestamp. They're the same to Oracle. You only need to worry about format if you are supplying string data and applying a TO_DATE function.
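For the string case only, a hedged SQL sketch (the table and column names are invented; the format model must match the string you supply):

```sql
-- TO_DATE is needed only when the inserted value is a string:
INSERT INTO orders (order_date)
VALUES (TO_DATE('2008-02-05 03:29:00', 'YYYY-MM-DD HH24:MI:SS'));
```

When the job supplies a Date or Timestamp value directly, no format model is involved.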
by ray.wurlod
Tue Feb 05, 2008 3:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pivot-PX
Replies: 5
Views: 2437

Possibly your Pivot stage is not properly installed or registered. I am not seeing the same issue. You are, I trust, using the parallel Pivot stage, not the server job Pivot stage?

What exact version of DataStage are you using, and on what platform?
by ray.wurlod
Tue Feb 05, 2008 3:24 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Conversion of 32 bit hashed files to 64 bit using a routine
Replies: 7
Views: 1491

You would need to remove the parent directory of DATA.30, which (if executed recursively) would take out DATA.30, OVER.30 and .Type30, and you need to remove D_hashedfile.
Command = "rm -rf " : HashFileName : " && rm -f D_" : HashFileName
Call DSExecute("UNIX&quo...
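A fuller sketch of that routine fragment, with the truncated call completed on the assumption that it uses DSExecute's standard four-argument form (the routine name and warning text are invented, and HashFileName is assumed to hold the hashed file's pathname as seen by the shell):

```
* Remove the hashed file directory and its dictionary file.
Command = "rm -rf " : HashFileName : " && rm -f D_" : HashFileName
Call DSExecute("UNIX", Command, Output, SystemReturnCode)
If SystemReturnCode <> 0 Then
   Call DSLogWarn("Removal failed: " : Output, "ResizeTo64Bit")
End
```

Note this removes only the operating-system files; any account-level (VOC) pointer to the hashed file would need separate handling.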
by ray.wurlod
Tue Feb 05, 2008 3:19 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: BULK LOADER 8.1
Replies: 1
Views: 614

WHEN are you getting this error (for example during design, during compilation, when requesting job execution, during job run)? Is there any more to the error message? Are there any additional error messages? Load Mode is one of the properties of this stage. Have you set its value and, if so, to wha...
by ray.wurlod
Tue Feb 05, 2008 3:16 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Weird Hash File problem
Replies: 3
Views: 1026

For starters, welcome aboard. Please note that correct terminology is hashed file, not hash file. Do you have caching enabled on your lookup? What happens if you remove the lookup from the job? What else is happening in the Transformer stage? Are there any warnings or errors logged?
by ray.wurlod
Mon Feb 04, 2008 11:26 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dataset Column width
Replies: 3
Views: 840

I'm not aware of any documented limit on LongVarChar, and would be very surprised if it were as small as 4KB (because a VarChar can be that big). Suggestion: try VarChar(16384) rather than LongVarChar(16384). When, precisely, does this error occur? It is certainly possible to load VarChar(16384) or L...
by ray.wurlod
Mon Feb 04, 2008 11:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Pivot-PX
Replies: 5
Views: 2437

Please be more exact about which stage types you have used in your design. We need to be precise about which particular "source stage" cannot have an output link. Using version 7.5.1A on Linux, the following job design does not generate the error you report. SequentialFile -----> Pivot -----> Sequen...