Search found 15603 matches

by ArndW
Fri Jul 08, 2005 2:04 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Errors while processing data from a Sequential File Stage
Replies: 1
Views: 760

yaminids,

you have unbalanced quotes in line 3 of the file or a bad quote definition. Most likely you need to change your quote character definitions in the DataStage sequential file read. If you can't find the problem, just post line 3 to this thread and someone might be able to help.
by ArndW
Fri Jul 08, 2005 2:00 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job Process killing
Replies: 6
Views: 2305

A Server restart is not always necessary, but it is the simplest method. Just remember: when killing UNIX processes, never use kill -9. If this happens to you often, activate the deadlock daemon in DataStage so that it removes stale locks for you.
by ArndW
Fri Jul 08, 2005 1:30 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Trim the string with space.
Replies: 10
Views: 17577

Sorry, because the function has a variable number of parameters you will need to specify all 3, as the "A" option is the last one.

Use: TRIM(MyString,' ','A') and it will work.
by ArndW
Fri Jul 08, 2005 1:28 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Handling data in UTF8 Character Set
Replies: 1
Views: 779

hello dhletl, normally seeing a "?" in a DataStage NLS context does not necessarily mean that the value really is a question mark; many editors and viewing programs are not NLS-enabled and will convert UTF-8 non-Latin characters to a "?" in their output. This also applies to the DS view-data windows wh...
by ArndW
Fri Jul 08, 2005 1:22 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Trim the string with space.
Replies: 10
Views: 17577

The TRIM function has several options, in your case you would want to use the Trim (Arg1,"","A") syntax to remove all occurrences. You might want to consider using the REPLACE function as well.
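As a sketch of the three-argument form discussed above (the string value is illustrative):

Code:

Trim(MyString, " ", "A")   ;* remove ALL spaces, embedded ones included

* Other type letters include "L" (leading), "T" (trailing),
* "B" (both ends) and "R" (leading, trailing and redundant).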
by ArndW
Thu Jul 07, 2005 10:48 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: usage of Common block across transformer
Replies: 20
Views: 6269

Shantanu, see a bit earlier for the use of the $INCLUDE {file} {include} statement. I was surprised that you could skip the {file} part, but the BASIC manual states: "When program is specified without filename, program must be a record in the same file as the program currently containing the $IN...
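A minimal sketch of the two forms (the file and record names here are made up for illustration):

Code:

* Two-argument form: include record COMMON_DEFS from file MYBP
$INCLUDE MYBP COMMON_DEFS

* One-argument form: COMMON_DEFS must then be a record in the
* same file as the program containing the $INCLUDE
$INCLUDE COMMON_DEFS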
by ArndW
Thu Jul 07, 2005 10:35 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Ongoing Performance Problems with DS 7x.
Replies: 1
Views: 676

David,

a "discreet connection" wouldn't use a Yahoo mail address, or post this type of message in an inappropriate forum with a misleading title.

Administrators -> PLEASE REMOVE THIS THREAD as it is duplicated in the looking for talent forum.
by ArndW
Thu Jul 07, 2005 9:22 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: usage of Common block across transformer
Replies: 20
Views: 6269

talk2shaanc, I'm glad you are confident that you can do it. I am not; I just wrote a test job, and the COMMON is not shared between the job and its after-job (or before-job) routine; these are different processes. So you will have to explicitly pass the values in through the after-job call or write ...
by ArndW
Thu Jul 07, 2005 8:59 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: usage of Common block across transformer
Replies: 20
Views: 6269

Craig,

bingo, I missed that.
by ArndW
Thu Jul 07, 2005 8:56 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: usage of Common block across transformer
Replies: 20
Views: 6269

You do not need to use $INCLUDE. Just make sure that in both places you define the COMMON block with the same name and the same number of variables (yes, you can change the names around; common blocks are interpreted positionally). The manual might state that it makes sense to use $INCLUDE; the synt...
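A sketch of the positional matching described above (block and variable names are illustrative):

Code:

* In routine A:
COMMON /MyBlock/ Counter, LastKey

* In routine B - same block name, same number of variables,
* but different variable names; matching is by position:
COMMON /MyBlock/ RowCount, PrevKey

* RowCount in B refers to the same storage as Counter in A.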
by ArndW
Thu Jul 07, 2005 8:36 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: usage of Common block across transformer
Replies: 20
Views: 6269

The COMMON block works wonderfully well when you are working within one process, but COMMON is not shared across processes. So your method will only work if both transformers are actually in the same process. In Server this is the case, so it will work for (I just wrote a test job to make sure that...
by ArndW
Thu Jul 07, 2005 6:42 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Hash file writing slowing down suddenly
Replies: 11
Views: 4162

Viswanath, I'm sorry about getting the posts and posters mixed up; I thought this was the thread where a read was slowing down and we were trying to clear that up! I would let the file fill up to maximum size, use any of the utilities available to get the optimal group counts (HASH.HELP, ANALYZE.FILE...
by ArndW
Thu Jul 07, 2005 5:21 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Hash file writing slowing down suddenly
Replies: 11
Views: 4162

Viswanath, the 80-20 split and merge algorithm is usually correct; changing the point in time at which a split or merge happens while large changes are being made to the file doesn't affect the actual time needed; it is best to set the MINIMUM.MODULUS high so that a split doesn't need to happe...
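If memory serves, a dynamic hashed file's minimum modulus can be raised from the TCL prompt along these lines (the file name and value are illustrative; check the exact syntax against your UniVerse documentation):

Code:

CONFIGURE.FILE MyHashFile MINIMUM.MODULUS 40011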
by ArndW
Thu Jul 07, 2005 12:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: link information in parallel job
Replies: 6
Views: 1789

rgandra,

DSGetJobInfo is not an array but a function, and it needs to be declared before you use it. What you need to insert into your source code is the statement

Code:

$INCLUDE DSINCLUDE JOBCONTROL.H


which defines that function call.
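Once the include is in place, a call might look like this sketch (the variable name is illustrative; DSJ.ME and DSJ.JOBSTATUS are constants defined in JOBCONTROL.H):

Code:

$INCLUDE DSINCLUDE JOBCONTROL.H

* Ask for the status of the current job
Status = DSGetJobInfo(DSJ.ME, DSJ.JOBSTATUS)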
by ArndW
Thu Jul 07, 2005 12:42 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Wanna keep a set of jobs in loop
Replies: 6
Views: 1637

december786, if you put in one extra tier of sequence or job calls you can effect this. In your top job you call just one other job inside the loop, and wait for it to return before processing the next record in your source file. In your second-level job you call up all your detail processing jobs in sequence...
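The inner call-and-wait step could be driven from job control BASIC roughly as follows (the job name and status handling are illustrative, not from the thread):

Code:

$INCLUDE DSINCLUDE JOBCONTROL.H

* Attach, run and wait for one detail job per source record
hJob = DSAttachJob("DetailJob", DSJ.ERRFATAL)
ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(hJob)
Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
ErrCode = DSDetachJob(hJob)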