Search found 3 matches

by pse021
Fri Apr 07, 2006 5:56 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: IO system issue
Replies: 1
Views: 769

The amount is higher for 2 or 3 jobs, which could be considered "critical". For example, one of the most critical log files belongs to a job that parses COBOL files, one file per run, averaging around 5 files per minute. Each run writes 10 log lines (2 for the auto-purge because...
by pse021
Wed Dec 14, 2005 10:15 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error with dynamic hashfiles
Replies: 3
Views: 874

Hi Kenneth.

We will try to increase the number of open hashfiles by modifying the UniVerse config file.
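For reference, the UniVerse parameter that caps how many dynamic (Type 30) hashed files can be open at once is `T30FILE` in the `uvconfig` file, and raising it is a common remedy for the "Unable to allocate Type 30 descriptor, table is full" error. A minimal admin sketch, assuming a Unix install with `$UVHOME` pointing at the UniVerse home directory (the paths and the example value are illustrative, not from the original post):

```shell
# Show the current limit on concurrently open dynamic (Type 30) files.
grep '^T30FILE' "$UVHOME/uvconfig"

# Edit uvconfig and raise T30FILE (e.g. from 200 to 512). Then, with all
# UniVerse/DataStage processes stopped, regenerate the shared-memory
# configuration so the new value takes effect:
"$UVHOME/bin/uvregen"
```

After running uvregen, restart the UniVerse/DataStage services before re-running the jobs.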

At the same time, we will review the Control Jobs to check that DSDetachJob is properly called at the end of each job execution.

Many thanks for your quick reply,
Sebastien
by pse021
Wed Dec 14, 2005 9:29 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error with dynamic hashfiles
Replies: 3
Views: 874

Error with dynamic hashfiles

Hi All, I am getting the following error in my job log and am having difficulty eliminating it: Program "DSD.UVOpen": Line 456, Unable to allocate Type 30 descriptor, table is full. It happens only from time to time; on a second run a few seconds after the error, the problem does not occur. The ha...