Error with dynamic hashfiles

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

pse021
Participant
Posts: 7
Joined: Thu Mar 07, 2002 2:12 am
Location: Switzerland
Contact:

Error with dynamic hashfiles

Post by pse021 »

Hi All,

I am getting the following error in my job log and am having difficulty eliminating it:
Program "DSD.UVOpen": Line 456, Unable to allocate Type 30 descriptor, table is full.

It happens only from time to time. When the job is rerun a few seconds after the error, the problem does not recur.

The hashed file is small, no file system is full, and there is no problem on the CPU side. I did not see anything wrong on the system.

Any idea?

Thanks in advance,

SP
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Search the forum for the T30FILES and MFILES discussions; your problem is that, at that instant, the system was unable to open or work with any more dynamic hashed files.
Last edited by kcbland on Wed Dec 14, 2005 10:43 am, edited 1 time in total.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
pse021
Participant
Posts: 7
Joined: Thu Mar 07, 2002 2:12 am
Location: Switzerland
Contact:

Post by pse021 »

Hi Kenneth.

We will try to increase the number of dynamic hashed files that can be open at once by modifying the UniVerse config file.

At the same time, we will review the control jobs to check that DSDetachJob is executed properly at the end of each job run.
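That change can be sketched as follows. The file names and values below are illustrative only, and the demo edits a scratch copy rather than the live file; on a real server you would edit the T30FILES line in $DSHOME/uvconfig, then (as I understand the UniVerse procedure) regenerate the shared memory segment with uvregen and restart the engine for the new value to take effect.

```shell
# Sketch only: bump T30FILES in a scratch copy of uvconfig
# (values and paths are made up for illustration).
printf 'MFILES 50\nT30FILES 200\n' > /tmp/uvconfig.demo

# double the Type 30 descriptor table from 200 to 400
sed 's/^T30FILES .*/T30FILES 400/' /tmp/uvconfig.demo > /tmp/uvconfig.new
mv /tmp/uvconfig.new /tmp/uvconfig.demo

# confirm the new setting
grep '^T30FILES' /tmp/uvconfig.demo
```

Note that T30FILES governs the shared Type 30 (dynamic file) descriptor table for the whole engine, so the new value affects every project and every concurrently running job, not just the one that logged the error.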

Many thanks for your quick reply,
Sebastien
ray.wurlod
Participant
Posts: 54595
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The exact problem is that, when the error occurred, you already had as many dynamic hashed files open as permitted by the T30FILES tunable.

You need to increase T30FILES to prevent recurrence of this problem.

Remember that most of the Repository tables are dynamic hashed files; these are included in the count.
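To get a feel for how quickly that count grows, you can enumerate the dynamic files on disk: each Type 30 hashed file is a directory containing DATA.30 and OVER.30 files, so counting DATA.30 entries under a project counts its dynamic files. The sketch below builds a made-up scratch tree purely for illustration; on a real system you would point find at the project directory instead.

```shell
# Illustration with fabricated paths: simulate a project holding
# two dynamic hashed files, each a directory with a DATA.30 inside.
mkdir -p /tmp/proj.demo/HF1 /tmp/proj.demo/HF2
touch /tmp/proj.demo/HF1/DATA.30 /tmp/proj.demo/HF2/DATA.30

# count the Type 30 files under the tree (tr strips wc's padding)
find /tmp/proj.demo -name DATA.30 | wc -l
```

Remember that this on-disk count understates the pressure on T30FILES, since Repository tables opened by running jobs are counted against the same table.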
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.