Hash File Write failed for record id

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

logic
Participant
Posts: 115
Joined: Thu Feb 24, 2005 10:48 am

Hash File Write failed for record id

Post by logic »

Hi,
One job in my sequence simply reads a source table and writes to a hash file. When reading from the source table, only records for the past 100 days are selected by a WHERE clause. All the jobs give the expected results, and even the number of records written to the hash file is as expected. Still, the job, and therefore the sequence, fails with the following error:

Code: Select all

CW_HR_delPS_CW_TA_CARD_HDR_Source..HSH_PS_CW_TA_CARD_HDR_Source.Lkp_PS_CW_TA_CARD_HDR: WriteHash() - Write failed for record id '59812
2003-11-15 00:00:00'
The where clause is

Code: Select all

END_DT BETWEEN SYSDATE - 100 AND SYSDATE
What I do not understand is why the record from 2003 would be selected given the WHERE clause above, and why the write to the hash file would fail.
Any comments would be helpful, please.
Thanks,
Ash.
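For reference, Oracle's SYSDATE carries a time-of-day component, so it is worth confirming what window the clause actually evaluates to and whether the 2003 row can really satisfy the predicate. A minimal SQL*Plus sketch; the connection string is a placeholder and the table name PS_CW_TA_CARD_HDR is only guessed from the job name in the error:

Code: Select all

sqlplus -s user/password@db <<'EOF'
-- Show the exact window SYSDATE - 100 .. SYSDATE evaluates to right now
SELECT TO_CHAR(SYSDATE - 100, 'YYYY-MM-DD HH24:MI:SS') AS window_start,
       TO_CHAR(SYSDATE,       'YYYY-MM-DD HH24:MI:SS') AS window_end
FROM   dual;

-- Check whether the suspicious 2003 key is matched by the predicate at all
-- (table name is an assumption)
SELECT END_DT
FROM   PS_CW_TA_CARD_HDR
WHERE  TRUNC(END_DT) = DATE '2003-11-15'
AND    END_DT BETWEEN SYSDATE - 100 AND SYSDATE;
EOF
If the second query returns nothing, the predicate is sound and the 2003 row must have entered the job some other way, for example from a stale hashed file that was not cleared between runs.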
pnchowdary
Participant
Posts: 232
Joined: Sat May 07, 2005 2:49 pm
Location: USA

Post by pnchowdary »

Hi,

What database are you extracting the records from?
Thanks,
Naveen
logic
Participant
Posts: 115
Joined: Thu Feb 24, 2005 10:48 am

Post by logic »

Oracle using DRS.
...and interestingly, the job does not generate any warnings if I run it standalone.
Thanks.
pnchowdary
Participant
Posts: 232
Joined: Sat May 07, 2005 2:49 pm
Location: USA

Post by pnchowdary »

Hi,

Try running the same SQL directly in Oracle and see whether it still selects that 2003 record.
Thanks,
Naveen
logic
Participant
Posts: 115
Joined: Thu Feb 24, 2005 10:48 am

Post by logic »

No, it doesn't...
logic
Participant
Posts: 115
Joined: Thu Feb 24, 2005 10:48 am

Post by logic »

Hi,
When I executed the job once again, it didn't generate the error. Anyway, I am just going to remove the job from the sequence to be on the safe side. Since the number of records is small and I am only using this job to create a lookup for the next job, I will eliminate it. Performance is not an issue here.
The question still remains: why did it select the 2003 record? :x
Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

The failure to write to the hashed file is usually indicative of a permissions problem (in which case you can't write ANY record to the hashed file) or a corrupted group (page) within the hashed file.

Usually you delete and re-create the hashed file on each run, so the newly-created hashed file should be OK, but if you have a bad spot on the disk you might still see the problem, perhaps on a different page.

As to how the date has become 2003, you need to trace the Transformer stage (Tracing tab in Job Run Options), capturing in-and-out data, and/or run the job in the Debugger. Is there, for example, a transformation applied to the date before it becomes a key column in the hashed file?

You might also like to send the output to a text file, which you can examine with a hex editor, for example to determine whether any non-printing character forms part of the data.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
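Ray's hex-editor check can also be done from the command line. A minimal sketch, where lookup_dump.txt is a hypothetical name for the exported text file:

Code: Select all

# Dump the file byte by byte; non-printing characters show up as
# backslash escapes or octal codes
od -c lookup_dump.txt | head -40

# With GNU grep built with PCRE support, flag any line containing
# a byte outside printable ASCII
grep -nP '[^\x20-\x7e]' lookup_dump.txt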
logic
Participant
Posts: 115
Joined: Thu Feb 24, 2005 10:48 am

Post by logic »

Hi,
My apologies for posting this topic in the wrong forum.
No, there is no transformation being applied to the date.
The date did not become 2003 during transformation; rather, it was selected from the source in spite of the WHERE clause. On rerun the 2003 date was not selected, nor is it being selected when I run the job now, so I don't know how I can trace the input and output at the Transformer stage. Is there any way to trace what happened in the previous run?
I wrote the data to a text file and examined it; it looks fine.
Thanks
Triton46
Charter Member
Posts: 83
Joined: Fri Feb 07, 2003 8:30 am

Post by Triton46 »

I've got this error and it appears to be permissions-related. I (the owner) can create/drop the hash file, but my subordinate (a user with group permissions) cannot. How do I rectify this?
Triton46
Charter Member
Posts: 83
Joined: Fri Feb 07, 2003 8:30 am

Post by Triton46 »

Code: Select all

NewDDWHash:
total 42
drwxrwsr-x 2 myaccount dstage 96 Nov 3 2005 ./
drwxrwsr-x 152 dsadm dstage 5120 Oct 6 15:44 ../
-rw-r--r-- 1 myaccount dstage 0 Nov 3 2005 .Type30
-rw-r--r-- 1 myaccount dstage 12288 Oct 6 05:30 DATA.30
-rw-r--r-- 1 myaccount dstage 4096 Oct 6 05:29 OVER.30

I tried to "chmod g+rw *Hash* but the DATA.30, .Type30 AND OVER.30 are unaffected.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

try "chmod -R 775 NewDDWHash" for a recursive change.
Triton46
Charter Member
Posts: 83
Joined: Fri Feb 07, 2003 8:30 am

Post by Triton46 »

That worked. Thanks!