Search found 39 matches

by kausmone
Wed Apr 16, 2008 6:01 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: difference between two timestamp fields
Replies: 1
Views: 1710

difference between two timestamp fields

Hello again, I have to compare a timestamp value (let's call it T1) available in a hashed file with the current timestamp (T2). If (T2 - T1) > 4 hours, then write the record to the output file. Is there any function available that will convert timestamps to internal storage format? If I am to use Iconv, I wi...
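
A minimal sketch of the Iconv approach, assuming T1 arrives as a "YYYY-MM-DD HH:MM:SS" string (the input format, threshold, and variable names here are illustrative, not from the thread):

Code:

* Sketch: flag records where current time minus T1 exceeds 4 hours.
* Assumes T1 looks like "2008-04-16 06:01:00"; adjust the D conversion to suit.
DatePart = Field(T1, " ", 1)
TimePart = Field(T1, " ", 2)
T1Seconds = Iconv(DatePart, "D-YMD[4,2,2]") * 86400 + Iconv(TimePart, "MTS")
T2Seconds = Date() * 86400 + Int(Time())   ;* current timestamp in seconds
Ans = (T2Seconds - T1Seconds > 14400)      ;* 14400 seconds = 4 hours

The same test can sit in an output-link constraint so that only qualifying rows reach the file.
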
by kausmone
Fri Apr 11, 2008 5:08 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error writing to pipe: Interrupted system call
Replies: 5
Views: 3981

The file's not too big: a few hundred records, about 500 KB. When I rerun, I actually rerun the entire process, so I am not sure whether it is an access issue. And like I said, it doesn't abort all the time, only once in a while...
by kausmone
Fri Apr 11, 2008 4:43 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error writing to pipe: Interrupted system call
Replies: 5
Views: 3981

Thanks, I will try that... but what exactly was the problem?
by kausmone
Fri Apr 11, 2008 1:21 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error writing to pipe: Interrupted system call
Replies: 5
Views: 3981

Another job in the same project doing a similar MLOAD aborted today for the same reason, and it executed successfully on a re-run. Has anyone encountered similar behavior before?
by kausmone
Thu Apr 10, 2008 9:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: insert to sqlserver
Replies: 5
Views: 2231

Ratna, can you give more details, please? Is this happening for all columns or only for one particular column? I am assuming that you have already checked and confirmed that the datatypes of the source (Oracle) and the target (SQL Server) are the same for the column(s) that are showing this behavio...
by kausmone
Thu Apr 10, 2008 12:53 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error writing to pipe: Interrupted system call
Replies: 5
Views: 3981

Error writing to pipe: Interrupted system call

Hello, I have a server job performing a Teradata MultiLoad using the TDMLoad stage. Source data is in a flat file. The job has been running fine for months, scheduled at a frequency of every 10 minutes. Recently, however, every once in a while it aborts with the message below: Could not write row to data...
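
Since a plain re-run succeeds, one common workaround is an automatic retry from job control; a sketch using the standard DataStage job-control API, where the job name "jMLoad" is hypothetical:

Code:

* Sketch: run a job and retry once on abort; "jMLoad" is a placeholder name.
For Attempt = 1 To 2
   hJob = DSAttachJob("jMLoad", DSJ.ERRFATAL)
   If DSGetJobInfo(hJob, DSJ.JOBSTATUS) = DSJ.RUNFAILED Then
      * Reset an aborted previous run so the job is runnable again
      ErrCode = DSRunJob(hJob, DSJ.RUNRESET)
      ErrCode = DSWaitForJob(hJob)
      ErrCode = DSDetachJob(hJob)
      hJob = DSAttachJob("jMLoad", DSJ.ERRFATAL)
   End
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)
   Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
   ErrCode = DSDetachJob(hJob)
   If Status = DSJ.RUNOK Or Status = DSJ.RUNWARN Then Exit
Next Attempt

Note this only masks the intermittent interrupted system call; it does not explain it.
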
by kausmone
Wed Apr 02, 2008 1:34 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

If the freed space is going to be reclaimed by the hashed file, it still serves the purpose, since my hashed file is going to get appended to every hour or so. I think I should stick to this approach of deleting records 'physically' instead of logical deletes, which would leave no reclaimable space?
by kausmone
Tue Apr 01, 2008 8:50 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

Cool, thanks. Will do
by kausmone
Tue Apr 01, 2008 8:06 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

Thanks for the hint :)

For the record, the query needed to be:

Code:

DELETE FROM "images_metadata_hashed" WHERE ("doc_key" = ? AND "event_key" = ?);
Can someone throw more light on whether or not I will be able to reclaim space this way?
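
For what it's worth, UniVerse's ANALYZE.FILE command reports a hashed file's type, modulus, and load, so one way to see whether deleted space is being reused is to run it before and after the deletes; a sketch, assuming the file sits in the project account and the command is issued from the Administrator's command window:

Code:

ANALYZE.FILE images_metadata_hashed
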
by kausmone
Tue Apr 01, 2008 7:13 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

I had to rule that out because the size of my hashed file would keep on increasing with logical deletes; my process might encounter about a million records a day. Now, when I try deleting, I am giving the query below (user-defined), but it is not working. What is the correct way of specifying the ...
by kausmone
Tue Apr 01, 2008 6:44 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

Ah, OK, gotcha. I tried creating the original hashed file using the account name, and when I accessed it using the UV stage (with a SELECT query), it worked without any problems.

Thanks for your help Ray and Craig!

-kaus
by kausmone
Tue Apr 01, 2008 6:24 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

Thanks Ray. I tried with localuv after confirming that it is defined in the uvodbc.config file. I gave the hashed-file name in the Table Name field of the Outputs -> General tab. Now I'm getting the error "table does not exist": JB_SVTT_Delete_HashedFile_UV..UniVerse_1.DSLink2: DSD.BCIOpenR call...
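
For reference, the stock uvodbc.config entry for the local UniVerse data source usually looks like this (values can vary by install, so treat it as a sketch):

Code:

<localuv>
DBMSTYPE = UNIVERSE
network = TCP/IP
service = uvserver
host = 127.0.0.1

With localuv as the DSN, the Table Name has to match a VOC entry, which a hashed file only gets when it is created in the account rather than by directory path.
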
by kausmone
Tue Apr 01, 2008 6:06 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

Did you mean the UniVerse stage? I tried using the stage but am not sure what is to be filled in for the Data Source Name, etc. I tried filling in the name of the hashed file (images_metadata_hashed) but it didn't work. I did a View Data after putting in a SELECT SQL statement and got the following error: JB_SVTT_Del...
by kausmone
Tue Apr 01, 2008 3:22 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Delete specific records from hashfile / hashed-file
Replies: 18
Views: 10711

Delete specific records from hashfile / hashed-file

Hi there,

I have a flat file A and a hashed file B. Records in A are a subset of records in B. I need to delete those records in hashed file B that are present in flat file A. Can anyone help me with how to go about this in a server job/routine? (See the sketch below.)

Thanks,
Kaus
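
A direct server-routine alternative to the UniVerse-stage DELETE the thread eventually settles on: open both files and delete key by key. A sketch only; it assumes B has a VOC entry (account-based) and that A holds one record key per line, with the path and names purely illustrative:

Code:

* Sketch: remove from hashed file B every key listed in flat file A.
OpenSeq "/data/flat_file_A.txt" To SeqFile Else Stop "Cannot open flat file"
Open "images_metadata_hashed" To HashFile Else Stop "Cannot open hashed file"
Loop
   ReadSeq KeyLine From SeqFile Else Exit
   Delete HashFile, KeyLine   ;* physically removes the record
Repeat
CloseSeq SeqFile
Close HashFile

A multi-part key (doc_key and event_key in this thread) would need to be assembled to match however the hashed file was keyed.
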
by kausmone
Wed Mar 05, 2008 4:43 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: job status 99
Replies: 3
Views: 2089

found the reason... something wrong in my script... sorry for the bother...

thanks,
kaus