Server Jobs - record processing
Posted: Thu Sep 15, 2005 7:25 am
I have 10 records, each with 2 columns, in my input sequential file, like:
A1 B1
A2 B2
A3 B3
A4 B1
A5 B5
I want to generate a surrogate key for each distinct value in both columns (which should result in 9 surrogate keys) using a hash file lookup. My understanding is that with a dynamic hash file, each record is cached in memory as it is processed; however, that is not what is happening. In the same input file, when some records have duplicate values in column 2, the lookup fails, and I was able to confirm this with a flag.
I would like to know whether, even in Server Jobs, records are processed not row by row but in batches.
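For reference, the row-by-row behaviour I am expecting can be sketched outside DataStage as follows (a hypothetical Python illustration of the intended logic, not DataStage code): each distinct value per column gets one key, and a duplicate value should hit the cache from an earlier row.

```python
# Sketch of the intended row-by-row surrogate-key logic (illustration only):
# a dict plays the role of the hash file lookup/insert.

def assign_surrogate_keys(rows):
    cache = {}        # (column_index, value) -> surrogate key
    next_key = 0
    keyed_rows = []
    for row in rows:
        keyed = []
        for col, value in enumerate(row):
            if (col, value) not in cache:   # lookup miss: generate a new key
                next_key += 1
                cache[(col, value)] = next_key
            keyed.append(cache[(col, value)])
        keyed_rows.append(keyed)
    return keyed_rows, next_key

rows = [("A1", "B1"), ("A2", "B2"), ("A3", "B3"), ("A4", "B1"), ("A5", "B5")]
keyed, total = assign_surrogate_keys(rows)
# total is 9: 5 distinct values in column 1 plus 4 in column 2,
# and the duplicate B1 in the fourth row reuses the key from the first row
```

In the actual job, the duplicate B1 is where the lookup fails instead of reusing the earlier key.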
Thanks,
Manikandan