
Job from a Query

Posted: Thu Jan 11, 2018 12:42 pm
by nibumathewbabu
Hi all,
I have a requirement to convert the existing query below into a DS job.

Please find the sample query below:

INSERT INTO Table_Name (A, B, C, D, E, F, G, H)
VALUES ((SELECT MAX(id) + 1 FROM id_table), '', 5, 'ABBEY', '', 1, 1, 'EFG');


How can I insert the first value (the SELECT query from a table) using a DS job?
Kindly help.

Posted: Thu Jan 11, 2018 12:45 pm
by chulett
Your source would be the "SELECT MAX(id) + 1" from the table noted, and then you would use another stage (say, a transformer) to add the other hard-coded values to the row before sending it to the target.
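
As a sketch, using the table names and literal values from the query above, the job would look something like:

Source (database stage) SQL:
    SELECT MAX(id) + 1 AS A
    FROM   id_table

Transformer output derivations (the hard-coded values from the INSERT):
    B = ''
    C = 5
    D = 'ABBEY'
    E = ''
    F = 1
    G = 1
    H = 'EFG'

Target stage: insert the assembled row into Table_Name.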

Posted: Thu Jan 11, 2018 3:01 pm
by nibumathewbabu
Thanks, Craig, for the swift response, but there is a small change in the requirement.

If you look at the above query, the value for field A should come from the SELECT query; the values for B, C, D, E and F will come from a source file;
and columns G and H will again come from the transformer (hard-coded).

The whole idea is that there are some fields coming from an input file, and we need to add some more fields with static values (hard-coded in the transformer) plus a key value for the row (which is our SELECT MAX(id) query).

Kindly suggest.

Posted: Fri Jan 12, 2018 10:00 am
by UCDI
You will need to take care here due to parallel processing. If you query the table and get a max of 100, then try to insert 10 records on 4 nodes, you will insert 101 four times, 102 four times, and 103 twice if you don't handle the parallel logic properly.
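
For what it's worth, one common way to make that key derivation partition-safe in a parallel transformer is to combine the system variables @PARTITIONNUM, @NUMPARTITIONS and @INROWNUM. A sketch, where MaxIdParam is an assumed job parameter holding the current MAX:

    A = MaxIdParam + @PARTITIONNUM + ((@INROWNUM - 1) * @NUMPARTITIONS) + 1

Each partition then generates its own non-overlapping sequence starting above the old MAX, so no value is ever produced twice.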

Posted: Fri Jan 12, 2018 3:04 pm
by chulett
For multiple records that need to start with MAX+1 and (I assume) keep incrementing with each row, I'd suggest a different approach: look into sending that current MAX value into the job as a job parameter via a values file, storing it in the Initial Value of a stage variable, and incrementing it for each row.

And, of course, take care here due to parallel processing. Heck, maybe even use a Server job for this; nothing about it sounds like anything that needs to be processed in parallel. :wink:
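
A sketch of that stage variable setup, assuming a job parameter named MaxId supplied through the values file:

Stage variable: svKey
    Initial value: MaxId
    Derivation:    svKey + 1

Output column derivation:
    A = svKey

The derivation fires once per row, so the first row comes out as MAX+1 and every subsequent row increments from there, which only holds if the rows flow through a single node, per the above.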