....and hence my suggestion of reading it once, transforming it, and loading it into flat files whose metadata is identical to that of their respective tables. One hit on the source table, six flat files for six tables; it's even great for restartability.
I think what oacvb was trying to say is
"Exactly. As DSGuru mentioned, shells within a job are called for some other purpose, not to control jobs. Otherwise it will be difficult to control jobs."
And at the unix level you can detect job failure and retrieve the FATAL error messages using dsjob -logsum with the type set to FATAL, I believe. Store that in a file and then send an email using mailx.
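A minimal sketch of that idea. The project name, job name, and recipient address are placeholders, and the exact dsjob flags should be checked against your DataStage release:

```shell
#!/bin/sh
# Hedged sketch: capture FATAL entries from a job's log and mail them.
# PROJECT, JOB, and the email address below are illustrative placeholders.
PROJECT=MyProject
JOB=LoadCustomers
LOG=/tmp/${JOB}_fatal.log

# -logsum with -type FATAL should print only the fatal log entries.
dsjob -logsum -type FATAL "$PROJECT" "$JOB" > "$LOG"

# Mail the file only if any fatal entries were actually found.
if [ -s "$LOG" ]; then
    mailx -s "DataStage job $JOB aborted" support@example.com < "$LOG"
fi
```

You could schedule this after the job run, or call it from the same script that invokes dsjob -run, so a failure always produces a notification.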
If it's the case Craig mentioned: when you get the compilation error, click 'Show Stage' in the window. The link will become highlighted and therefore visible.
If the top 10% is to be calculated from a bunch of values (column values) in a single row, then you can enhance the functionality that's been provided in your median query.
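As a rough illustration of the idea (not the original median query, which isn't shown here), the top 10% of a set of values is everything at or above the 90th percentile; at the unix level you can get that cutoff with sort and awk. The values below are made up:

```shell
#!/bin/sh
# Hedged sketch: find the top-10% values from one row's column values.
# The ten sample values are illustrative only.
VALUES="12 7 45 3 19 28 51 9 33 22"

# Nearest-rank 90th percentile: sort ascending, take the value at
# ceil(0.9 * n), then keep everything at or above it.
CUTOFF=$(printf '%s\n' $VALUES | sort -n | awk '
    { v[NR] = $1 }
    END {
        idx = int(0.9 * NR)
        if (idx < 0.9 * NR) idx++   # ceiling
        print v[idx]
    }')

printf '%s\n' $VALUES | awk -v c="$CUTOFF" '$1 >= c'
```

The same nearest-rank logic can be folded back into a SQL query by ranking the values and filtering on rank >= 0.9 * count.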
Performance is a relative term. A job that runs for half an hour in a 9-hour time window might be considered efficient, whereas a job that runs for 5 mins in a 10-min time window might not. A build op for everything, for as little work as trimming, is, IMHO, too much extra work. If that were the case I...