Wow, this topic is still alive. And after all this time you tell us it's Unix. Well, it happens. If it's Unix then you don't even need a routine. Just append the date to the file by running the after-job subroutine ExecSH.
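Something along these lines in the ExecSH command would do it (the file name here is just an example; point it at whatever file your job produces):

```shell
#!/bin/sh
# Hypothetical target file -- substitute the file your job writes.
LOGFILE=/tmp/load_audit.log

# Append the current date to the file, which is all the
# ExecSH after-job call needs to run.
date '+%Y-%m-%d %H:%M:%S' >> "$LOGFILE"
```

No server routine required; ExecSH just hands the command string to the shell after the job finishes.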
How many lines does the header occupy?
You can do a head -2, store the header in a file, and at the end, right after the creation of the sub-files, concatenate them.
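A rough sketch of that, assuming a two-line header and sub-files named part_*.txt (all the file names here are examples, not from your job):

```shell
#!/bin/sh
# Save the two-line header before the split.
head -2 source.txt > header.txt

# ... your job splits the body into part_1.txt, part_2.txt, ...

# After the sub-files are created, put the header back on each one.
for f in part_*.txt; do
    [ -e "$f" ] || continue          # skip if no sub-files matched
    cat header.txt "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```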
This is a simple lookup keyed on a single key, with an extra condition that can be accommodated in the constraint. Your constraint will look something like NOT(LookupLink.NOTFOUND) AND ICONV(dt_record,"D/MDY[2,2,4]") >= ICONV(dt_start,"D/MDY[2,2,4]") AND ICONV...
Is this a PX or Server question? The forum type and job type contradict each other. Your data seems to be in binary format. Use one of the built-in functions to convert it into ASCII. Search the forum for a few such functions. Also confirm the job type, as your answer will vary depending upon the jo...
A couple of points of etiquette need to be kept in mind. First of all, you have posted in the wrong forum. We have absolutely no idea what job type it is, what version of DataStage you are using, and what your OS type is. The answer to your query will vary depending upon the above. Secondly, please do not use...
What format is that file in? If it's a delimited file then surely you can read it using the Sequential File stage. You can also use the power of Unix to put the file into a standard readable format so that DataStage can pick it up.
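One common Unix cleanup, assuming the problem is Windows-style line endings (that is only a guess at your situation; the file names are examples):

```shell
#!/bin/sh
# Strip carriage returns so the file has plain Unix line endings,
# which the Sequential File stage reads without complaint.
tr -d '\r' < input.txt > input_unix.txt
```

If the issue is something else (fixed-width, EBCDIC, odd delimiters), the command will differ, but the same idea applies: reshape the file with Unix tools before DataStage reads it.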
If it's a controller problem then it's something else. Try making a copy of that sequence job and deleting the original one. Run the copy. Follow the log in the Director and see what exactly happens.
BTW, it's not a sequencer, it's a sequence job. A Sequencer is merely a stage inside a sequence job.
If it's coming in as a binary format then I would guess that an ASCII conversion is needed before you do any other sort of manipulation. That's why we need to see your OCONV code, as Craig requested, to make sure you are doing the binary-to-ASCII conversion.
Sort is needed for the uniq, and uniq is needed for the file names. And yes, the entire record pertaining to that particular employee will then be loaded to that file, which is taken care of by the grep.
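To make the chain concrete, here is a minimal sketch, assuming a comma-delimited file emp.txt whose first field is the employee id (the file name and layout are assumptions, not from your post):

```shell
#!/bin/sh
# sort feeds uniq; uniq yields the distinct employee ids,
# and those ids become the output file names.
cut -d',' -f1 emp.txt | sort | uniq | while read emp; do
    # grep pulls every record for that employee into its own file.
    grep "^${emp}," emp.txt > "${emp}.txt"
done
```

So each distinct id ends up with one file holding all of that employee's records.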
Need more info on what options you are using. Does your id have permission to write to a named pipe at the Unix level?
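A quick way to check that at the Unix level is to try creating a pipe yourself in the directory the job uses (the directory below is just a placeholder):

```shell
#!/bin/sh
# Substitute the directory your job actually writes to.
DIR=/tmp

# If mkfifo fails, your id lacks permission in that directory.
mkfifo "$DIR/ds_test_pipe" || echo "cannot create pipe in $DIR"
ls -l "$DIR/ds_test_pipe"    # should show a 'p' type entry
rm -f "$DIR/ds_test_pipe"
```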
Try the other option, other than the named pipe; I can't remember it off the top of my head at the moment. See if that works.