Use a Start Loop and an End Loop activity in a sequence job. Pass the file names as a delimited list to the Start Loop. Specify #StartLoopName.$Counter# as the derivation for the job parameter. The loop will iterate through the list and hand each file name to your job, so the job processes each file individually. If the...
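Outside DataStage, the loop mechanics amount to this: iterate over a delimited list and hand one value per pass to the job parameter. A minimal Python sketch (the file list and names are made up for illustration):

```python
# Sketch of what the sequence loop does: the Start Loop activity walks a
# delimited list and exposes the current item as StartLoopName.$Counter.
file_list = "file1.txt,file2.txt,file3.txt"  # hypothetical list given to the Start Loop

processed = []
for file_name in file_list.split(","):       # one loop iteration per file
    # In the sequence, the job activity receives #StartLoopName.$Counter#
    # as its job parameter; here we just record the value it would get.
    processed.append(file_name)

print(processed)
```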
You need to pass them as parameters to your FTPFile, e.g. FTPFile myServerName userMe passwdMe, and your bat file should accept these. Once you can get FTPFile running from the command line, you can edit your Cmd as cmd = 'ftp -s:c:\FTPFile ':myServerName userMe passwdMe where myServerName userMe pa...
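The command-string concatenation can be sketched in Python; the server, user, and password values below are hypothetical stand-ins for the job parameters:

```python
# Hypothetical parameter values that the job would supply at run time.
myServerName = "ftp.example.com"
userMe = "ftpuser"
passwdMe = "secret"

# Equivalent of the DataStage BASIC derivation
#   cmd = 'ftp -s:c:\FTPFile ':myServerName:' ':userMe:' ':passwdMe
# i.e. the literal command text followed by the parameter values.
cmd = r"ftp -s:c:\FTPFile " + myServerName + " " + userMe + " " + passwdMe
print(cmd)
```

The resulting string is what gets handed to the command stage; the batch/script file on the other end must be written to pick those three arguments up.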
Are you going to edit the DSParams file programmatically, or how else are you planning to change the environment variable? Also, as far as I know, environment variables are limited to string types.
You won't be deleting anything. The hashed file will contain only the keys that have duplicates in your source file. You then run your source file again against this hashed file: NOT(NOTFOUND) flags the duplicate records, and NOTFOUND flags the non-duplicate records. No data will be deleted or missed.
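The two-pass logic can be sketched in Python; the record data and key name are made up, and a plain set stands in for the hashed file:

```python
# Pass 1: collect keys that occur more than once (the "hashed file").
records = [{"id": 1}, {"id": 2}, {"id": 1}, {"id": 3}]

seen, dup_keys = set(), set()        # dup_keys plays the role of the hashed file
for rec in records:
    key = rec["id"]
    if key in seen:
        dup_keys.add(key)
    seen.add(key)

# Pass 2: look each source record up against the hashed file.
# NOT(NOTFOUND) -> key exists there -> duplicate; NOTFOUND -> non-duplicate.
duplicates     = [r for r in records if r["id"] in dup_keys]
non_duplicates = [r for r in records if r["id"] not in dup_keys]
print(duplicates, non_duplicates)
```

Note that every occurrence of a duplicated key lands in the duplicate stream, which is exactly why nothing is deleted or missed.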
What version of DataStage do you have? If 7.5x, then you should have the STP (Stored Procedure) stage. Use that. If you are expecting return parameters, then doing a simple "call stp" from the before/after SQL tab is out of the question. Calling a stored procedure from the ODBC stage has its limitations: it can only support IN IN/OUT para...
I could eliminate my frustration if I just created two separate jobs, and didn't try to reuse existing jobs, but I thought I'd give it a try. I think that may be your only way. I do not see how you can reuse the same job with different metadata for different targets. If the metadata was identic...
Do this: sort the incoming data on your key. Define two stage variables in the transformer, say condFlag and prevVal. These will detect duplicates and flag them. Both will be initialized to 0. Their derivations will be as follows: condFlag | if (prevVal <> src.key) then 'X' else...
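Since stage variables evaluate top to bottom on each row, condFlag sees the previous row's key in prevVal before prevVal is refreshed. A Python simulation of that evaluation order, on hypothetical sorted key values:

```python
# Simulate the transformer stage variables on data sorted by key.
# condFlag gets 'X' on the first row of each key group; subsequent rows
# of the same key (the duplicates) get ''.
rows = [10, 10, 20, 30, 30, 30]   # hypothetical key values, already sorted

prevVal = 0                        # stage variables initialized to 0
flags = []
for key in rows:
    condFlag = "X" if prevVal != key else ""   # evaluated first, like the top stage variable
    prevVal = key                              # refreshed after condFlag, like the lower one
    flags.append(condFlag)

print(flags)
```

A constraint on the output link can then keep or reject rows based on condFlag.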
Well, that depends on what the stored procedure is doing. Are you retrieving columns from it to load somewhere, or just executing a stored procedure that performs a few tasks which, from a DataStage standpoint, you're not concerned about?
A little off topic, but tell me this: what does this date column represent? Is it the current timestamp, or a different datetime value needed from the source?