Not really sure, but I just went through this exercise in frustration quite recently, needing to analyze some tables. For whatever reason, it would only work in an anonymous block for us, and the only way to accomplish that was via the Stored Procedure stage. I'd be glad to be proven wrong, but that's the conclusion we came ...
The only way you'll be able to get that to work is via the Stored Procedure stage, as it needs to be executed in an anonymous block. The 'normal' call syntax won't work.
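For reference, this is the general shape of the anonymous block the Stored Procedure stage would execute to analyze a table. This is a sketch assuming an Oracle target; the schema and table names are placeholders, not anything from the thread:

```
BEGIN
  -- Gather optimizer statistics for one table.
  -- 'MYSCHEMA' and 'MYTABLE' are illustrative placeholders.
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'MYSCHEMA', tabname => 'MYTABLE');
END;
```

The key point is that the whole BEGIN ... END; block goes in as a single statement, rather than a bare procedure call.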
It's not like COBOL. There's no "working storage" and variables do not need to be explicitly declared. Also, there is no size or type associated with variables. It's best to initialize them to a value so they start off life in a known state, though. Are you sure you need to go this route, however? Are y...
The job is used in a few other job sequences. But at the time I am running that job, no other sequence is being executed. OK, that's good to know. Now you need to follow what Ray posted: take the job number, use his query to get the name associated with that number, and clear that job's lo...
I wonder how you can make an Execute Command activity fail or pass depending on whether the source file is empty or full. By checking the result of the command. Perhaps it counts the number of records in the file, so you check to see whether the returned value is 0 or >0. Perhaps it simply tests to see whether the file exists and is empty...
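As a sketch of the second approach, here is a small shell helper an Execute Command activity could run; the function name and file paths are illustrative, not from the thread. `test -s` succeeds only when the file exists and is non-empty, so the activity's return value distinguishes the two cases and the output can be tested in the trigger expression:

```shell
# check_source: print FULL or EMPTY and exit 0 or 1 accordingly.
# Pass the source file path as the first argument.
check_source() {
    if [ -s "$1" ]; then
        # file exists and has at least one byte
        echo "FULL"
        return 0
    else
        # missing or zero-length file
        echo "EMPTY"
        return 1
    fi
}
```

The first (counting) approach would instead run something like `wc -l < file` and test the command output for 0 in the activity's trigger.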
As there is always a problem of scalability with hashed files, and the tables on which we are doing lookups are monster dimensions, we can't do without it. As a DataStage consultant, you need to have more faith in hashed files. Ken Bland has a great writeup on them in one of the recent Newsl...
True, but I don't recall an option to give you a Cartesian product between the two files. I would also think it would be... slower, depending on the file sizes involved.