Abort the job after a certain number of rejects

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

balu536
Premium Member
Posts: 103
Joined: Tue Dec 02, 2008 5:01 am

Abort the job after a certain number of rejects

Post by balu536 »

I have a source file A containing one million records. I am loading that file through the Sequential File stage SF_A.

In the Sequential File stage SF_A I set the reject mode to "Output" so that I can direct reject records to another sequential file SF_Rejects through a transformer TF_A. On the transformer output link I set the option to abort after 101 records.

Now the problem is that the job aborts after 100 rows, but the sequential file for the rejects is not created; no file appears in the specified path. Below is the logical flow of the job I created.

SF_A -> Copy Stage -> Peek Stage
  |
  | (reject flow)
  |
TF_A (Transformer) -> SF_Rejects (Sequential File Stage)
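
For reference, the relevant settings (option names as they appear in my stage editors) are:

    SF_A (Sequential File)      : Reject mode = Output
    TF_A output link constraint : Abort After Rows = 101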

How can I overcome this problem so that the job still aborts but the 100 reject records are written to the sequential file? That way the client saves the processing time on the remaining million records when there are 100 rejects in the file.


Any suggestions for my problem?
Sreedhar
Participant
Posts: 187
Joined: Mon Oct 30, 2006 12:16 am

Post by Sreedhar »

Hi,

That is not possible within the job itself. Try creating a preprocessor job that checks for the anomalies in the file before the load.
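
For example, a minimal before-job shell sketch along those lines (the comma delimiter, the field count of 5, the file path, and the threshold of 100 are all placeholders for illustration):

    #!/bin/sh
    # Hypothetical preprocessor: count records that do not have the expected
    # number of fields and stop before the DataStage load if too many are bad.
    BAD=`awk -F',' 'NF != 5 { n++ } END { print n+0 }' /path/to/source_file_A`
    if [ "$BAD" -ge 100 ]; then
        echo "Preprocessor: $BAD malformed records found, aborting load" >&2
        exit 1
    fi
    exit 0

Something like this could run as a before-job subroutine (ExecSH) so the main job never starts when the file is already known to be bad.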
Regards,
Shree
785-816-0728
Ravi.K
Participant
Posts: 209
Joined: Sat Nov 20, 2010 11:33 pm
Location: Bangalore

Post by Ravi.K »

Check whether the below option is helpful.

Change the property "Clean up on Failure" to FALSE on SF_Rejects (Sequential File stage).
Cheers
Ravi K
jwiles
Premium Member
Posts: 1274
Joined: Sun Nov 14, 2004 8:50 pm

Post by jwiles »

Unfortunately, due to the parallel nature of the processes, you're not guaranteed that the reject output seqfile has processed all 100 reject records before receiving the abort indication from the conductor/section leader.

You might have better luck using the Abort operator, but there is no promise. Use the Generic stage to call abort and pass options to it. Enter "osh -usage abort" at a command line to get a list of its options.
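
As a rough sketch of that setup on the reject flow (only the operator name comes from the suggestion above; take the actual option string from the usage output on your own install):

    Generic stage placed on the reject link:
      Operator = abort
      Options  = <as reported by osh -usage abort>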

Regards,
- james wiles


All generalizations are false, including this one - Mark Twain.
richdhan
Premium Member
Posts: 364
Joined: Thu Feb 12, 2004 12:24 am

Post by richdhan »

Hi,

Try using the $APT_EXPORT_FLUSH_COUNT environment variable in your job. It controls how often the export operator flushes buffered records to the target file, so the reject rows get written out instead of sitting in the buffer when the job aborts.
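
For example (the value of 1 is only an illustration; it flushes after every record, at the cost of extra I/O):

    # Added as a job-level environment variable
    # (Job Properties -> Parameters -> Add Environment Variable):
    $APT_EXPORT_FLUSH_COUNT = 1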

HTH
--Rich
IBM Certified DataStage Solution Developer | Teradata Certified Master