I am loading a 14-20 million record sequential file into Oracle Enterprise using the parallel Sequential File stage, and at times the load hits a single record it cannot read and fails (stops loading the Oracle table) instead of discarding the record and continuing. I have tried every Reject Mode option here: Continue, Output, and Fail. Fail is the only one that seems to work. I never want the load to stop; I want it to load as much as it can and discard what it can't load, or send it to an output link. Is there any way to get this to work?
Also, why is it that when I restart the job from the beginning it completes without an issue? It only halts the first time through. Is there a setting I have incorrect?
Never mind, I have since found that it was a file attribute of the input file. Rewriting the file before Ascential DataStage processes it appears to have corrected the issue.
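For anyone hitting the same thing, a minimal sketch of a pre-processing rewrite like the one described above. This is an assumption about the file layout, not the poster's actual fix: it assumes a delimited file with a fixed field count, strips NUL bytes and carriage returns (common culprits when a stage can't parse a record), and diverts malformed lines to a reject file so the load never has to stop. The names `rewrite`, `EXPECTED_FIELDS`, and the delimiter are all illustrative; adjust them to your record layout.

```python
# Hypothetical pre-processing step: rewrite the input file before the
# DataStage job reads it, dropping records that would break the parse.
# Assumes a pipe-delimited layout with a fixed number of fields and
# Unix newlines -- adjust both to match your actual data.

EXPECTED_FIELDS = 12  # assumption: set to your real field count


def rewrite(src, dest, rejects, delimiter="|", expected_fields=EXPECTED_FIELDS):
    """Copy src to dest, cleaning each record; bad records go to rejects.

    Returns a (kept, dropped) tuple so the caller can log the counts.
    """
    kept = dropped = 0
    sep = delimiter.encode()
    with open(src, "rb") as fin, \
         open(dest, "wb") as fout, \
         open(rejects, "wb") as frej:
        for line in fin:
            # Strip NUL bytes and carriage returns that can make a
            # record unreadable to the Sequential File stage.
            cleaned = line.replace(b"\x00", b"").replace(b"\r", b"")
            # A valid record has exactly expected_fields - 1 delimiters.
            if cleaned.count(sep) == expected_fields - 1:
                fout.write(cleaned)
                kept += 1
            else:
                frej.write(line)  # keep the original bytes for inspection
                dropped += 1
    return kept, dropped
```

Running this ahead of the job (e.g. in a before-job subroutine or a wrapper script) gives you the "load everything it can, discard the rest" behavior regardless of what the stage's Reject Mode does, and the reject file tells you exactly which records were bad.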