ORABULK and Varchar fields?
Does anyone know if the problem with the ORABULK stage generating bad control files for VARCHAR2 fields still exists in 7.1? It is an issue in 7.0.1 and just majorly bit me in the behind.
All VARCHAR2 fields are generated as CHAR in the control file, so if the data exceeds 256 characters the load fails. Arrrrrrrrrrrrgh!
-craig
"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Although it's not documented directly, the ctl file that ORABULK generates allocates a 256-byte buffer for each column, because it doesn't specify a length for the CHAR fields. It's been that way since version 1.0 (indeed, the ORABULK stage is still at version 1.0 if you check its stage properties).
The only solution is to modify the ctl file that the ORABULK stage produces, or to have the ctl file pre-written (perhaps starting from one an ORABULK stage generated) and use that to feed sqlldr.
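For anyone hitting this, the manual fix amounts to giving each affected column an explicit length in the generated ctl file. A hypothetical before/after sketch (the column names and the 4000-byte width are made up for illustration, not taken from this thread):

```
-- As generated by ORABULK (a bare CHAR gets the small default buffer):
LOAD DATA
INFILE 'emp.dat'
INTO TABLE emp
FIELDS TERMINATED BY ','
( emp_id    INTEGER EXTERNAL,
  emp_notes CHAR              -- fails once the data exceeds the default
)

-- Hand-edited: give the VARCHAR2 column an explicit byte length:
( emp_id    INTEGER EXTERNAL,
  emp_notes CHAR(4000)        -- sized to the column's VARCHAR2 limit
)
```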
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Well... our current job design does have the ORABULK stage building only the control file and the data file is spit out by a Sequential File stage. Almost 500 jobs pull data from Oracle 'CDC' views and then in a single subsequent step (korn scripted) they are loaded in parallel using sqlldr.
I suppose we could write some sort of 'intervention' script that 'fixes' the affected control files before the bulk load step, but I'd much rather get the darn stage fixed so it doesn't generate bogus control files.
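That intervention script could be as small as a sed pass over the generated files. A rough sketch, purely illustrative (the `fix_ctl` name, the 4000-byte width, and the assumption that every bare CHAR should be widened are all mine, not from this thread):

```shell
# fix_ctl: append an explicit length to every bare CHAR datatype in a
# SQL*Loader control file, so VARCHAR2 data longer than the default
# buffer still loads. The 4000-byte width is an assumed upper bound.
fix_ctl() {
  # ' CHAR' not followed by '(' gets rewritten to ' CHAR(4000)';
  # columns that already carry an explicit length are left alone.
  sed -E 's/ CHAR([^(]|$)/ CHAR(4000)\1/g' "$@"
}

# Example (hypothetical paths): fix every ctl file before the load step
# for ctl in /path/to/loads/*.ctl; do fix_ctl "$ctl" > "$ctl.fixed"; done
```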
I've got a case open with Ascential, we'll see what they say about it.
Hopefully, once our source configuration settles down (long story), we can use a static set of control files that don't have this issue. But for now...
-craig
"You can never have too many knives" -- Logan Nine Fingers
