Error when using CLOB data type in Job

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

subhashini
Participant
Posts: 8
Joined: Thu Nov 23, 2006 8:43 am

Error when using CLOB data type in Job

Post by subhashini »

I am facing a problem when using the CLOB data type for one column in a parallel job. The column's data type is CLOB in both the source and target tables.

When I import the metadata for this column, DataStage converts it to LongVarChar. When I run the job, I get the following error:



main_program: Internal Error: (colDesc):orautils.C: 1714: getColDesc failed.
Traceback: [6] APT_FatalPath::msgAssertion(0xfdcbacff, 0xffbfbe38, 0xfdcbad07, 0x6b2, 0x0, 0xccc70), at 0xff0dad18
[7] APT_OraUtils::querySchema(0x0, 0xfdcdd2e0, 0xffbfbdc8, 0xffbfbf18, 0xffbfbf24, 0xfdcd5498), at 0xfdc23dc0
[8] APT_OraReadOperator::describeOperator(0xaec58, 0x0, 0x167cb0, 0xffbfc470, 0xfdcdcce0, 0xfdc05ae4), at 0xfdbfe788
[9] APT_OperatorRep::wrapDescribeOperator(0x140458, 0x0, 0x1, 0x7088f4, 0xffbfc6bc, 0xb4), at 0xfe70b354
[10] APT_OperatorRep::check1a(0x140458, 0x1800, 0xfed6db08, 0xff12d168, 0xfee138b8, 0xfee4e50c), at 0xfe70b8bc
[11] APT_StepRep::sequenceAndCheck1Operators(0x127ea0, 0x0, 0x0, 0xfed79076, 0xfee138b8, 0x0), at 0xfe788510
[12] APT_StepRep::check(0x127ea0, 0x68c264, 0x0, 0x0, 0xfee138b8, 0x127eac), at 0xfe787858
[13] APT_Step::check(0xcbdb0, 0xffbfce80, 0xffbfcdc4, 0xff134488, 0x0, 0xfee138b8), at 0xfe77b964
[14] APT_OSL::createAndCheckStep(0x1305f0, 0xffbfe218, 0x5d530, 0xffbfce48, 0xffbfce80, 0x3c78c), at 0xff220d90
[15] APT_ORCHESTRATE_main(0x1, 0x2b8f2, 0xffbfe350, 0x4, 0xffbfe4b8, 0xffbfe4d4), at 0x1fcd4


Please help me resolve this problem.

Thanks in advance.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

As per the documentation, the supported LOBs are LOB and BLOB.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

And CLOB will be mapped to SQL_LONGVARCHAR. The other restriction noted is a maximum of 2 GB, and the length needs to be set to 4 KB in the Columns tab.
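For illustration, a minimal sketch of the kind of Oracle column this mapping applies to (the table and column names are assumptions, not from this thread):

```sql
-- Hypothetical source table. The CLOB column is what DataStage
-- imports as LongVarChar; in the job's Columns tab its length
-- would then be set per the restriction above (max 2 GB overall).
CREATE TABLE src_docs (
  doc_id   NUMBER PRIMARY KEY,
  doc_text CLOB
);
```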
subhashini
Participant
Posts: 8
Joined: Thu Nov 23, 2006 8:43 am

Post by subhashini »

When I imported the metadata, the CLOB was converted to LongVarChar with a size of 4000 bytes. But the source table doesn't have any data; I developed the job and tried to run it without any data.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Any idea what the maximum data size is that you'll be pulling from the CLOB for this job? Is your data source 10g, by any chance?
-craig

"You can never have too many knives" -- Logan Nine Fingers
subhashini
Participant
Posts: 8
Joined: Thu Nov 23, 2006 8:43 am

Post by subhashini »

We don't have source data yet, but the source system is not 10g; it is 9i.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

So you are getting this error even when no data flows through the sequence.
How did you import the metadata?
Internal Error: (colDesc):orautils.C: 1714: getColDesc failed.
Try the orchbuild utility to import the metadata and check if that helps.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

subhashini wrote: We don't have source data yet, but the source system is not 10g; it is 9i.
Too bad, there's a loverly little package in 10g - the DBMS_LOB package - that makes working with those field types much easier. Which is why I asked.
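For reference, a minimal sketch of using DBMS_LOB to pull a CLOB back in a SELECT (the table and column names here are hypothetical):

```sql
-- Hypothetical table DOCS with a CLOB column DOC_TEXT.
-- DBMS_LOB.SUBSTR returns up to 4000 bytes as VARCHAR2,
-- which fits a 4000-byte LongVarChar column in DataStage.
SELECT doc_id,
       DBMS_LOB.SUBSTR(doc_text, 4000, 1) AS doc_text_chunk,
       DBMS_LOB.GETLENGTH(doc_text)       AS doc_len
FROM   docs;
```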
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

The option Craig mentioned will be helpful for converting the data types within Oracle and loading into Oracle. Not sure it will be much help in your case if you cannot access the data from Oracle in DataStage, but maybe it can be used to convert it to another format that DataStage can access easily.
Have you tried running with data in that field? Does that field involve any transformation?