DSXchange: DataStage and IBM Websphere Data Integration Forum
This topic has been marked "Resolved."
atulgoel
Participant



Joined: 03 Feb 2009
Posts: 73
Location: Bangalore, India
Points: 820

Posted: Wed Jun 14, 2017 4:29 am

DataStage® Release: 11x
Job Type: Parallel
OS: Unix
Hi,

I am facing the error below while reading data from the source using the ODBC Connector. All columns in the source table are NOT NULL, and I am reading the table using RCP (runtime column propagation). During execution the job fails with:

Fatal Error: Attempt to setIsNull() on the accessor interfacing to non-nullable field

Can anyone suggest how to resolve this issue?

_________________
Atul
chulett

Premium Poster


since January 2006

Group memberships:
Premium Members, Inner Circle, Server to Parallel Transition Group

Joined: 12 Nov 2002
Posts: 42163
Location: Denver, CO
Points: 216469

Posted: Wed Jun 14, 2017 7:24 am

Did you try anything from the other thread you posted in? Specifically, printing out the schemas via $OSH_PRINT_SCHEMAS.

_________________
-craig

Walked down by the bathing pond to try and catch some sun
Saw at least a hundred schoolgirls sobbing into handkerchiefs as one
Mike



Group memberships:
Premium Members

Joined: 03 Mar 2002
Posts: 1010
Location: Omaha, NE
Points: 6513

Posted: Wed Jun 14, 2017 7:47 am

Unfortunately, the report of schemas generally gets lost when the job aborts.

1) Verify that you have no column metadata defined in the ODBC Connector.
2) Verify that the SQL SELECT statement doesn't contain any function that might return NULL.
3) Replace everything after the ODBC Connector with a Copy stage having no outputs.

Do you still get the fatal error?

Mike
chulett


Posted: Wed Jun 14, 2017 10:30 am

That is... unfortunate. :(

If you still have the issue after going through all of that, we'd probably need more details on exactly what you have in the job.

_________________
-craig

atulgoel

Posted: Wed Jun 14, 2017 11:53 am

Yes, I tried adding the environment variable $OSH_PRINT_SCHEMAS, but I am still getting the error. It has not resolved my issue.

_________________
Atul
atulgoel

Posted: Wed Jun 14, 2017 11:57 am

Hi Mike,

Below is my job design:

ODBC Stage ---> Copy Stage ---> Peek Stage

1) There is no column metadata defined in the ODBC Connector.
2) My SELECT statement uses the CAST function: I am converting all columns to VARCHAR. The column for which I am getting this error is of Numeric type.
3) I have only a Peek stage after the Copy stage in my job design.

The issue is still there. Kindly let me know if you need any more details in order to analyze the issue.

_________________
Atul
atulgoel

Posted: Wed Jun 14, 2017 11:58 am

Hi Craig,

I have described my job design in my previous comment. Please let me know what further details you need to analyze the issue.

_________________
Atul
chulett


Posted: Wed Jun 14, 2017 12:43 pm

Thanks. All I have right now is to clarify that adding that "print schemas" APT variable was not meant to fix anything; it was to log additional information so that you could figure out what might be going on.

If you added it and its information managed to survive the job's abort, can you post its output here?

_________________
-craig

chulett


Posted: Wed Jun 14, 2017 12:48 pm

As Arnd mentioned in the linked thread, the use of the CAST function in your source SQL creates (as far as DataStage is concerned) a new column that is nullable. That doesn't mean you have null values in the data, just that the job thinks there could be, which, from what I recall, triggers the error when it checks for them. Mike, have you seen that behavior before?

Your print schemas output would show if that is what is happening, if it is in the log.
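The behavior described above can be sketched in SQL (the table name here is hypothetical, and whether a given driver reports a derived column as nullable can vary):

```sql
-- BIAMT is declared NOT NULL in the source table, but wrapping it in
-- CAST produces a derived expression column.  Drivers commonly report
-- derived columns as nullable, so with RCP the job imports BIAMT as a
-- nullable field even though the underlying data contains no NULLs.
SELECT CAST(BIAMT AS VARCHAR(10)) AS BIAMT   -- derived, reported nullable
FROM MYLIB.MYTABLE;                          -- hypothetical table name
```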

_________________
-craig

atulgoel

Posted: Thu Jun 15, 2017 12:22 am

Here is the output from $OSH_PRINT_SCHEMAS:

Code:
main_program: Schemas:
Data set "ODBC_Connector_0:DSLink2.v":
record
( BICOD: string[max=10];
  BIBKC: string[max=10];
  BIBTN: string[max=10];
  BIANO: string[max=10];
  BICUR: string[max=10];
  BITRN: string[max=10];
  BIAMT: string[max=10];
  BICHQ: string[max=10];
  BIPDT: string[max=10];
  BIBTN2: string[max=10];
  BIVDAT: string[max=10];
  BILOC: string[max=10];
  BIPBI: string[max=10];
  BIPHR: string[max=10];
  BINRRN: string[max=10];
  BIPRRN: string[max=10];
  BITRRN: string[max=10];
  BIPRDT: string[max=10];
  BIPUDI: string[max=10];
  BIPHRP: string[max=10];
  BIPSTM: string[max=10];
  BICLNO: string[max=10];
  BISCBL: string[max=10];
  BIDEL: string[max=10];
  BIINTE: string[max=10];
  BIUMD: string[max=10];
  BIPNST: string[max=10];
  BISDI: string[max=10];
  BICMS: string[max=10];
  FILLER: string[max=10];
  BINFLG: string[max=10];
  BINTC_NBR: string[max=10];
  BIFPST: string[max=10];
  BIDRCR: string[max=10];
  BICHCD: string[max=10];
  BIITCD: string[max=10];
  BISBCD: string[max=10];
)
Data set "Copy_60:DSLink54.v":
record
( BICOD: string[max=10];
  BIBKC: string[max=10];
  BIBTN: string[max=10];
  BIANO: string[max=10];
  BICUR: string[max=10];
  BITRN: string[max=10];
  BIAMT: string[max=10];
  BICHQ: string[max=10];
  BIPDT: string[max=10];
  BIBTN2: string[max=10];
  BIVDAT: string[max=10];
  BILOC: string[max=10];
  BIPBI: string[max=10];
  BIPHR: string[max=10];
  BINRRN: string[max=10];
  BIPRRN: string[max=10];
  BITRRN: string[max=10];
  BIPRDT: string[max=10];
  BIPUDI: string[max=10];
  BIPHRP: string[max=10];
  BIPSTM: string[max=10];
  BICLNO: string[max=10];
  BISCBL: string[max=10];
  BIDEL: string[max=10];
  BIINTE: string[max=10];
  BIUMD: string[max=10];
  BIPNST: string[max=10];
  BISDI: string[max=10];
  BICMS: string[max=10];
  FILLER: string[max=10];
  BINFLG: string[max=10];
  BINTC_NBR: string[max=10];
  BIFPST: string[max=10];
  BIDRCR: string[max=10];
  BICHCD: string[max=10];
  BIITCD: string[max=10];
  BISBCD: string[max=10];
)
Operator "ODBC_Connector_0":
output 0 interface:
record
( BICOD: string[max=10];
  BIBKC: string[max=10];
  BIBTN: string[max=10];
  BIANO: string[max=10];
  BICUR: string[max=10];
  BITRN: string[max=10];
  BIAMT: string[max=10];
  BICHQ: string[max=10];
  BIPDT: string[max=10];
  BIBTN2: string[max=10];
  BIVDAT: string[max=10];
  BILOC: string[max=10];
  BIPBI: string[max=10];
  BIPHR: string[max=10];
  BINRRN: string[max=10];
  BIPRRN: string[max=10];
  BITRRN: string[max=10];
  BIPRDT: string[max=10];
  BIPUDI: string[max=10];
  BIPHRP: string[max=10];
  BIPSTM: string[max=10];
  BICLNO: string[max=10];
  BISCBL: string[max=10];
  BIDEL: string[max=10];
  BIINTE: string[max=10];
  BIUMD: string[max=10];
  BIPNST: string[max=10];
  BISDI: string[max=10];
  BICMS: string[max=10];
  FILLER: string[max=10];
  BINFLG: string[max=10];
  BINTC_NBR: string[max=10];
  BIFPST: string[max=10];
  BIDRCR: string[max=10];
  BICHCD: string[max=10];
  BIITCD: string[max=10];
  BISBCD: string[max=10];
)
Operator "Copy_60":
input 0 interface:
record
( inRec: *;
)
output 0 interface:
record
( outRec: *;
)
Operator "Peek_62":
input 0 interface:
record
( BICOD: string[max=10];
  BIBKC: string[max=10];
  BIBTN: string[max=10];
  BIANO: string[max=10];
  BICUR: string[max=10];
  BITRN: string[max=10];
  BIAMT: string[max=10];
  BICHQ: string[max=10];
  BIPDT: string[max=10];
  BIBTN2: string[max=10];
  BIVDAT: string[max=10];
  BILOC: string[max=10];
  BIPBI: string[max=10];
  BIPHR: string[max=10];
  BINRRN: string[max=10];
  BIPRRN: string[max=10];
  BITRRN: string[max=10];
  BIPRDT: string[max=10];
  BIPUDI: string[max=10];
  BIPHRP: string[max=10];
  BIPSTM: string[max=10];
  BICLNO: string[max=10];
  BISCBL: string[max=10];
  BIDEL: string[max=10];
  BIINTE: string[max=10];
  BIUMD: string[max=10];
  BIPNST: string[max=10];
  BISDI: string[max=10];
  BICMS: string[max=10];
  FILLER: string[max=10];
  BINFLG: string[max=10];
  BINTC_NBR: string[max=10];
  BIFPST: string[max=10];
  BIDRCR: string[max=10];
  BICHCD: string[max=10];
  BIITCD: string[max=10];
  BISBCD: string[max=10];
  inRec: *;
)
.

_________________
Atul
atulgoel

Posted: Thu Jun 15, 2017 12:24 am

And I am getting this error on the BIAMT column:

ODBC_Connector_0,0: Fatal Error: Attempt to setIsNull() on the accessor interfacing to non-nullable field "BIAMT".

I have also tried running the job with the column defined manually in the column definition and its nullability set to Yes on the ODBC Connector stage, and the job ran fine.

But the question is: is there any option to force nullability to Yes (true), as we have in the ODBC Enterprise stage?

I need to have RCP in my job, and I am using the ODBC Connector stage.

_________________
Atul
atulgoel

Posted: Thu Jun 15, 2017 5:00 am

Hi. I have resolved the issue. While using the CAST function in the source SQL, we were defining the VARCHAR length as less than the actual column length in AS/400 (the source DB).

The source length was 13, and I was reading it as CAST(BIAMT AS VARCHAR(10)) AS BIAMT.

I increased the length to 142 and it is running fine now.

Marking this post as resolved.
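For reference, a minimal before/after sketch of the fix described above (the table name is hypothetical; verify the actual column length against the AS/400 table definition):

```sql
-- Before: the CAST target is shorter than the 13-character source
-- column, so the cast can fail and yield NULL, which DataStage then
-- tries to set on a non-nullable RCP field (the setIsNull() abort).
SELECT CAST(BIAMT AS VARCHAR(10)) AS BIAMT
FROM MYLIB.MYTABLE;   -- hypothetical table name

-- After: the target length covers the source column's full length.
SELECT CAST(BIAMT AS VARCHAR(13)) AS BIAMT
FROM MYLIB.MYTABLE;
```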

_________________
Atul
chulett


Posted: Thu Jun 15, 2017 6:42 am

See, I told you it would help you figure it out. ;)

_________________
-craig

Mike




Posted: Thu Jun 15, 2017 9:41 am

Nice job of narrowing down the cause.

It's good to know that a failed CAST function in the ODBC Connector results in NULL.

You mentioned AS/400, so is your underlying database DB2?

Mike