Hi
I am using a Sequential File stage with a schema file, but the stage is not reading the data and gives the warning messages below:
When validating export schema: At field "BQR": Exporting nullable field without null handling properties
Field "BQR" delimiter would overrun input buffer, at offset: 2
Import unsuccessful at record 0.
"PROB_DFT_PCT": Exporting nullable field without null handling properties
record
{final_delim=end, record_delim='^', delim='^', quote=none, padchar='#'}
(
BQR:nullable string[2] {null_field=''};
PROB_DFT_PCT:nullable decimal[7,3] {quote=none, null_field=''};
)
Data file:
01^000.090^
02^000.220^
03^000.610^
04^001.400^
05^002.850^
06^005.050^
07^010.250^
08^034.000^
09^095.000^
Can you please advise me in this regard?
Thanks
Srini
sequential file stage using schema file
-
- Premium Member
- Posts: 38
- Joined: Sat Dec 29, 2007 9:58 am
You have apparently defined the record_delim and delim options to use the same value, as shown in the record schema you provided.
Are the records truly delimited with a '^' character, or do they have a unix linefeed or dos/windows carriage return/linefeed delimiter instead?
- james wiles
All generalizations are false, including this one - Mark Twain.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
Your Sequential File stage used for writing does not have Null Field Value property in the Format or Columns tab, but your design includes at least one column marked as nullable.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Premium Member
- Posts: 38
- Joined: Sat Dec 29, 2007 9:58 am
jwiles wrote:You have apparently defined the record_delim and delim options to use the same value, as shown in the record schema you provided.
Are the records truly delimited with a '^' character, or do they have a unix linefeed or dos/windows carriage return/linefeed delimiter instead?
The '^' character is there at the end of the first and second fields (columns) in the file, as a field delimiter.
Thanks
Srini
-
- Premium Member
- Posts: 38
- Joined: Sat Dec 29, 2007 9:58 am
ray.wurlod wrote:Your Sequential File stage used for writing does not have Null Field Value property in the Format or Columns tab, but your design includes at least one column marked as nullable.
What should I specify for the Null Field Value property on the Sequential File stage's Format tab?
Can you please explain it to me step by step, because I am new to DataStage?
thanks
Srini
So '^' is a field delimiter. Why do you also have it as the record delimiter? Correctly defining the record delimiter should take care of at least part of the problem.
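A corrected schema might look like the sketch below. The field definitions are taken from your post; the unix-newline record delimiter is an assumption, based on your data file sample, and should match whatever actually terminates each line in the file:

```
record
{final_delim=end, record_delim='\n', delim='^', quote=none, padchar='#'}
(
  BQR:nullable string[2] {null_field=''};
  PROB_DFT_PCT:nullable decimal[7,3] {quote=none, null_field=''};
)
```

With record_delim and delim no longer set to the same character, the stage can tell where one record ends and the next begins, which should resolve the "delimiter would overrun input buffer" warning.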
Null Field Value: (BTW, this info is in the Parallel Job Developer Guide...good resource for you to read)
Sequential Files, by their nature, don't have NULL columns (NULL is a particular condition which indicates there is no data for a column). You use the Null Field Value to define the value to place into the sequential file column in place of a NULL when WRITING a file, or what value to INTERPRET as (replace with) a NULL when READING a file.
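To illustrate the READING direction, here is a minimal sketch in Python (not DataStage itself; `parse_record` is a hypothetical helper): with null_field='', an empty field is interpreted as NULL, and any non-empty value is taken literally.

```python
from decimal import Decimal

def parse_record(line: str):
    """Split one record on the '^' field delimiter.

    Mirrors the schema's null_field='' property: an empty
    field string is interpreted as NULL (None in Python).
    """
    line = line.rstrip("\n")
    if line.endswith("^"):   # final_delim=end: drop the single trailing delimiter
        line = line[:-1]
    bqr, pct = line.split("^")
    return (bqr or None,
            Decimal(pct) if pct else None)

print(parse_record("01^000.090^\n"))  # ('01', Decimal('0.090'))
print(parse_record("^^"))             # (None, None) -- both fields empty, so NULL
```

The same property works in reverse when writing: a NULL column is exported as the null_field value (here, an empty string) rather than failing with the "without null handling properties" warning.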
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.