
Exceeding Internal Limit

Posted: Fri Feb 16, 2018 11:03 am
by jackson.eyton
Hi everyone,
I have a rather simple job pulling some data from an Excel file and placing it in a table. No real transformations are done to the data short of trimming leading and trailing spaces. The column definitions coming in from the spreadsheet are varchar(256), and those going out to my table are also varchar(256). The issue is that when the job runs, it attempts to create the table and fields using varchar(64000), which then leads to an error that the table exceeds the 65535-byte internal limit. Full error here:
STAGE_GEOGRAPHIC_AREA_HIERARCHY: Failed to execute SQL statement: CREATE TABLE STAGE_GEOGRAPHIC_AREA_HIERARCHY (Population_Timestamp TIMESTAMP NOT NULL, PROCESSED_LAST_DATE DATE, Country_Short_Code VARCHAR(64000), Zip_Code VARCHAR(64000), Ttype VARCHAR(64000), Primary_City VARCHAR(64000), State_Code VARCHAR(64000), County_Name VARCHAR(64000), ISO_Code VARCHAR(64000), Country_Name VARCHAR(64000), State_Abbreviation VARCHAR(64000), State_Name VARCHAR(64000)) DISTRIBUTE ON RANDOM. Reason: [SQLCODE=HY000][Native=46] ERROR: Table 'STAGE_GEOGRAPHIC_AREA_HIERARCHY' record size 640012 exceeds internal limit of 65535 bytes (CC_NZConnectionRep::executeDirect, file CC_NZConnectionRep.cpp, line 201)
The Job's OSH can be reviewed here:
https://github.com/jacksoneyton/DataSta ... master/OSH

I don't understand why it's deciding that the fields are varchar(64000), so I'm unsure how to resolve this. Any advice would be greatly appreciated, thanks!!
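For the record, the math in the error adds up: the ten VARCHAR(64000) columns account for 640,000 bytes, and with the TIMESTAMP and DATE columns that matches the reported record size of 640,012, nearly ten times the 65,535-byte limit. What I expected the connector to generate is something more like this (same columns, just at their intended lengths):

    -- Same table with every string column at its intended varchar(256):
    -- 10 x 256 = 2,560 bytes plus the TIMESTAMP and DATE columns,
    -- comfortably under the 65,535-byte internal limit.
    CREATE TABLE STAGE_GEOGRAPHIC_AREA_HIERARCHY (
        Population_Timestamp TIMESTAMP NOT NULL,
        PROCESSED_LAST_DATE  DATE,
        Country_Short_Code   VARCHAR(256),
        Zip_Code             VARCHAR(256),
        Ttype                VARCHAR(256),
        Primary_City         VARCHAR(256),
        State_Code           VARCHAR(256),
        County_Name          VARCHAR(256),
        ISO_Code             VARCHAR(256),
        Country_Name         VARCHAR(256),
        State_Abbreviation   VARCHAR(256),
        State_Name           VARCHAR(256)
    ) DISTRIBUTE ON RANDOM;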

Posted: Wed Feb 21, 2018 9:05 am
by jackson.eyton
Hmm, looks like no one has any thoughts on this. In that case I will rebuild this job from scratch and see if the issue persists. I'll update again once I have more.

So odd....

Posted: Wed Feb 21, 2018 10:13 am
by chulett
Well... I had some thoughts about unbounded fields but didn't see anything like that in your output, so I didn't post. And if you search for that term, all you get are XML-related results. :(

I did, however, do an exact search for your magic number of 64000 this morning, and several topics turned up, including this one. Maybe one of them will shed some light on what is going on.

Nevermind... I'm an idiot...

Posted: Thu Feb 22, 2018 9:08 am
by jackson.eyton
Never mind, everyone.... User error... I copied the Unstructured Data stage from another job and must have propagated the column values incorrectly: I had my column length in the Scale field and left Length blank.

Thank you, chulett, for the link there; that gave me enough information to realize it was using the default max value for Netezza, and therefore the length was likely not set correctly. My eyes missed it the first time: I just saw the numbers I was expecting to see, not that those numbers were in the wrong column... My bad!!! :oops:
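In case anyone else hits this: if I'm reading the generated OSH right, a column with nothing in the Length field comes through as an unbounded string, which the Netezza connector then expands to its 64000 maximum. Roughly the difference, sketched in OSH record-schema syntax (only one column shown):

    What my broken metadata produced (Length blank, so an unbounded string):

        record (
          Zip_Code: nullable string;
        )

    What it should have been (256 in the Length field, a bounded string that maps to VARCHAR(256)):

        record (
          Zip_Code: nullable string[max=256];
        )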

Posted: Thu Feb 22, 2018 10:00 am
by chulett
Aha! Was wondering why all of your sizes were in the "scale" property but figured it was some sort of secret Netezza thing. Glad you spotted that. :D