I am running Information Analyzer (IA) on AIX with ODBC pointed at SQL Server.
Some columns have the TEXT data type (similar to a CLOB), which is variable width up to 2 GB in size. The customer was hoping IA would do data classification on these columns and detect sensitive data if present, which is an advertised feature of the tool. Does it not work on large objects? Is there a workaround?
So far I have found that IA reports such columns as data type "STRING" with length "--" and a status of "Error." The option to "Include columns with length over 512 characters" is enabled. The corresponding DataStage job log contains a fatal message about the array size being set to 1 for LOB columns.
IA and ODBC SQL Server TEXT data type
Choose a job you love, and you will never have to work a day in your life. - Confucius
We are on 11.3 and have the same issue. Our IT folks have tried different drivers, JDBC, etc., to no avail. We have a PMR open, and so far IBM has not helped resolve the issue. We were hoping we'd have better luck when we install 11.5 at year end, but I see you're already on 11.5.
The workaround we will attempt is to cast the data element in DataStage and write it to another table.
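For what it's worth, here is the sort of cast we have in mind; this is a minimal sketch only, assuming a SQL Server source table `src_table` with a TEXT column `notes` (both names are placeholders, not from the actual system):

```sql
-- Sketch: stage the TEXT column as VARCHAR so IA can profile it.
-- Cast TEXT to VARCHAR(MAX) first, then truncate to a width IA
-- handles, since classification only needs the leading characters.
SELECT id,
       CAST(SUBSTRING(CAST(notes AS VARCHAR(MAX)), 1, 4000)
            AS VARCHAR(4000)) AS notes_varchar
INTO   stage_table
FROM   src_table;
```

The same cast could be done in a DataStage Transformer stage instead of at the database, but doing it in the source SQL avoids the LOB handling in the ODBC connector altogether.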
Todd Ramirez
Sr Consultant, Data Quality
San Antonio TX
I am also leaning towards the same type of workaround.
I am confused about the setting to include columns with length over 512 characters. That seems to suggest that it could read a large column, or at least read the first 512 characters of it. ??
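One way to test that theory outside IA would be a query like the following; a hedged sketch, again assuming a placeholder TEXT column `notes` on `src_table`:

```sql
-- Pull only the first 512 characters, mirroring what the
-- "Include columns with length over 512 characters" option
-- appears to imply about how IA samples long columns.
SELECT id,
       CAST(SUBSTRING(CAST(notes AS VARCHAR(MAX)), 1, 512)
            AS VARCHAR(512)) AS notes_512
FROM   src_table;
```

If a view built on that query profiles cleanly, it would confirm the problem is the LOB type itself rather than the data length.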
Choose a job you love, and you will never have to work a day in your life. - Confucius