Teradata RDBMS code 2617: Overflow occurred

bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Teradata RDBMS code 2617: Overflow occurred

Post by bcarlson »

We have a job failing in production (and elsewhere) with the following error message:
    9713 FATAL Sat Sep 4 08:21:03 2010 heidswk_dly_cust_acct_drv,0: RDBMS code 2617:
    Overflow occurred computing an expression involving dly_cust_acct_drv_E2.CUST_ACCT_ID
    SQL statement: LOCK ROW FOR ACCESS SELECT CAST(cust_acct_id AS DECIMAL(20)),
    CAST(cust_acct_typ_cd AS CHAR(5)), ... etc. ... CAST(stat_au_chnl_cd AS CHAR(4))
    FROM P_EIW_W_T_UTILWORK_01.dly_cust_acct_drv_E2
    (CC_TeraConnection::executeSelect, file CC_TeraConnection.cpp, line 2,458) [pxbridge.C:5949]
We are using DS 8.1 and Teradata 13. We are using the Teradata Connector to read/write from the database. In this case, it is doing a bulk write to the target table.

The E2 table referenced is the error table used by FastLoad to capture uniqueness violations. The input has 3.5M records, and the target table actually receives 100% of them. The E2 table, however, gets loaded with 5.1M records from who knows where... From our testing, we cannot find where the duplicates originate. I can create a test job that writes to a dataset instead of the target table, and the dataset is 100% unique.
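
A check along these lines against the E2 table (after the failed load) should at least show which key values ended up in the error table and how often; I'm assuming here that the E2 layout mirrors the target, and this is essentially what the connector's post-load SELECT does, minus the CASTs:

    SELECT cust_acct_id, COUNT(*) AS err_rows
    FROM   P_EIW_W_T_UTILWORK_01.dly_cust_acct_drv_E2
    GROUP  BY cust_acct_id
    ORDER  BY 2 DESC;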

Even more confusing is the error message itself. The cust_acct_id field is decimal(20,0) in the input datastream and the target table. There is no datatype conversion going on, at least in terms of code (I understand the underlying storage in DS is probably string and this gets converted to a Teradata decimal datatype as the data is loaded).

I have a test job in development that fails the same way. It reads a dataset, passes through a Copy stage (no column modifications), and writes to the target table. No logic or transformations, and still it fails with this 2617 error. The only real difference is that there were duplicates in the input dataset. But why a datatype error instead of a warning/error for the duplicates?

Any suggestions?

Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

"Overflow" suggests that the data are too large to fit into the designated CHAR or DECIMAL data types.

But you wouldn't do the CAST in reading from a Data Set.

I can't see how duplicates would generate this kind of error message.

Is P_EIW_W_T_UTILWORK_01.dly_cust_acct_drv_E2 a table or a view? If a view, are there any calculations being performed by the view-defining expression?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

The E2 object is a table that is generated at runtime by DataStage. It is a copy of the target table's layout, minus the unique index, so that it can capture the entire record without itself failing on uniqueness.

For the development job, I tested the input dataset and the min/max cust_acct_id are all 20 digits or less. No non-numeric values anywhere.
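
For completeness, the equivalent sanity check directly against the E2 table on the database side (leaving out the connector's DECIMAL/CHAR casts) would be something like this:

    SELECT MIN(cust_acct_id) AS min_id,
           MAX(cust_acct_id) AS max_id,
           MAX(CHAR_LENGTH(TRIM(CAST(cust_acct_id AS VARCHAR(40))))) AS max_digits
    FROM   P_EIW_W_T_UTILWORK_01.dly_cust_acct_drv_E2;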

Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

How about the codes (CHAR(4) and CHAR(5))?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

I didn't check any other fields. I guess I thought the error message was specific to the cust_acct_id field:
RDBMS code 2617: Overflow occurred computing an expression involving dly_cust_acct_drv_E2.CUST_ACCT_ID
Bad assumption?

Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Probably not, but not all errors are accurately reported, and we've reached the "clutching at straws" phase.

Is there anything useful in the Teradata error documentation about this particular error code?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

Believe it or not, I think it is a 'large decimal' issue. My suspicion is that it boils down to a bug either in DataStage 8.1, Teradata v13, or TTU13 (Teradata Tools and Utilities, the API for tools to talk to Teradata).

I created a test job that fails with the 2617 error if the datatype is decimal(20), and gives a more normal unique key violation if the datatype is decimal(18).

Prior to Teradata V2R6.2, the largest decimal Teradata could handle was DECIMAL(18). From V2R6.2 onward it can handle DECIMAL(38) (or something like that).

In DataStage, the old stages like Teradata Enterprise and the API stage do not accommodate these large decimals. The Teradata Connector, on the other hand, does just fine. The reason is that the old stages use Teradata's legacy utilities - FastLoad, MultiLoad, FastExport and TPump - while the Connector uses TPT, the Teradata Parallel Transporter. As I understand it, TPT was updated for large decimals; the original utilities were not.

It seems DS can write to a table with decimal(20) with no issues, but as soon as duplicates need to be forwarded to a UV table, it breaks down.
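
For what it's worth, the repro boils down to two target tables that differ only in the precision of the key (the names below are just my test objects). Loading the same duplicated input through the Connector behaves differently against each:

    -- DECIMAL(18) key: duplicates surface as a normal uniqueness violation
    CREATE TABLE dev_wrk.tst_cust_acct_dec18
      ( cust_acct_id DECIMAL(18,0) NOT NULL )
    UNIQUE PRIMARY INDEX ( cust_acct_id );

    -- DECIMAL(20) key: the same load fails with RDBMS code 2617 on the E2 table
    CREATE TABLE dev_wrk.tst_cust_acct_dec20
      ( cust_acct_id DECIMAL(20,0) NOT NULL )
    UNIQUE PRIMARY INDEX ( cust_acct_id );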

Now that I have a test job, I think we'll be opening a PMR with IBM.

Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.