OCI has fetched truncated data

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
paddu
Premium Member
Posts: 232
Joined: Tue Feb 22, 2005 11:14 am
Location: California

OCI has fetched truncated data

Post by paddu »

Hi all


I am performing aggregation on certain columns, and the job throws this warning for almost every record, then aborts (it does not indicate which record triggered it):
DIRECT_SALES_test_2006913..tf_ORDERTYPE: OCI has fetched truncated data

The table has 8 million rows.

My aggregation is done through this SQL query:
SELECT ORDER_ID,MATERIAL,
SUM(SALES_SALES),
SUM(SALES_QNTY),
AVG(SALES_UNIT),
SUM(RETURN_SALES),
SUM(RETURN_QNTY),
AVG(RETURN_UNIT),
SUM(CREDITMEMO_SALES),
SUM(CREDITMEMO_QNTY),
AVG(CREDITMEMO_UNIT),
SUM(SAMPLES_SALES),
SUM(SAMPLES_QNTY),
AVG(SAMPLES_UNIT) FROM DIRECT_SALES_STG_01
GROUP BY ORDER_ID,MATERIAL

If I use the Aggregator stage instead, it throws this error:

DataStage Job 823 Phantom 3236
Program "DSP.ActiveRun": Line 51, Exception raised in GCI subroutine:
Access violation.
Attempting to Cleanup after ABORT raised in stage DIRECT_SALES_test_2006913..Aggregator_15
DataStage Phantom Aborting with @ABORT.CODE = 3


Please help me on this :(

thanks
paddu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Does the query work in another client, such as TOAD or sqlplus, or does it run out of resources there also?

Summing 8 million values can quite possibly lead to a value larger than can fit in the target data type. You might like to use larger numeric data types than you currently have specified.

Correct the Aggregator stage problem by adding ORDER BY ORDER_ID, MATERIAL to your extraction query and specifying in the Aggregator stage that the data are sorted on those columns.
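A sketch of the sorted extraction query this advice describes, using the column names from the original post (only the ORDER BY clause is new; the Aggregator stage then does the grouping):

```sql
-- Feed raw, sorted rows to the Aggregator stage instead of
-- aggregating in the database. Mark the grouping columns as
-- sorted in the stage so it can stream rather than buffer 8M rows.
SELECT ORDER_ID, MATERIAL,
       SALES_SALES, SALES_QNTY, SALES_UNIT,
       RETURN_SALES, RETURN_QNTY, RETURN_UNIT,
       CREDITMEMO_SALES, CREDITMEMO_QNTY, CREDITMEMO_UNIT,
       SAMPLES_SALES, SAMPLES_QNTY, SAMPLES_UNIT
FROM   DIRECT_SALES_STG_01
ORDER  BY ORDER_ID, MATERIAL
```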
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
paddu
Premium Member
Posts: 232
Joined: Tue Feb 22, 2005 11:14 am
Location: California

Post by paddu »

Ray,

It runs fine in TOAD; I even checked the maximum values of the summed columns, and they do not exceed the limit.

However, a few of the columns I am summing and averaging contain mostly zeros or negative values. Does this affect anything?



Thanks
paddu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

No, you can still form a sum or an average even though some of the values are non-positive. Nulls might cause an issue, but you can handle those with the NVL function. No, it's not that.
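For example, wrapping the nullable measures in NVL (a sketch on two of the columns from the original query; the others are analogous):

```sql
-- NVL replaces NULL with the given default before aggregation.
-- Note: AVG(NVL(x, 0)) counts former nulls as zeros in the average,
-- which differs from AVG(x), where nulls are simply ignored.
SELECT ORDER_ID, MATERIAL,
       SUM(NVL(SALES_SALES, 0)) AS SALES_SALES,
       AVG(NVL(SALES_UNIT, 0))  AS SALES_UNIT
FROM   DIRECT_SALES_STG_01
GROUP  BY ORDER_ID, MATERIAL
```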
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
paddu
Premium Member
Posts: 232
Joined: Tue Feb 22, 2005 11:14 am
Location: California

Post by paddu »

[quote="ray.wurlod"]No, you can still form a sum or an average even though some of the values are non-positive. Nulls might cause an issue, but you can handle those with the NVL function. No, it's not that. ...[/quote]

Ray ,

I tried using NVL, but I get the same error.
The same query works fine in TOAD.

Please help me

Thanks
paddu
NBALA
Participant
Posts: 48
Joined: Tue Jul 11, 2006 11:52 am
Location: IL, USA

Post by NBALA »

As Ray mentioned earlier, it might be due to a value that cannot fit in the DataStage data definition. Check the SQL type, length and scale of the affected columns.

-NB
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Reset the job and check whether a "From previous run..." entry appears in the log. Try limiting the number of rows per run and check whether the job succeeds. If it no longer aborts with smaller row sets, try increasing some of the data type lengths, or the disk space.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
paddu
Premium Member
Posts: 232
Joined: Tue Feb 22, 2005 11:14 am
Location: California

Post by paddu »

Resolved this issue.

The problem was with the AVG columns. I used the ROUND function in my SQL query to get rid of decimal places after the fifth, and it worked fine.
You were all right: the data was not fitting into the DataStage data definition.
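The fix described here, sketched on the AVG columns of the original query (five decimal places, per the post; the SUM columns are unchanged):

```sql
-- Rounding the averages to 5 decimal places keeps the values
-- within the precision/scale declared in the DataStage column
-- definitions, avoiding the "OCI has fetched truncated data" warning.
SELECT ORDER_ID, MATERIAL,
       SUM(SALES_SALES),
       SUM(SALES_QNTY),
       ROUND(AVG(SALES_UNIT), 5),
       SUM(RETURN_SALES),
       SUM(RETURN_QNTY),
       ROUND(AVG(RETURN_UNIT), 5),
       SUM(CREDITMEMO_SALES),
       SUM(CREDITMEMO_QNTY),
       ROUND(AVG(CREDITMEMO_UNIT), 5),
       SUM(SAMPLES_SALES),
       SUM(SAMPLES_QNTY),
       ROUND(AVG(SAMPLES_UNIT), 5)
FROM   DIRECT_SALES_STG_01
GROUP  BY ORDER_ID, MATERIAL
```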

Thanks


Now I have a new issue.

We have 20 job sequences, all put together in one master sequence.
One of the sequences failed with this error:

Controller problem: Unhandled abort encountered in job2

The failed sequence has 2 jobs.
Job1: reads a flat file and loads an Oracle staging table (staging database), with some transformations.
Job2: reads the Oracle staging table, performs aggregation and loads into an Oracle table (target database).

Job2 failed. I reset Job2 and ran it again, but no data was fetched or loaded, so I stopped the job manually.
Job2 usually takes 1 minute to run.

I ran Job2's SQL query in the staging database and it shows results, but I do not know why it does not work in Ascential.

Do I have to reboot the ETL server? I may be wrong though.

Can anyone tell me why it is slow and not reading or writing?


thanks
paddu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

New issue: new thread. Makes it easier for future searches.

Please begin a new thread.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Post Reply