scottr wrote: Hi Bland, what is this "database rollback segment issues, snapshot too old" error? One of my jobs is aborting with this message, and if I reset it and run it again it just runs fine without failing.
thanks

You're hijacking this thread. Start a new one so we know more about your OS, DataStage release, etc. As for your problem: your insert/update ran long enough that Oracle overwrote the undo (rollback) data it needed to keep a read-consistent view of your rows, hence "snapshot too old". Running it again successfully just means it finished within the available rollback space at that moment in time. Start a new thread if you want more information.
Abnormal End error while running a DS load job into Oracle
Moderators: chulett, rschirm, roy
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
rrcreddy4 wrote: Hi,
I tried one thing: while loading into Oracle using OCI9, I also started copying the data into a sequential file at the same time, to see if there is an issue with the data. When I do that, I have no issue.
Does that trigger anything?
How do I turn off row buffering?
RC
Ahhh. Go to Job Properties and the Performance tab. Deselect "Use project defaults", then deselect any buffering option that is selected. Remove the text file, recompile, and run your job. I think your problem is related to row buffering; we shall see.
Hi,
I tried deselecting row buffering and still got the error below.
Project:smqa (whseqa)
Job name:LdTouchpointFactDICatalogs
Event #:217
Timestamp:2/1/2005 1:18:36 PM
Event type:Info
User:qdssmqa
Message:
From previous run
DataStage Job 237 Phantom 24621
Abnormal termination of DataStage.
Fault type is 10. Layer type is BASIC run machine.
Fault occurred in BASIC program *DataStage*DSR_LOADSTRING at address 6ec.
RC
Make sure there are no runaway processes out there. Do a "ps -ef | grep LdTouchpoint" to see if any leftover pieces and parts are interfering. If they are, kill them to get rid of them.
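A minimal sketch of that cleanup, assuming the job name from the log above (LdTouchpointFactDICatalogs). The bracketed first letter in the pattern is a common trick to stop grep from matching its own command line in the ps output:

```shell
# List leftover phantom processes belonging to the job; the [L] trick
# keeps grep from listing itself in the results.
ps -ef | grep '[L]dTouchpoint'

# If stale PIDs show up, terminate them (use kill -9 only as a last
# resort for a process that ignores the default SIGTERM):
for pid in $(ps -ef | grep '[L]dTouchpoint' | awk '{print $2}'); do
    kill "$pid"
done
```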
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
Check whether any other process in your system or organization reads files from that directory, works on them, or moves them around. A file's read position is private to each open descriptor, but another process moving, truncating, or rewriting the file underneath your reader can still disrupt the load.
Rather than loading into Oracle, try writing to a sequential file only, to confirm that there is no problem in the link as such.
Also check whether any code in your job control has changed, as that may be coming into effect.
I did the check on the link; it is fine.
When I do a full insert into the Oracle table I am fine: all 1M rows go in. The only issue is when I choose the Insert or Update clause into Oracle.
The only thing I see with the Oracle table is that I have a function-based unique constraint on it, i.e. during each insert it checks the uniqueness of the data based on the function I have. I don't see it as different from having a primary key on the table. You can ask me why I can't have a primary key on this table: I really don't have a primary key here. I want uniqueness only for a specific condition, which is why I am going for a condition-based unique constraint.
Please advise.
RC
Drop the index and see if you still have any load problems. If you do, then you get to log an issue with Ascential technical support. We're running out of options here, so if you can identify that a function-based index is causing an issue DataStage is unable to handle, then you have your answer to feed back to Ascential.
The process of elimination continues...have fun...
The close error is from the link process reading the sequential load file. An abnormal termination means that something tragic occurred and the controlling DataStage process could not recover the remaining job processes. What I'm saying is that the sequential file closing error message is related to the job blowing up, not to the reason the job is blowing up.
If your job is simply SEQ --> XFM --> OCI, then put a reject link on the XFM stage to capture rejected rows. Turn off all buffering. Set your array size to 1. Set the commit count to 1, and have your DBAs watch the loading of the table. There's not much left for us to do other than suggest trying everything.
There's something, a routine, a function, a trigger, something that is causing this. You have to eliminate all variables and find it. Sorry.
Hi,
I have 3 warning-message fields that I am not using in the transformer while loading into OCI9.
When I remove those 3 columns altogether, the load is fine. That's a surprise: I have a similar job with 3 unused fields coming from the sequential file, and it still loads.
Is there anything wrong with the way sequential files behave?
RC