Error writing to Teradata Enterprise Stage
Hi,
This is my first post and I surely hope I can get some expert advice.
When using the Teradata Enterprise Stage in a Parallel Job to write data to a Teradata table, unless I am the creator of the table, regardless of the (sufficient) rights I have in the Teradata domain (create table right, full rights on the table, etc), I get the following FATAL error:
* * * * * * *
TOR_TABLE: Table 'SUPT_INVT_BASE_FACT_WORK' is not writable!
Couldn't determine if table SUPT_INVT_BASE_FACT_WORK is writable.
Cannot prepare target table.
* * * * * * *
Once I drop the table that was created by somebody else and re-create it using my own logon, the stage works. However, I can't hand over this piece of work to UAT or PRODUCTION, because I can't demand that the user inserting records into a Teradata table from DataStage must be the creator of the table.
Your help will be very much appreciated.
Regards,
Bill
Welcome aboard! :D
Whoever creates the table, or someone with DBA privilege, can grant appropriate privileges to anyone else, such as the user ID under which DataStage jobs will connect to Teradata.
You should make the user ID and password (and database name) job parameters, because they will be different in development, test and production environments.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Well, I have more than sufficient rights in the database, including create and drop table rights and all select, insert, update and delete rights on all the tables in the database, yet I still had that problem.
When somebody else created the table, I verified that I could drop that table and insert, update and delete records in it, yet when using the Teradata Enterprise stage in a Parallel Job to write to that table I got that error message from DataStage.
When I use a Teradata stage in a Server Job, I don't have that problem. However, the Teradata stage in a Server Job takes 2 hours to process, whereas in a Parallel Job it takes about 20 minutes. I have many such jobs to run in a batch, so I have to choose the Parallel Job.
This is giving me a lot of problems, and I don't know if it can go to production, which is to happen soon.
Regards,
Bill
While you're figuring it out, you could make your server job multi-instance and run N instances each processing 1/Nth of the rows. You will, however, need to build a (parameter-driven) mechanism for selecting disjoint subsets of source rows.
This will allow you to meet your deadline, albeit as a workaround. Based on your figures, six instances should be about right.
(Based on that argument, nine women can make a baby in one month.)
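One common way to give each instance a disjoint slice of the source rows is a modulus filter on a numeric key. This is only an illustrative sketch: the table, column, and the `#NUM_INSTANCES#`/`#INSTANCE_NUM#` job-parameter names are placeholders, not anything from the job discussed above.

```sql
-- Each of the N multi-instance server jobs runs with a different
-- INSTANCE_NUM (0 .. N-1) and selects a disjoint subset of rows.
-- #NUM_INSTANCES# and #INSTANCE_NUM# are DataStage job parameters
-- (names are hypothetical).
SELECT *
FROM   SourceTable
WHERE  (key_column MOD #NUM_INSTANCES#) = #INSTANCE_NUM#
```

Because every row satisfies exactly one value of the modulus, the N instances between them cover all rows with no overlap.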
Ray,
The stuff has gone to UAT this morning. I worked from 9:15am to 11:00pm last night and from 4:30am this (Saturday) morning until now and god knows till when I can go home...
I don't think it is an option to go back to the Server Jobs.
Nrevezzo,
I do use qualifying table names, etc. I am not so sure if it is a problem with security rights, because with Queryman, BTEQ, and even Datastage Server job I don't experience that problem. It is only with a Parallel job that this problem occurs.
* * * * * * *
TOR_TABLE: Table 'SUPT_INVT_BASE_FACT_WORK' is not writable!
Couldn't determine if table SUPT_INVT_BASE_FACT_WORK is writable.
Cannot prepare target table.
* * * * * * *
I guess that this might be a bug in DataStage. It is possible that whoever wrote that module for DataStage decided to check whether the logon user is the same as the table creator and, if not, to raise that FATAL error. I don't believe that DS would (or should) deal with Teradata security itself. The DS software should just pass on the login id, password, domain name and the query. Given the content of that message, I really wonder why DS cares whether the table is writable or not; DS should just write to the table and, if there is a permission problem, return the error message. However, unless it is confirmed that this is a bug in the DataStage software and not the fault of my code, there is no way I can explain it to others, right? I am stuck!
More importantly, if it is a bug, are there any workarounds other than using multiple instances of Server Job as Ray suggested? I thought of writing the data to flat files then upload them using Teradata Fastload, but at the UAT stage, I am not in a position to do structural changes.
Any more ideas? Suggestions?
Many thanks to you.
Regards,
Bill
Software Development LifeCycle:
- UAT fails.
- Job goes back to developers.
- Structural changes are possible (necessary?).
Hello Forum,
I have also experienced this error while using the Teradata Enterprise stage. I solved it by explicitly granting an insert privilege on the table in question.
The interesting part was that, even while getting this error, I was able to load using the Teradata API stage with the same ID and password.
I'm guessing that this is because the API and Enterprise loaders differ in how they check for privileges prior to inserting data. The ID I was loading with was granted privileges at the database level, but the Teradata Enterprise stage probably does an incorrect query on the rights table, looking only for table-level privileges. The API loader probably does no check at all.
- Greg
Enhancement request is tracked under 87666.

ray.wurlod wrote: Good feedback, thanks.
Is someone going to report this anomaly (and problem) via their support provider? If so can you please post their response here?

Their response is that if the following query returns no rows, then the table is considered not writable from the Teradata Enterprise stage's point of view:

select * from dbc.userrights
where databasename = 'DBName'
and (tablename = 'TableName' or tablename = 'all')
and accessright = 'I'

Which means that if you grant privileges at the database level, or grant a role insert rights and then assign a particular ID that role, it is not good enough.

The suggested workaround is to grant insert access directly to the userid, or to grant insert access to all tables in the database using the ALL keyword.
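A sketch of what the suggested workaround would look like in SQL, assuming a placeholder database `DBName`, table `TableName`, and DataStage user ID `dsuser` (all names are illustrative, not from the thread):

```sql
-- Grant INSERT directly to the user on the specific target table,
-- so a row appears in the rights view at table level:
GRANT INSERT ON DBName.TableName TO dsuser;

-- Or grant INSERT at the database level, which Teradata records
-- with the 'All' table name in its rights views:
GRANT INSERT ON DBName TO dsuser;
```

Either form puts a direct (non-role) insert right on record for the user, which is what the stage's privilege-check query appears to look for.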
gsherry1 wrote: Which means if you grant privileges on the database level or grant roles insert rights, and then assign a particular id that role, it is not good enough.

select * from dbc.userrights
where databasename = 'DBName'
and (tablename = 'TableName' or tablename = 'all')
and accessright = 'I'

Is there a way to ask DS NOT to check the rights before insertion? Some APT_TERA option maybe?