Never mind, I just went to a different project directory and ran the command and it was fine. The project did not exist in the LIST UV.ACCOUNT listing. Ray, I can log into the XMETA database, but my normal id has no select privileges. What id would normally be used for this type of cleanup activity?...
The schema is still there; this could be the result of a partial delete, i.e. someone trying it from the (undocumented) commands in DS. Open up an admin or TCL prompt, then enter "LIST UV.ACCOUNT", do ...

Okay, I am trying to use $DSHOME/bin/uv and get library errors (even after sourcing ...
We are trying to create a new project in DS8, but are getting errors that it already exists. Obviously we must have tried to build this at some point. The problem is that we cannot find it anywhere. The Projects directory has no directory for it, and it is not listed when I run the dsjob -lprojects ...
I am trying to create a job that reads in a dataset and creates/replaces a target table on Teradata based on the input dataset. I am using the Teradata Connector, but cannot find a place to specify the primary index. In the TD Enterprise stage, there was an option for this. Is there a comparable opti...
You may also check which EBCDIC codepage is being used at your host system and then set the environment variable APT_EBCDIC_VERSION accordingly. Klaus

Any idea what APT_EBCDIC_VERSION is set to if it is not explicitly set? Where can I find it? I looked in Administrator at the project-level settings...
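As an illustration of why the code page matters (this uses Python's built-in EBCDIC codecs, not DataStage itself): the same byte can decode to different characters depending on which EBCDIC variant is assumed, which is exactly the ambiguity APT_EBCDIC_VERSION resolves.

```python
# Same bytes, different EBCDIC code pages (cp037 = US/Canada, cp500 = International).
data = bytes([0xC8, 0x85, 0x93, 0x93, 0x96])  # "Hello" in EBCDIC

print(data.decode("cp037"))  # Hello
print(data.decode("cp500"))  # Hello

# But byte 0x5A differs between the two code pages:
print(bytes([0x5A]).decode("cp037"))  # '!'
print(bytes([0x5A]).decode("cp500"))  # ']'
```

If source data was created on a host using one variant and read assuming another, characters like `!`, `]`, `[`, and `^` silently swap, which is why checking the host's actual code page first is good advice.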
We ran into an issue in our UAT environment and were not able to create projects. Turns out the DB2 repository ran out of space. That was resolved. However, the project that was being created when it failed is still out there.... sort of. If I look at the Projects directory, there was a directory wi...
Normally, I would recommend the remdup stage, but I don't know how it would deal with NULLs. You tell the remdup stage to keep either the first or last occurrence of a repeated record, but can it handle NULLs? If so, do they sort as low or high against dates or other datatypes? You may w...
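To make the question concrete, here is a sketch (in Python, not DataStage) of remdup-style "keep first or last occurrence per key" logic. Note that whether NULLs sort low or high is an explicit policy choice in this sketch; DataStage's actual behavior is what the post says needs verifying.

```python
from itertools import groupby

def remdup(rows, key, order_col, keep="first", nulls_low=True):
    """Keep the first or last row per key, ordered by order_col.

    None (standing in for NULL) sorts before all real values when
    nulls_low=True, after them otherwise -- an assumption, not a
    statement about how the remdup stage actually treats NULLs.
    """
    null_rank = (0,) if nulls_low else (2,)

    def sort_key(r):
        v = r[order_col]
        return (r[key], null_rank if v is None else (1, v))

    ordered = sorted(rows, key=sort_key)
    out = []
    for _, grp in groupby(ordered, key=lambda r: r[key]):
        grp = list(grp)
        out.append(grp[0] if keep == "first" else grp[-1])
    return out
```

With `nulls_low=True` and `keep="first"`, a row whose date is NULL wins over one with a real date; flip either option and the survivor changes, which is why the NULL-ordering behavior matters for this dedup pattern.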
Raja and others, just wanted to keep everyone updated. This issue is resolved at our site after installing the patch from IBM. The issue was that the "character buffer size" was too small; the patch increases the buffer size to hold very long SQL statements. Are you using one of the new connector stage...
We have been getting intermittent errors in all of our servers with the following message: tdread1: Error occurred during initializeFromArgs(). [api/operator.C:691] tdread1: Property : Property not valid in source context [pxbridge.C:563] The stage that fails is a Teradata Connector doing a read, ac...
We are getting a strange error in a simple DataStage v8 job. We are on DataStage 8.0.1 and Teradata V2R6.2. The job reads from a dataset and uses the TD Connector to load to a target table (bulk load). work_dly_loc_stmt1,12: [IIS-CONN-TERA-005027] Unable to issue a checkpoint at 133,167 rows because...
Hi, thanks, that worked fine! But the command ./dsjob -run <proj_name> <jb_name> is throwing DSJE_REPError. Does that mean it failed to reach the repository? When I researched it, the advice was to call getLastErrorMessage before proceeding. How can I resolve this issue? Please help! I just got this same error me...
You unpack a packed decimal in the Column Import stage the same as you would with a regular import. Your record should already be specified as binary EBCDIC. Set your import field to Decimal with the appropriate length and scale, and then set Packed to Yes (Packed is an option for the Decimal datatype).
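For readers unfamiliar with the format, here is an illustrative decoder (in Python) for packed-decimal (COMP-3) bytes: two digits per byte, with the final nibble holding the sign. This is only a sketch of the byte layout that the Decimal/Packed import setting handles for you, not DataStage's implementation.

```python
from decimal import Decimal

def unpack_decimal(data: bytes, scale: int = 0) -> Decimal:
    """Decode packed-decimal bytes: each nibble is a digit except the
    last, which is the sign (0xD = negative; 0xC or 0xF = positive)."""
    nibbles = []
    for b in data:
        nibbles.append(b >> 4)
        nibbles.append(b & 0x0F)
    sign_nibble = nibbles.pop()          # last nibble carries the sign
    sign = -1 if sign_nibble == 0xD else 1
    digits = int("".join(str(n) for n in nibbles))
    return Decimal(sign * digits).scaleb(-scale)
```

For example, the three bytes `0x12 0x34 0x5C` with scale 2 decode to 123.45, and `0x01 0x2D` decodes to -12, which is why the import field needs both the correct length and the correct scale.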