Search found 52 matches

by sivatallapaneni
Mon Jan 11, 2010 1:45 pm
Forum: General
Topic: Metadata repository Upgrade
Replies: 2
Views: 1436

Thank you very much for the reply. We are just upgrading the database, and the port number is not going to change.
by sivatallapaneni
Mon Jan 11, 2010 12:56 pm
Forum: General
Topic: Metadata repository Upgrade
Replies: 2
Views: 1436

Metadata repository Upgrade

We are currently running Information Server 8.1, and the metadata repository is on Oracle version 10.2. We have plans to upgrade our Oracle databases to 11g, since Oracle is stopping security patches. My question is: did anybody here upgrade their Oracle metadata repository for Information Server? Is 11...
by sivatallapaneni
Tue Sep 22, 2009 7:16 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with Polish characters in Parallel job
Replies: 6
Views: 2838

I tried this and the Polish characters are still not coming out right. I opened a case with IBM and they said I need to check with my source system, because I'm getting hex characters 9A and 9B. I know they are not going to give me a solution. We do have a workaround: it is to read the file in a server job using UTF8...
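One hedged observation on those bytes: 0x9A and 0x9B fall in the C1 control range of ISO-8859-2, but they are printable characters in Windows-1250, a common code page for Polish data. A minimal sketch (an assumption about the source encoding, not something confirmed in the thread) to see what each candidate code page makes of them:

```python
# Sketch: inspect what bytes 0x9A/0x9B decode to under candidate code pages.
# Assumption: the source data might actually be Windows-1250 rather than
# ISO-8859-2; in ISO-8859-2 the 0x80-0x9F range is C1 control characters.
raw = bytes([0x9A, 0x9B])

for codec in ("iso8859-2", "cp1250", "utf-8"):
    try:
        decoded = raw.decode(codec)
        print(codec, [hex(ord(c)) for c in decoded])
    except UnicodeDecodeError as exc:
        print(codec, "cannot decode:", exc.reason)
```

Under cp1250 the two bytes come out as U+0161 ("š") and U+203A ("›"), while ISO-8859-2 yields invisible control characters, which would match characters "not coming right" downstream.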
by sivatallapaneni
Tue Sep 15, 2009 7:36 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with Polish characters in Parallel job
Replies: 6
Views: 2838

The database itself is set to UTF16, but the stage in the job uses the project default ASCL_ISO8859-2 NLS map.
by sivatallapaneni
Mon Sep 14, 2009 5:02 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with Polish characters in Parallel job
Replies: 6
Views: 2838

Problem with Polish characters in Parallel job

I have a problem with one of the jobs, which has a CFF stage as source and an Oracle stage as target. The EBCDIC file has Polish characters. NLS is installed on the server, and parallel maps are set to the project default ASCL_ISO8859-2 character set, which supports Polish characters. The CFF stage and Oracle stage has...
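A small illustrative point about EBCDIC and Polish text (my own aside, not from the post): the common "Latin-1" EBCDIC code pages simply have no slots for Polish letters, so the CFF stage has to be told the file's actual EBCDIC code page (for Polish, typically an EBCDIC Latin-2 variant such as CCSID 870, which Python's stdlib does not ship). A sketch using cp500 (EBCDIC Latin-1) to show the failure mode:

```python
# Assumption-flagged sketch: Polish letters are absent from Latin-1 EBCDIC
# code pages, so decoding/encoding with the wrong EBCDIC variant fails or
# garbles. cp500 here stands in for "an EBCDIC code page without Polish".
try:
    "łódź".encode("cp500")  # cp500 = EBCDIC Latin-1; "ł" and "ź" have no slot
except UnicodeEncodeError:
    print("cp500 cannot represent Polish characters")
```

The practical upshot is that the NLS map on the CFF stage must match the code page the mainframe actually wrote, not just any map that "supports Polish".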
by sivatallapaneni
Thu May 04, 2006 10:45 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Defining A Project
Replies: 7
Views: 2715

Defining A Project

Hi guys, what are the PROS and CONS of having a single big project? I am asking because in DS we can't maintain common objects and can't have global variables (because they are variables only at the project level). My client is using DS for the first time, and their question is why can't we have a single big pro...
by sivatallapaneni
Fri Oct 21, 2005 10:52 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading file from different folder
Replies: 2
Views: 1256

Are you using the same file handle or a different one?
You might have to close the first handle if the file handle name is the same.
by sivatallapaneni
Wed Oct 12, 2005 11:00 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: sequence: Error calling DSRunJob(), code=-2
Replies: 5
Views: 7410

I think he is hitting the T30 file limit. Sumeet, do you have a lot of jobs running during that time?

Can you check the uvconfig file for the "T30FILE" entry and see what the value is? The uvconfig file is in the .dshome directory.
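The check above can be sketched in a few lines. This is a minimal, assumption-laden helper (the "NAME value" line format and the sample values are illustrative, not taken from a real uvconfig):

```python
# Sketch: read the T30FILE setting (max concurrently open dynamic files)
# out of a uvconfig file's text. Assumes "NAME value" lines.
import re
from typing import Optional

def t30file_limit(uvconfig_text: str) -> Optional[int]:
    """Return the T30FILE value if present, else None."""
    for line in uvconfig_text.splitlines():
        m = re.match(r"\s*T30FILE\s+(\d+)", line)
        if m:
            return int(m.group(1))
    return None

sample = "MFILES 100\nT30FILE 200\nRLTABSZ 75\n"
print(t30file_limit(sample))  # -> 200
```

If many jobs run concurrently and the value is low, raising T30FILE (and regenerating/restarting the engine per the admin procedure) is the usual remedy discussed in this thread.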
by sivatallapaneni
Wed Oct 12, 2005 10:41 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: nullable fields in hash file and oraoci stages
Replies: 3
Views: 2275

I don't think you can have null fields as keys in a hash file. At least, the hash file wouldn't allow null key fields.

I'm not sure about the ODBC lookup. My guess is that it is OK to have.
by sivatallapaneni
Wed May 04, 2005 6:11 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: ds_uvput() - Write failed for record id '4130
Replies: 3
Views: 2326

I did have the warning level set on the job before I ran it. I cleared the log file and tried everything.

About this 2GB limit: I have just 30,000 records. If I'm hitting the 2GB limit, then I have to resize the hash file, right? RESIZE <HASHFILENAME> is the command; is that correct?
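A quick back-of-the-envelope check (my own arithmetic, not from the thread) suggests 30,000 records are unlikely to hit a 2 GB file limit unless each record is enormous:

```python
# Sanity check: what average record size would 30,000 records need
# in order to fill a 2 GB hash file?
records = 30_000
limit_bytes = 2 * 1024 ** 3        # 2 GB hash-file size limit

avg_record_size = limit_bytes // records
print(avg_record_size)             # -> 71582 (bytes per record, ~70 KB)
```

Roughly 70 KB per record would be required, so with small records the ds_uvput() warning probably points at something other than the 2 GB limit (e.g. file corruption), which is worth ruling out before a RESIZE.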
by sivatallapaneni
Wed May 04, 2005 3:12 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: ds_uvput() - Write failed for record id '4130
Replies: 3
Views: 2326

ds_uvput() - Write failed for record id '4130

Hi everyone, I have a problem with one of the jobs. It has three hash files: it references one hash file and writes to TWO hash files. When I run this job it gives a ds_uvput() - Write failed for record id '4135 warning. When I try to access the hash file from UV it gives me the following message ...
by sivatallapaneni
Thu Apr 14, 2005 5:05 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: DataStage Engine StartUp error ..
Replies: 5
Views: 4661

Before bringing down the services I killed all the orphaned and hanging processes, and then brought down the DS Engine. There were no connections open at that time; the command output was nothing. Then I tried to restart, and it gave me that error. When I tried to cat /.dshome it said the variable is not foun...
by sivatallapaneni
Thu Apr 14, 2005 2:10 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: DataStage Engine StartUp error ..
Replies: 5
Views: 4661

DataStage Engine StartUp error ..

Hi all,
When I try to restart the DataStage Engine, it gives me this error:

Message[UVI0001]
Did anybody come across this? Is there a solution for it?
I'd appreciate any help.

Thank you,
Siva.
by sivatallapaneni
Mon Jan 24, 2005 4:45 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: PX job failure ..
Replies: 7
Views: 2589

Do we need to have the 9i client installed on the DataStage server to get the PX jobs to work?

On our dev box we have multiple Oracle clients, including 9i, but ORACLE_HOME is set to 8.1.7 on both the DEV and TEST DataStage servers.