Search found 53125 matches
- Sat Mar 19, 2005 2:07 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Job Control routine
- Replies: 6
- Views: 1310
- Fri Mar 18, 2005 11:07 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: How to replace a string in Transform stage?
- Replies: 3
- Views: 1006
Welcome aboard! :D I agree with Ken. Chances are that your routine code could be streamlined, for example with judicious use of the Ereplace() function. You can post your code here if you'd like us to look at it, provided it does not violate any non-disclosure agreement you may have in place. If you do, ...
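As a minimal sketch of the kind of streamlining meant here, a transform routine body can often be reduced to a single Ereplace() call; the argument and routine names below are hypothetical:

```
* Hypothetical transform routine body for a routine taking one argument (Arg1).
* Ereplace(string, substring, replacement) replaces every occurrence by default;
* optional fourth/fifth arguments limit the occurrence count and start position.
Ans = Ereplace(Arg1, "STREET", "ST")
```

Loops that walk a string character by character to substitute text can usually be collapsed this way.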
- Fri Mar 18, 2005 11:04 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error with DRS Stage
- Replies: 6
- Views: 3879
I have seen this kind of disconnection occur with the DRS stage when an attempt is made to access an unsupported data type. In my case, it was a CLOB in Oracle. Changing the column derivation to CAST(columnname AS VARCHAR2(4000)) resolved the error. Nonetheless, it ought not to break the connection;...
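To make the workaround concrete, here is a hedged sketch of the derivation change, assuming an Oracle table and a CLOB column with hypothetical names (note the CAST truncates anything beyond 4000 bytes):

```sql
-- Before (fails through DRS): the stage cannot handle the CLOB column
SELECT ORDER_ID, NOTES
FROM ORDERS;

-- After: cast the CLOB down to a supported VARCHAR2 type
SELECT ORDER_ID, CAST(NOTES AS VARCHAR2(4000)) AS NOTES
FROM ORDERS;
```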
- Fri Mar 18, 2005 11:01 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Job Control routine
- Replies: 6
- Views: 1310
- Fri Mar 18, 2005 10:59 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Access to sys.dba_extents required but not available.
- Replies: 7
- Views: 4945
- Fri Mar 18, 2005 3:38 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Job Control routine
- Replies: 6
- Views: 1310
- Fri Mar 18, 2005 3:26 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Hash File Lookups
- Replies: 3
- Views: 649
Your requirement is not something that a hashed file is geared to do; the whole raison d'être of a hashed file is to return THE row whose key is given. You would not even be able to populate your proposed hashed file as you wish: you would end up with one row for each key value, the most recently wr...
- Fri Mar 18, 2005 3:18 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: invalid identifier
- Replies: 12
- Views: 5887
- Fri Mar 18, 2005 3:15 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Access to sys.dba_extents required but not available.
- Replies: 7
- Views: 4945
Do you have the Oracle CLIENT software installed too? At run time DataStage is a client to the Oracle server, so the Oracle client software must be installed on the DataStage machine, and properly configured, even if the Oracle server is on the same machine. Make sure also that you are using the 32-...
- Thu Mar 17, 2005 1:59 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Multiple Links Collection
- Replies: 10
- Views: 2021
- Thu Mar 17, 2005 1:54 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: version control moved different job
- Replies: 8
- Views: 2567
- Thu Mar 17, 2005 1:49 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Importing a sequential file and treating like a table
- Replies: 11
- Views: 5010
Yes, stop even believing that hashed files connect to anything. They won't help you to connect to Sybase. Hashed files are nothing more than an implementation mechanism of a database table, used in databases such as UniVerse, UniData, D3 and others. And, of course, DataStage Engine. You need either...
- Thu Mar 17, 2005 5:28 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: problem with scheduling a job
- Replies: 3
- Views: 942
The message tells you exactly what's wrong. You are not authorised to use the at command on the UNIX server. Contact your UNIX administrator; you may need to be added to cron.allow, or you may need to be granted write access to the atjobs and/or cronjobs directory (or added to a group that has such ...
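The membership check the administrator performs can be sketched as follows; this assumes the cron.allow convention (the file's location varies by UNIX flavor: /etc/cron.allow, /var/adm/cron/cron.allow, and so on), and uses a demo file so the check itself is visible:

```shell
# Minimal sketch, assuming the cron.allow convention; the demo file below
# stands in for the real one, whose path differs across UNIX flavors.
allow_file="./cron.allow.demo"
printf 'dsadm\netluser\n' > "$allow_file"

user="etluser"   # hypothetical DataStage user
if grep -qx "$user" "$allow_file"; then
    echo "$user is allowed to use at/cron"
else
    echo "$user is not listed; ask the UNIX administrator to add them"
fi
rm -f "$allow_file"
```

Some systems use a cron.deny file instead, in which case the logic inverts: a user is allowed unless listed.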
- Thu Mar 17, 2005 5:25 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Importing a sequential file and treating like a table
- Replies: 11
- Views: 5010
No, what I meant was that you should define an ODBC data source name (DSN) that uses the ODBC driver for text files to connect to the text file in question. This is one of the drivers supplied in the branded_odbc directory; consult the help there to determine which one (I don't have access to UNIX d...
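An entry of that kind in .odbc.ini might look roughly like this; the DSN name, directory, and driver library are all placeholders, and the exact driver file name must be taken from the branded_odbc help for your platform:

```
[TextFileDSN]
; Hypothetical DSN: the Driver line must point at the text-file driver
; shipped in the branded_odbc directory -- consult its help for the exact
; library name on your platform (left as a placeholder here).
Driver=/opt/branded_odbc/lib/<text-driver>.so
Database=/data/input
```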
- Thu Mar 17, 2005 5:18 am
- Forum: General
- Topic: automatic clearing job logs by routine?
- Replies: 4
- Views: 1921