Search found 53125 matches
- Wed Apr 20, 2005 6:29 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Stored procedure
- Replies: 4
- Views: 1231
- Wed Apr 20, 2005 6:25 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Can we call routines developed in server jobs into parallel
- Replies: 2
- Views: 993
- Wed Apr 20, 2005 3:44 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Internal data error
- Replies: 2
- Views: 789
- Wed Apr 20, 2005 3:41 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Unix TimeStamp Convert
- Replies: 19
- Views: 3953
- Wed Apr 20, 2005 3:39 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Export Hashfile
- Replies: 2
- Views: 583
- Wed Apr 20, 2005 3:36 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Unable to connect to DataStage Client (Director, Designer)
- Replies: 16
- Views: 6168
- Wed Apr 20, 2005 3:31 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: HELP! Run/Import/Export Jobs from UNIX server other than DS
- Replies: 3
- Views: 1218
- Wed Apr 20, 2005 3:27 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Better for Surrogate Key Generation? Dstage or DB procedure
- Replies: 1
- Views: 1188
- Wed Apr 20, 2005 3:24 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: dsjob error in UNIX
- Replies: 8
- Views: 3113
If the $DSHOME/bin directory is in your PATH you can execute the dsjob command from anywhere, or you can use the full pathname of dsjob, as illustrated below. But the environment variables set in the dsenv script do need to have been set first. . $DSHOME/dsenv $DSHOME/bin/dsjob -run -jobstat...
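The pattern in that excerpt (source dsenv, then invoke dsjob either from PATH or by full path) can be sketched as a small POSIX shell fragment. Only the fallback logic runs anywhere; the commented lines need an actual DataStage server, and the project and job names are hypothetical:

```shell
#!/bin/sh
# Prefer dsjob from PATH; otherwise fall back to the full path
# under $DSHOME/bin, as the excerpt suggests.
if command -v dsjob >/dev/null 2>&1; then
    DSJOB=dsjob
else
    DSJOB="$DSHOME/bin/dsjob"
fi
echo "using: $DSJOB"

# On a real server you would source the environment script first,
# then run the job ("myproj" and "myjob" are made-up names):
# . "$DSHOME/dsenv"
# "$DSJOB" -run -jobstatus myproj myjob
```

Sourcing dsenv with `.` (rather than executing it) is what makes its environment variables visible to dsjob in the current shell.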
- Wed Apr 20, 2005 4:09 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: RT_CONFIG118.
- Replies: 4
- Views: 885
I'll assert that you can't open that particular job from any client. The RT_CONFIG118 (hashed) file contains run-time configuration information about job number 118. It is written to by compiling the job. For some reason DataStage is unable to open that particular hashed file. It may be permissions,...
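The permissions theory in that reply can be checked with ordinary shell tests against the project directory. The project path below is a made-up example; substitute your own project location:

```shell
#!/bin/sh
# Check whether the RT_CONFIG118 hashed file for job 118 exists and
# is readable/writable by the current user -- a common cause of the
# "unable to open" symptom. PROJ is a hypothetical example path.
PROJ="${PROJ:-/opt/IBM/DataStage/Projects/myproj}"
F="$PROJ/RT_CONFIG118"
if [ ! -e "$F" ]; then
    echo "missing: $F"
elif [ -r "$F" ] && [ -w "$F" ]; then
    echo "permissions look OK: $F"
else
    echo "exists but not read/writable: $F"
fi
```

If the file exists but fails the read/write test, compare its owner and mode (`ls -ld "$F"`) against the account the job runs under.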
- Wed Apr 20, 2005 4:02 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: DataStage jobs cannot be run/executed
- Replies: 6
- Views: 1843
Read two documents, both of which ship with the software. One is the README. The other is the Installation and Upgrade Guide, which you may have to search for on the client CD, but it's not in any compressed file. Between them these contain absolutely everything you need to know, including the U...
- Tue Apr 19, 2005 9:12 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Error calling subroutine
- Replies: 4
- Views: 1212
It's probably not necessary to reindex all tables, but it's quicker than wading through the repository database for the project trying to figure out which of them needs reindexing. If you'd told us which branch of the repository you had selected, and from which DataStage client, the answer probably ...
- Tue Apr 19, 2005 6:21 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: VOC
- Replies: 11
- Views: 3501
Some history
VMARK began in 1984, so this is the earliest possible date for UniVerse. UniVerse was written to mimic Prime INFORMATION, but to run on UNIX (whereas Prime INFORMATION ran only on the proprietary operating system PRIMOS). Prime INFORMATION resulted from work done by DEVCOM in the 1970s to provide ...
- Tue Apr 19, 2005 6:11 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Stored Proc error
- Replies: 1
- Views: 575
- Tue Apr 19, 2005 6:08 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: version control moved different job
- Replies: 8
- Views: 1278
Limit
While there is no documented limit on the number of versions, there may be physical factors that intervene. At one of my clients' sites they ran into the "too many subdirectories in directory" limit on Solaris, caused by the huge number of dynamic hashed files in the Version project. The cl...