Search found 4992 matches
- Thu Dec 15, 2005 9:25 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Path of job log files
- Replies: 7
- Views: 7020
Logs are hashed files within the project directory. You really should stop this train of thought, because the log information is not stored in a relational manner. The log for a job depends on its job number, which changes whenever you copy or import a job. Please accept th...
- Thu Dec 15, 2005 9:17 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: DataStage 5.2 with Oracle 9i
- Replies: 2
- Views: 779
- Wed Dec 14, 2005 10:00 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Call DSExecute
- Replies: 3
- Views: 1853
- Wed Dec 14, 2005 9:33 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error Handling of ExecSH and DSexecute
- Replies: 3
- Views: 1163
- Wed Dec 14, 2005 9:02 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: problem with lookup
- Replies: 24
- Views: 5393
Suggestion #1: turn your job around and use the reference as the driver table. Rather than a 1-to-many processing design, use many-to-1 processing. If your reference table sets the maximum number of rows you are allowed, then it should be the driver. You won't be doing a multi-row lookup any...
- Wed Dec 14, 2005 8:56 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Call DSExecute
- Replies: 3
- Views: 1853
The problem is a common one. The issue is that your command is being stripped of the double quotes. The DSExecute API has issues when you try to do what you're doing. The recommendation is to use a shell script that does all of the steps that you need, with the variable information passed as argumen...
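A minimal sketch of that workaround, with everything here (the wrapper function name, the file, and the grep command) purely illustrative rather than from the original post: the quoting lives inside the shell script, so the job only has to pass plain arguments.

```shell
#!/bin/sh
# Hypothetical wrapper illustrating the advice above: keep all quoting
# inside the script so DSExecute only has to pass simple arguments.
# $1 = file to search, $2 = pattern (may contain spaces)
run_extract() {
    # The double quotes live here, safely inside the script,
    # not in the DSExecute command string where they get stripped.
    grep "$2" "$1"
}

# Demo: a pattern containing a space survives intact.
printf 'hello world\nother line\n' > /tmp/dsx_demo.txt
run_extract /tmp/dsx_demo.txt "hello world"
```

The job then calls the script by name with its variable values as positional arguments; any delicate quoting stays inside the script where nothing strips it.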
- Wed Dec 14, 2005 5:59 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error viewing log, Computed blink does not match expected
- Replies: 17
- Views: 11499
Your log is corrupted, maybe because it exceeded 2.2 GB. You can try to recover the contents, but the easiest thing to do is clear it out completely. From DS Admin, connect to that project and issue "CLEAR.FILE RT_LOG1056". You'll lose the purge settings for the job, but at least the log (and job...
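A sketch of that recovery path, assuming the standard DS_JOBS repository table is available (the job name is hypothetical; RT_LOG1056 is the example from the post). From the Administrator client's Command window, attached to the project, the job's number can be looked up first so the right RT_LOG file is cleared:

```
LIST DS_JOBS JOBNO WITH NAME = "YourJobName"
CLEAR.FILE RT_LOG1056
```

Clearing the file wipes the log entries along with the job's auto-purge settings, as noted above, but leaves the job design itself intact.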
- Wed Dec 14, 2005 5:18 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Updating Oracle_OCI_9
- Replies: 2
- Views: 723
Only put into the hashed file the rows necessary for processing on a given run. If your hashed file comes from a small table, just copy the table to the hashed file. But if the table is huge, use meaningful information (probably from the source data being processed) to inner-join reduce the amou...
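The inner-join reduction described above could look something like the following, where every table and column name is hypothetical, introduced purely for illustration:

```sql
-- Illustrative only: populate the hashed file with just the reference
-- rows that can match the current run's source keys, not the whole table.
SELECT c.CUST_ID, c.CUST_NAME, c.CUST_STATUS
FROM   CUSTOMERS c
       INNER JOIN TODAYS_BATCH b
               ON b.CUST_ID = c.CUST_ID
```

The result of a query like this feeds the hashed file stage, so the lookup file stays proportional to the run's source volume instead of the full reference table.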
- Wed Dec 14, 2005 5:16 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Deleting hashfile problem
- Replies: 6
- Views: 1910
- Wed Dec 14, 2005 4:54 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: problem with lookup
- Replies: 24
- Views: 5393
"My constraint has columns from primary as well as from the reference" If you can put a condition in the constraint that uses columns from the primary and reference links, then pass the primary link values as "keys" to the reference lookup and use the cursor variable number in the query as part of t...
- Wed Dec 14, 2005 1:37 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: error
- Replies: 2
- Views: 1096
- Wed Dec 14, 2005 1:33 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: problem with lookup
- Replies: 24
- Views: 5393
- Wed Dec 14, 2005 10:12 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: job being monitored error
- Replies: 11
- Views: 2766
Someone somewhere has a Director Monitor open on that job. Start by exiting all Director sessions on your PC and try again. If that doesn't work, start asking everyone to get out and try again. If every single person is out, then recycle the DS services, as something is obviously wrong internally ...
- Wed Dec 14, 2005 10:09 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: File missing
- Replies: 2
- Views: 560
You're positive you didn't reuse the same file name on multiple links? Otherwise, verify the path was valid; if a directory in the path doesn't exist, the file can end up created with the directory as part of its name. On a side note, you should have multiple output links from the ODBC stage, so that ...
- Wed Dec 14, 2005 10:04 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Error with dynamic hashfiles
- Replies: 3
- Views: 866