Search found 4992 matches

by kcbland
Thu Dec 15, 2005 9:25 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Path of job log files
Replies: 7
Views: 7020

Logs are hashed files within the project directory. You really need to consider stopping this train of thought because the log information is not stored in a manner that is relational. The log for a job is dependent on its job number, which changes whenever you copy or import a job. Please accept th...
by kcbland
Thu Dec 15, 2005 9:17 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: DataStage 5.2 with Oracle 9i
Replies: 2
Views: 779

Your dsenv file contains paths to the Oracle home and shared libraries. You need to install the 32-bit Oracle client and update the dsenv file. Recycle the DS services and you should be fine.
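A minimal sketch of the kind of dsenv entries involved; the ORACLE_HOME path and library subdirectory here are assumptions, so substitute your own install location:

```shell
# Hypothetical dsenv additions for a 32-bit Oracle 9i client.
# The ORACLE_HOME path and lib subdirectory are assumptions -- adjust to your install.
ORACLE_HOME=/u01/app/oracle/product/9.2.0
export ORACLE_HOME
LD_LIBRARY_PATH=$ORACLE_HOME/lib:${LD_LIBRARY_PATH:-}
export LD_LIBRARY_PATH
PATH=$ORACLE_HOME/bin:$PATH
export PATH
```

After editing dsenv, recycle the DataStage services so server processes pick up the new environment.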
by kcbland
Wed Dec 14, 2005 10:00 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Call DSExecute
Replies: 3
Views: 1853

kcbland wrote: Worst case, write the arguments to a text file and use indirection to feed the arguments to the script.
by kcbland
Wed Dec 14, 2005 9:33 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error Handling of ExecSH and DSexecute
Replies: 3
Views: 1163

If used in a before/after job routine, you can set in the job properties that non-zero return codes are treated as fatal errors. Have your shell script return the appropriate value.
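A sketch of the kind of check such a script might perform; the file-existence test is a hypothetical example, and the point is the exit status the routine hands back:

```shell
# check_nonempty: exit-status helper a before/after-job script could use.
# Returns 0 when the file exists and is non-empty, non-zero otherwise;
# with the job property set, the non-zero status is treated as fatal.
check_nonempty() {
    [ -s "$1" ]
}

tmp=$(mktemp)
printf 'data\n' > "$tmp"
check_nonempty "$tmp"; ok=$?              # existing, non-empty file -> 0
check_nonempty /no/such/file; bad=$?      # missing file -> non-zero
rm -f "$tmp"
```

Exiting the script with that status (e.g. `exit 1` on failure) is what lets the job abort.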
by kcbland
Wed Dec 14, 2005 9:02 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: problem with lookup
Replies: 24
Views: 5393

Suggestion #1: turn your job around and use the reference as the driver table. Rather than using a 1-to-many processing design, use many-to-1 processing. If your reference table sets the maximum number of rows you are allowed, then it should be the driver. You won't be doing a multi-row lookup any...
by kcbland
Wed Dec 14, 2005 8:56 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Call DSExecute
Replies: 3
Views: 1853

The problem is a common one. The issue is that your command is being stripped of the double quotes. The DSExecute API has issues when you try to do what you're doing. The recommendation is to use a shell script that does all of the steps that you need, with the variable information passed as argumen...
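A sketch of that recommendation: keep the double quotes inside the script, so the command line passed through DSExecute carries only plain arguments. The script name, arguments, and sqlldr invocation below are hypothetical:

```shell
# load_wrap (hypothetical): the quoting lives inside the shell script,
# so the DSExecute command line itself needs no embedded double quotes.
load_wrap() {
    table=$1
    file=$2
    # The double quotes are applied here, where nothing strips them.
    printf 'sqlldr control="%s" data="%s"\n' "$table.ctl" "$file"
}

cmd=$(load_wrap CUSTOMERS /tmp/customers.dat)
```

From the job side you would then call something like `load_wrap.sh TABLENAME FILENAME` via DSExecute, with no quoting on the command line at all.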
by kcbland
Wed Dec 14, 2005 5:59 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error viewing log, Computed blink does not match expected
Replies: 17
Views: 11499

Your log is corrupted, maybe because it exceeded 2.2 GB. You can try to recover the contents, but the easiest thing to do is clear it out completely. From DS Admin, connect to that project and issue "CLEAR.FILE RT_LOG1056". You'll lose the purge settings for the job, but at least the log (and job...
by kcbland
Wed Dec 14, 2005 5:18 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Updating Oracle_OCI_9
Replies: 2
Views: 723

Only put into the hashed file the rows necessary for processing on a given run. If your hashed file is coming from a small table, just copy the table to the hashed file. But if the table is huge, use meaningful information (probably from the source data being processed) to inner-join reduce the amou...
by kcbland
Wed Dec 14, 2005 5:16 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Deleting hashfile problem
Replies: 6
Views: 1910

Look at the "rmdir /S /Q \fullyqualifiedhashedfile" command to remove a hashed file directory and its contents, then "erase \basehashedfiledirectory\D_hashedfilename" to get rid of the D_hashedfilename file.
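For reference, the UNIX equivalent of those Windows commands, using hypothetical paths; a hashed file is a directory plus a companion D_ pointer file alongside it:

```shell
# Stand-in hashed file layout (paths hypothetical, created here for illustration).
HF=/tmp/demo_project/MYHASH
mkdir -p "$HF"                              # hashed file directory
touch "$(dirname "$HF")/D_MYHASH"           # companion D_ pointer file

# Removal: directory and contents first, then the D_ pointer.
rm -rf "$HF"
rm -f "$(dirname "$HF")/D_MYHASH"
```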
by kcbland
Wed Dec 14, 2005 4:54 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: problem with lookup
Replies: 24
Views: 5393

My constraint has columns from primary as well as from the reference
If you can put a condition in the constraint that uses columns from the primary and reference links, then pass the primary link values as "keys" to the reference lookup and use the cursor variable number in the query as part of t...
by kcbland
Wed Dec 14, 2005 1:37 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: error
Replies: 2
Views: 1096

I'm sorry, I'm not tracking where this post originated. Is your original problem about saving a job? These error messages are self-explanatory: the first warns that the second could occur (and it did). What more do you need supplied?
by kcbland
Wed Dec 14, 2005 1:33 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: problem with lookup
Replies: 24
Views: 5393

Your query must be malformed if it's not returning 24 rows per lookup. Why isn't your constraint part of the WHERE clause?
by kcbland
Wed Dec 14, 2005 10:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: job being monitored eror
Replies: 11
Views: 2766

Someone somewhere has a Director Monitor open on that job. Start by exiting all Director sessions on your PC and try again. If that doesn't work, start asking everyone to get out and try again. If every single person is out, then recycle the DS services as something is obviously wrong internally ...
by kcbland
Wed Dec 14, 2005 10:09 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: File missing
Replies: 2
Views: 560

You're positive you didn't reuse the same file name on multiple links? Otherwise, verify the path was valid, as a non-existent directory can result in a file created with the directory as part of its name. On a side note, you should have multiple output links from the ODBC stage, so that ...
by kcbland
Wed Dec 14, 2005 10:04 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error with dynamic hashfiles
Replies: 3
Views: 866

Research the forum for T30FILES and MFILES discussions; your problem is that the system instantaneously ran out of capacity to open or work with more dynamic hashed files.
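A sketch of how you might inspect those tunables in uvconfig. The real file lives under $DSHOME; a demo copy is created here so the commands run anywhere, and the values shown are placeholders, not recommendations:

```shell
# Demo: look up the dynamic-file tunables in a uvconfig file.
# A throwaway copy is created so the snippet is self-contained;
# on a real server you would point at $DSHOME/uvconfig instead.
DEMO_DSHOME=/tmp/dshome_demo
mkdir -p "$DEMO_DSHOME"
printf 'T30FILE 200\nMFILES 50\n' > "$DEMO_DSHOME/uvconfig"   # placeholder values

tunables=$(grep -E '^(T30FILE|MFILES)' "$DEMO_DSHOME/uvconfig")
echo "$tunables"
```

After raising either tunable you would regenerate the configuration and recycle the DS services for the change to take effect.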