"Failed to open RT_LOG6305 file." suggests that the log for job number 6305 has become corrupted. Try clearing the log before attempting to run the job again. To find out which job is number 6305, you can query the DS_JOBS table in the Repository.
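To map the job number back to a job name, a query along these lines can be run from the Administrator client's Command window (table and column names are from the DataStage Repository; verify against your release):

```
SELECT NAME FROM DS_JOBS WHERE JOBNO = '6305';
```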
Is this on a single (SMP) machine or multiple machines? The error has occurred on the first node in the configuration file, and is a failure to write to a pipe. This most usually occurs because the process reading from the pipe is not draining it as fast as the writing process is writing to it, leading to the pipe's buffer filling up so that the write eventually fails.
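The underlying mechanism can be sketched in plain Python (this is a generic OS-level illustration, not DataStage-specific): a pipe has a finite kernel buffer, and if nothing drains the read end, writes eventually cannot proceed. Using a non-blocking write end makes the failure show up as an error rather than a hang.

```python
import os

# Create a pipe and make the write end non-blocking, so a full buffer
# raises an error instead of blocking the writer.
r, w = os.pipe()
os.set_blocking(w, False)

written = 0
try:
    while True:
        # Keep writing while nothing reads from the other end.
        written += os.write(w, b"x" * 4096)
except BlockingIOError:
    # The pipe buffer is full because the reader is not draining it.
    pass

print(f"pipe buffer filled after {written} bytes")
os.close(r)
os.close(w)
```

In the real job the writer gets a broken-pipe or full-buffer condition for the same reason: the downstream process cannot keep up.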
Rubbish. You can perform a lookup using the text file as stream and the table (or a hashed file containing appropriate columns from it) to feed the reference input. You don't have to join.
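As an analogy in plain Python (not DataStage syntax, and the field names are made up for illustration): a lookup streams the primary input and probes a keyed reference, which is exactly the access pattern a hashed file provides, so no join operator is needed.

```python
# Reference data keyed the way a hashed file would be: one row per key.
reference = {
    "C001": {"name": "Acme", "region": "EMEA"},
    "C002": {"name": "Globex", "region": "APAC"},
}

# The text file is the stream input; each row is enriched via a lookup.
stream = [
    {"cust_id": "C001", "amount": 100},
    {"cust_id": "C003", "amount": 250},  # no match -> goes to rejects
]

enriched, rejects = [], []
for row in stream:
    ref = reference.get(row["cust_id"])  # one probe per stream row
    if ref is None:
        rejects.append(row)
    else:
        enriched.append({**row, **ref})

print(enriched)
print(rejects)
```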
Ask your support provider (who may in turn need to ask IBM support) to VLIST the DSD.OshRun program to find out what is happening at address 1ec8. This may help determine the likely cause and/or whether there's a bug.
Find the table definition for the Sequential File in Manager. Select it, then click Usage Analysis. Provided you have been consistent in managing metadata (only loading table definitions into jobs, never changing them within jobs), this will show all jobs that use that table definition. You can also use MetaStage to perform a similar, more thorough, impact analysis.
Appending a line to a text file is WAY faster than upserting a row into a database table, and WAY WAY faster if that table has constraints to be checked and/or indices to be maintained. Hence my suggestion to use files for staging. (You can also use these as data files for bulk loading.)
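A rough feel for the difference, sketched in Python with SQLite standing in for the database (absolute timings will vary by platform; the point is only that a flat-file append does far less work per row than an upsert that must check a primary-key constraint):

```python
import os
import sqlite3
import tempfile
import time

rows = [(f"id{i}", f"value{i}") for i in range(20000)]

# Staging approach: append one line per row to a flat file.
path = os.path.join(tempfile.mkdtemp(), "staging.txt")
t0 = time.perf_counter()
with open(path, "w") as f:
    for k, v in rows:
        f.write(f"{k},{v}\n")
file_secs = time.perf_counter() - t0

# Database approach: upsert into a table with a key to check.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id TEXT PRIMARY KEY, value TEXT NOT NULL)")
t0 = time.perf_counter()
for k, v in rows:
    conn.execute("INSERT OR REPLACE INTO t (id, value) VALUES (?, ?)", (k, v))
conn.commit()
db_secs = time.perf_counter() - t0

print(f"file append: {file_secs:.3f}s, upsert: {db_secs:.3f}s")
```

And, as noted, the flat file you produce this way can double as the data file for a bulk load.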
Welcome aboard. :D The default configuration file consists of two nodes, which you can call Node0 and Node1 (or anything else you like). The fastname is the name of the machine where DataStage is installed. Each node is in the default node pool only, and each has one (the same) disk resource and scratch disk resource.
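A minimal two-node configuration file along those lines might look like this (the machine name and resource paths are placeholders; substitute your own):

```
{
    node "Node0" {
        fastname "your_server"
        pools ""
        resource disk "/your/path/Datasets" {pools ""}
        resource scratchdisk "/your/path/Scratch" {pools ""}
    }
    node "Node1" {
        fastname "your_server"
        pools ""
        resource disk "/your/path/Datasets" {pools ""}
        resource scratchdisk "/your/path/Scratch" {pools ""}
    }
}
```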
Read Chapter 2 of the Parallel Job Developer's Guide (the very end of the chapter) for restrictions on using server job shared containers in parallel jobs.
You can also use the BASIC Transformer stage (effectively the server Transformer) in parallel jobs, except that you cannot use it to perform lookups.
Hashed file only does a whole-key match.
Given that the "lookup file" is a file, the only solution is a grep for each name on the input stream. If it were a database table, a "LIKE" lookup might be possible, but not via a Hashed File stage.
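The distinction between the two match styles, in plain Python (the names are invented for illustration): a whole-key match is a single keyed probe, which is all a hashed file can do, whereas a LIKE-style partial match has to scan every key, grep-fashion.

```python
# Keyed reference, analogous to a hashed file: key -> row.
names = {"JOHN SMITH": 1, "MARY JOHNSON": 2, "ANN LEE": 3}

# Whole-key match (what a hashed file gives you): one probe, exact key.
print("JOHN SMITH" in names)  # exact key: found
print("JOHN" in names)        # partial key: not found

# LIKE-style match ('%JOHN%'): must scan every key, grep-fashion.
matches = sorted(k for k in names if "JOHN" in k)
print(matches)
```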