Search found 53125 matches

by ray.wurlod
Mon Dec 04, 2006 5:22 pm
Forum: IBM® SOA Editions (Formerly RTI Services)
Topic: Webservices PACK and RTI differenced...
Replies: 10
Views: 5037

Need a memory upgrade. Hmm.
by ray.wurlod
Mon Dec 04, 2006 5:19 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Runaway osh processes
Replies: 9
Views: 2749

Maybe. Can you post the code generated by the Transformer stage? It's in a subdirectory called RT_SCnnn in your project, where nnn is the job number.

Code: Select all

SELECT JOBNO FROM DS_JOBS WHERE NAME = '<<Job name>>';
by ray.wurlod
Mon Dec 04, 2006 5:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Spreadsheet
Replies: 6
Views: 1368

ODBC is OK, but you have to remember that each Worksheet is visible to the ODBC driver for Excel as a system table. Worksheets must be in tabular form, with "column names" in row $1:$1. Of course, you will also need to find (and possibly license) an Excel ODBC driver for UNIX to be able to use this ...
by ray.wurlod
Mon Dec 04, 2006 5:11 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Installing parallel canvass in standalone system
Replies: 6
Views: 1357

This means that you are not running version 7.5x2, which is the ONLY Windows-based version (until 8.0 is available) on which parallel jobs can be run. You have only one solution available if you want to run parallel jobs (or compile parallel jobs that contain Transformer or Build stages), and that is...
by ray.wurlod
Mon Dec 04, 2006 5:09 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Sparce error
Replies: 3
Views: 1146

I wonder if it's an issue with the Data Direct ODBC licence? According to the Parallel Job Developer's Guide manual and IBM's training classes, sparse lookup is only supported for DB2/UDB and Oracle Enterprise stages. So you may have exhausted the grace period on the licence for your ODBC driver.
by ray.wurlod
Mon Dec 04, 2006 5:04 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sequencer : Job not visble when under 2 levels down....
Replies: 4
Views: 1539

Is your operating system Windows XP Pro with service pack 2 and, if so, do you have the DataStage patch for this combination installed?
by ray.wurlod
Mon Dec 04, 2006 5:02 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Socket closed error, when parallel jobs run in sequence
Replies: 4
Views: 3325

Was Sybase shut down or quiesced before or while this job was running?
by ray.wurlod
Mon Dec 04, 2006 4:59 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Mutex Timeouts
Replies: 5
Views: 1536

Maybe your computer is too fast! It could be exhausting the number of retries before the timeout kicks in. Search the Forum for information about the SPINTRIES and SPINSLEEP configuration variables that can be used to tune mutex lock behaviour. You might also increase the timeout on your IPC stages.
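
For reference, those tunables live in the engine's uvconfig file. A sketch of what the relevant lines look like — the values shown are illustrative only, not recommendations; retune for your own system and regenerate the configuration (uvregen) afterwards:

Code: Select all

SPINTRIES 50     * retries on a mutex lock before sleeping (illustrative value)
SPINSLEEP 100    * sleep interval between retry bursts (illustrative value)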
by ray.wurlod
Mon Dec 04, 2006 4:55 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: regarding query repository
Replies: 8
Views: 1822

Which command line?

What precisely is the error message?

Did you make any typing errors when entering the query? It may be that the query environment has not been advised that the backspace key is also the erase key.
by ray.wurlod
Mon Dec 04, 2006 4:53 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: regarding query repository
Replies: 7
Views: 1667

Some specific commands are blocked in the Administrator client. You can successfully execute them from a telnet session. I didn't think that LIST DICT was one of them, however. It may be that the particular table name is blocked - that would be easily gotten around by defining a synonym. You did no...
by ray.wurlod
Mon Dec 04, 2006 4:48 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Capturing non-matched records from reference file
Replies: 5
Views: 1072

It's impossible to capture the records that were not looked up when performing a lookup, unless you immediately delete the successes from the reference table (hashed file) and inspect the survivors afterwards. In SQL you might use a NOT IN comparison to a subquery that included all your matches. Doi...
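
A sketch of that NOT IN approach — table and column names here are hypothetical, substitute your own:

Code: Select all

SELECT *
FROM   reference_table
WHERE  key_col NOT IN (SELECT key_col FROM matched_rows);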
by ray.wurlod
Mon Dec 04, 2006 4:44 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: analyzing universe
Replies: 3
Views: 1103

It's not UniVerse, it's DataStage, as noted elsewhere. DS_JOBOBJECTS has three key columns: OBJTYPE, OBJIDNO and OBJNAME. OBJTYPE is a single letter, either "J" for job or "C" for container. OBJIDNO is the job number, which maps to DS_JOBS.JOBNO or DS_CONTAINERS.CONTAINERNO so that job/shared containe...
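
Putting that together with the DS_JOBS lookup shown earlier on this page, something like this (untested sketch) would list the objects belonging to one job:

Code: Select all

SELECT OBJTYPE, OBJNAME
FROM   DS_JOBOBJECTS
WHERE  OBJIDNO = (SELECT JOBNO FROM DS_JOBS WHERE NAME = '<<Job name>>');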
by ray.wurlod
Mon Dec 04, 2006 4:34 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: regarding universe utility
Replies: 3
Views: 907

DS_JOBOBJECTS is not a UniVerse table, it's a DataStage table. It has different record structures, some with up to 216 columns. The vendor has chosen not to reveal the structure of those records via column definitions. Therefore, without "hacking" (which violates the clause in your licence agreement...
by ray.wurlod
Mon Dec 04, 2006 4:29 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job Scheduling
Replies: 3
Views: 1200

Ask your UNIX administrator about at and cron. Jobs just for today are scheduled using at, while repeats are scheduled using cron. You may be permitted to use at but not to use cron. The UNIX administrator controls this through files called at.allow and cron.allow respectively.
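
For example (the script path here is hypothetical): a one-off run tonight, and the equivalent recurring crontab entry added via "crontab -e":

Code: Select all

echo "/home/dsadm/run_job.sh" | at 2315

15 2 * * * /home/dsadm/run_job.sh

The five crontab fields are minute, hour, day of month, month and day of week, so the second line runs the script at 02:15 every day.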
by ray.wurlod
Mon Dec 04, 2006 4:26 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Help with "timestamp" for OCI-9
Replies: 41
Views: 11900

Without prior deletion you can only ever INSERT a particular key once into an Oracle table that has a primary key defined. Perhaps you need to revisit the design of that table if there is to be more than one run with the same process date. Or maybe you're just not generating the seconds part of the...