Search found 53125 matches

by ray.wurlod
Mon May 01, 2006 2:18 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_IMPORT_PATTERN_USES_FILESET
Replies: 6
Views: 5804

I've seen this error in the "101" class, where someone put the name of a Data Set control file (blah.ds rather than blah.fs, when there really was a Data Set controlled by blah.ds) into a File Set stage.
by ray.wurlod
Mon May 01, 2006 2:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem in Job Report XML File Creation
Replies: 4
Views: 996

Is the second node being reported separately somewhere else in the XML document? If not, I suspect you've uncovered a bug. What does your support provider suggest?
by ray.wurlod
Mon May 01, 2006 2:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: delete object
Replies: 3
Views: 1290

The problem with the second approach is knowing exactly what to delete. If there's an entry in DS_JOBS there may be entries in lots of other tables as well.
by ray.wurlod
Mon May 01, 2006 2:12 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Running PX on windows
Replies: 9
Views: 3126

You do need a C++ compiler. Typically the one chosen is in Visual Studio .NET. The only fully parallel-capable version of DataStage for Windows (currently) is 7.5x2; you must have this version. To "connect" the compiler to DataStage you specify the compiler and linker information via the environment...
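The environment variables involved are typically APT_COMPILER, APT_COMPILEOPT, APT_LINKER and APT_LINKOPT, set at project level in the Administrator. A sketch of plausible values for the Visual Studio .NET compiler follows; the exact options depend on your installation, so treat these as illustrative, not definitive:

```
APT_COMPILER=cl
APT_COMPILEOPT=/W3 /c /O2 /nologo
APT_LINKER=link
APT_LINKOPT=/DLL /nologo
```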
by ray.wurlod
Mon May 01, 2006 12:46 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Possible to split a flat file into single file/row?
Replies: 7
Views: 1903

Main job reads the flat file, one row at a time, and passes that row through a Transformer stage. Output can also go to a text file. Within the Transformer stage of the main job you build the required arguments for, and invoke, UtilityRunJob. Pass it the name of the other (subordinate) job, the one ...
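As a sketch, the derivation in the main job's Transformer might look like the following. UtilityRunJob takes the subordinate job's name, a pipe-delimited parameter list, a row limit and a warning limit (0 meaning no limit); the job name and parameter names here are illustrative, not from the original post:

```
* Output column derivation in the Transformer (sketch; job name
* "SubJob" and parameters ID/Element are illustrative).
UtilityRunJob("SubJob", "ID=" : InLink.ID : "|Element=" : InLink.Element, 0, 0)
```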
by ray.wurlod
Mon May 01, 2006 12:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: delete object
Replies: 3
Views: 1290

Get to a command prompt, such as the Administrator client Command window, and issue the command DS.CHECKER; this checks for incompletely deleted jobs and deletes orphaned components. This ought to be enough to cure your current dilemma.
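From the engine shell on a Unix install the session might look like this (project name is illustrative; the Administrator client Command window works the same way once you are attached to the project):

```
$ cd `cat /.dshome`
$ . ./dsenv
$ bin/dssh
>LOGTO MyProject
>DS.CHECKER
```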
by ray.wurlod
Sun Apr 30, 2006 11:07 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Possible to split a flat file into single file/row?
Replies: 7
Views: 1903

A job sequence is nothing more than a server job that contains only a job control routine. The answer, therefore, is yes.
by ray.wurlod
Sun Apr 30, 2006 2:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reading comp-3 field from AS400 with Ascential driver
Replies: 17
Views: 5310

That's a heck of a large number!!! How much of the field is actually used? Have you checked out the data type conversion routines in the SDK?
by ray.wurlod
Sat Apr 29, 2006 2:39 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: job Completed with out creating all hash files
Replies: 6
Views: 1468

I wondered if they were all the same hashed file, even though four separate stages. The fourth create would possibly wipe out any rows thus far sent (if delete before create has been checked). Automatic creation of hashed files is a fairly recent phenomenon (7.5?). The OP only specified 7.x - I'm no...
by ray.wurlod
Sat Apr 29, 2006 2:34 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_BUFFERING_POLICY
Replies: 4
Views: 2017

Thanks, Roy. However APT_BUFFERING_POLICY is already in DSParams, with legal list values and a legal default value. This is what's so confusing!
by ray.wurlod
Sat Apr 29, 2006 1:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Auditing or statistic table
Replies: 2
Views: 1106

Process metadata, which is what you describe, is automatically captured into the job's log (from which it can be retrieved using the DSGetLog... functions in the DataStage API - that is, in a routine). Process metadata may also automatically be captured by MetaStage, provided that the job is running...
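For example, an after-job subroutine could pull the summary with DSGetLogSummary; a minimal sketch, assuming you then parse and store each entry yourself (the date range and limit are illustrative):

```
* After-job subroutine sketch. DSJ.ME refers to the current job;
* DSJ.LOGANY selects every type of log entry.
Summary = DSGetLogSummary(DSJ.ME, DSJ.LOGANY, "2006-05-01 00:00:00", "2006-05-01 23:59:59", 100)
* Each field of Summary holds one entry (id, type, timestamp,
* message); loop through the fields and write them to your audit table.
```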
by ray.wurlod
Fri Apr 28, 2006 5:34 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Possible to split a flat file into single file/row?
Replies: 7
Views: 1903

Create one job that reads that flat file.
In a Transformer stage call the UtilityRunJob routine to run the other job, passing ID and element (and anything else required) as job parameters.
by ray.wurlod
Fri Apr 28, 2006 4:17 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Routine Variable
Replies: 5
Views: 1488

There are five system variables called @USER0 through @USER4 and another called @USER.RETURN.CODE. Each is initialized to zero automatically when your job starts. You can use any of these, thereby avoiding the need to use COMMON (and leaving open the possibility, therefore, of later migration into a...
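A minimal sketch of a routine that keeps a running count between invocations this way (the counting logic is illustrative):

```
* @USER1 is zero when the job starts, so no first-call test is needed.
@USER1 = @USER1 + 1
Ans = @USER1
```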
by ray.wurlod
Fri Apr 28, 2006 4:15 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: job Completed with out creating all hash files
Replies: 6
Views: 1468

Welcome aboard.

What messages were logged? In particular, were there any warnings? Did you check the "create file" check box in the Hashed File stage for each link?
by ray.wurlod
Fri Apr 28, 2006 4:10 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Substring problem
Replies: 12
Views: 3095

Look in the DataStage BASIC manual for two subroutines that work with pathnames. They are called, if my memory serves, !GET.PATHNAME and !MAKE.PATHNAME.

By using these to deconstruct and construct pathnames you will have a completely portable and idiot-proof mechanism.
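A sketch of the calls; the argument order shown is from memory, so verify it against the BASIC manual before relying on it:

```
* Split a full pathname into its directory and file name parts...
CALL !GET.PATHNAME(FullPath, DirPath, FileName, Status)
* ...then rebuild a pathname for a different file in the same directory.
CALL !MAKE.PATHNAME(DirPath, NewName, NewPath, Status)
```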