Search found 4992 matches

by kcbland
Tue Mar 07, 2006 1:12 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: How To Insert No.of Processed rows into Monitor Table
Replies: 6
Views: 1183

You can't. The API is DSGetLinkInfo and it must be CALLed from a Sequencer, DS Routine, DS Function, or Batch job. There's nothing in the before/after SQL to do this. You can use the before/after job routine call to achieve this, but you'll need to write a custom DS Routine to perform your necessary...
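A minimal after-job subroutine along these lines might look like the sketch below. DSGetLinkInfo, DSGetJobInfo, and the DSJ.* constants are the documented DS BASIC calls; the stage, link, and routine names are illustrative only, and the monitor-table load itself is left as a comment:

```
$INCLUDE DSINCLUDE JOBCONTROL.H
Subroutine LogRowCount(InputArg, ErrorCode)
* Sketch of an after-job routine: fetch a link's row count and stash it
* for a later monitor-table load. "xfmLoad"/"lnkTarget" are example names.
   ErrorCode = 0
   JobName  = DSGetJobInfo(DSJ.ME, DSJ.JOBNAME)
   RowCount = DSGetLinkInfo(DSJ.ME, "xfmLoad", "lnkTarget", DSJ.LINKROWCOUNT)
   * Here you might append JobName/RowCount to a sequential or hashed
   * file that a separate job loads into the monitor table.
Return
```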
by kcbland
Mon Mar 06, 2006 4:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Nbr of DS projects
Replies: 11
Views: 4474

Projects, logs, status files, none of that matters. There's NO RAM or CPU difference between 1 project of 500 jobs and 10 projects of 50 jobs. Disk consumption, yes, because of the overhead of the base files that make up a project.
by kcbland
Mon Mar 06, 2006 4:37 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: FTP File
Replies: 1
Views: 628

Did you use the same userid to run the job as you did to log in to the server? Look in the DS job log for any messages to confirm the filename and destination directory.
by kcbland
Mon Mar 06, 2006 4:25 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Nbr of DS projects
Replies: 11
Views: 4474

The number of projects has NO bearing on cpu or memory usage. It is simply a "foldering" convention, like a database schema. In fact, a project is actually called an "account" in the DS Engine language, and an account is perfectly synonymous with a schema.
by kcbland
Mon Mar 06, 2006 4:22 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: multiple instances of job
Replies: 6
Views: 1238

sunshine wrote: so this is only for premium members.. :)

Nothing personal, just thought my second option is a good technique.
by kcbland
Mon Mar 06, 2006 2:57 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Equivalent Oracle TO_NUMBER function
Replies: 2
Views: 2096

Test and see. It's up to you to make sure the data is numeric if you're unsure of its quality. You'll want to verify the data type and precision prior to attempting to load it into the database. A library of data validation functions would sure be a good thing to have....hint hint... A good series o...
by kcbland
Mon Mar 06, 2006 2:52 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: multiple instances of job
Replies: 6
Views: 1238

Here are two methods I use: 1. Many individual like-structured files, one job to process them. You can simultaneously execute the same job using instances, giving each a different input file name parameter value. 2. One large set of source data: use multiple job instances to divide and conquer process the lar...
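The launcher for either option can be sketched in job-control BASIC using the documented DSAttachJob/DSSetParam/DSRunJob calls. The job and parameter names below are hypothetical, and the job must be compiled with "Allow Multiple Instance" enabled:

```
$INCLUDE DSINCLUDE JOBCONTROL.H
* Launch several instances of one multi-instance job, each against its
* own input file. "ProcessSlice" and "InputFile" are example names.
For i = 1 To 4
   hJob = DSAttachJob("ProcessSlice.Part":i, DSJ.ERRFATAL)
   ErrCode = DSSetParam(hJob, "InputFile", "source_part":i:".dat")
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
Next i
* Optionally hold each handle and DSWaitForJob on it before continuing.
```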
by kcbland
Mon Mar 06, 2006 9:56 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Maximum export size for a DSX
Replies: 29
Views: 6524

whole bunch of constraint errors

There are no job-routine-transform-table definition constraints in the Server product. We really need to know your error messages on import, but your ordering efforts are really wasted. I'll let the others chip in to back me up on this. You can import objects in any order, t...
by kcbland
Mon Mar 06, 2006 8:35 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Maximum export size for a DSX
Replies: 29
Views: 6524

Multiple imports simultaneously may again eat up the memory.

The problem is the "import selected". On large dsx files this is a fundamentally BAD METHOD. You basically wait while it "imports" the file into a selection list, then wait again while it rescans the dsx file and imports your chosen few. T...
by kcbland
Mon Mar 06, 2006 8:33 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Maximum export size for a DSX
Replies: 29
Views: 6524

There's no import order in a dsx file. Routines do not have to precede jobs. Trust me, I've used this product for 8 years. Steve and I wrote those perl scripts to facilitate interfacing DS with 3rd party version control tools. Since you're doing an import selected, you're manually choosing which job...
by kcbland
Mon Mar 06, 2006 8:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Maximum export size for a DSX
Replies: 29
Views: 6524

Instead of import selected, search the forum for "dsx cutter". Steve Boyce posted a perl script (parsedsx) for exploding a .dsx into individual files for each job and routine in a large dsx file. He also posted a concatenator (catdsx) script for combining all the files in a directory. My point is, explod...
by kcbland
Mon Mar 06, 2006 8:12 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Oracle OCI Load (Automatic Mode) Error
Replies: 31
Views: 9237

just wanted to highlight that the support to SQL Loader is to be discontinued

True, but most Oracle implementations worldwide are still on versions 8 and 9, so it's a worthwhile discussion. Simply avoiding the existing preferred method for high-performance loading because of a future change is a bad...
by kcbland
Fri Mar 03, 2006 7:52 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: What is the use of Field function
Replies: 3
Views: 1265

For other future requirements, your DS BASIC manual is available under your Start button in the Ascential folder. It's 1000 pages of documentation.
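For reference, Field() is the DS BASIC delimited-substring extractor documented in that manual: Field(string, delimiter, occurrence [, count]) returns the occurrence-th delimited field, or count consecutive fields. The data values below are illustrative:

```
* Extract delimited substrings with Field(). Example data only.
Addr = "Smith|123 Main St|Springfield|IL"
City = Field(Addr, "|", 3)        ;* "Springfield"
Both = Field(Addr, "|", 2, 2)     ;* "123 Main St|Springfield"
```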
by kcbland
Fri Mar 03, 2006 7:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Oracle Upgradation
Replies: 2
Views: 976

On DS 7.5 you must use the OCI stage (what was OCI9 in prior DS versions) with the 9i client software to point to 10g. I'm not aware of an OCI10 stage, so you have no changes to make other than keep Oracle 9i 32 bit client installed and the DS dsenv file pointing to it.
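The relevant dsenv entries look something like the fragment below. dsenv is the environment file sourced by the DS Engine at startup; the ORACLE_HOME path and library subdirectory are examples only and depend on your install:

```shell
# Illustrative dsenv entries keeping DataStage on the 32-bit 9i client.
# Adjust the path (and lib vs lib32) to match your Oracle install.
ORACLE_HOME=/u01/app/oracle/product/9.2.0; export ORACLE_HOME
LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH; export LD_LIBRARY_PATH
PATH=$ORACLE_HOME/bin:$PATH; export PATH
```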
by kcbland
Thu Mar 02, 2006 3:29 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Timestamp error
Replies: 3
Views: 854

When using the ORAOCI stages, the generated SQL automatically wraps TO_DATE around all DATE data type columns. Just make sure your date values are in the form your database NLS setting requires. In the USA, folks generally use YYYY-MM-DD (and if time is important, add the time).
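Getting a value into that form before the OCI stage can be done with the standard Iconv/Oconv "D" date conversions; the input value and format codes below are an illustration, not the only valid combination:

```
* Reformat an external MM/DD/YYYY string to YYYY-MM-DD before loading.
InDate  = "03/02/2006"
IntDate = Iconv(InDate, "D/MDY[2,2,4]")   ;* to internal day number
OutDate = Oconv(IntDate, "D-YMD[4,2,2]")  ;* "2006-03-02"
```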