Search found 15603 matches

by ArndW
Fri Jun 23, 2006 3:23 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Unable to execute dsjob -- module libvmdsapi.so missing
Replies: 5
Views: 1594

Normally the permissions are set correctly, but you do need to have the appropriate libraries in the search path. You need to execute the "dsenv" file to set the environment correctly and that should be enough.
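The key point is that dsenv must be *sourced* (run in the current shell with a leading dot), not executed as a child process, so that the library search path it sets is inherited by dsjob. A minimal sketch of the mechanism, using a stand-in dsenv file and a hypothetical library path (the real dsenv lives in `$DSHOME`):

```shell
# Simulate dsenv with a stand-in file; /opt/dsengine/lib is a hypothetical path.
mkdir -p /tmp/dsenv_demo && cd /tmp/dsenv_demo
cat > dsenv <<'EOF'
LD_LIBRARY_PATH=/opt/dsengine/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
EOF

# The leading dot runs dsenv in the *current* shell, not a subshell,
# so commands run afterwards (like dsjob) inherit the library path.
. ./dsenv
echo "$LD_LIBRARY_PATH" | grep -q '/opt/dsengine/lib' && echo "dsenv applied"
```

Running `sh dsenv` instead would set the variables in a throwaway subshell, and dsjob would still fail to find libvmdsapi.so.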
by ArndW
Fri Jun 23, 2006 3:21 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Extraction Job which one is faster?
Replies: 3
Views: 1357

William,

there is no single good answer to that. If your SQL logic includes a big reduction in the amount of data then often it is faster to do it in the database and save I/O. Sometimes the DB machine is more heavily loaded than the DS or ETL machine, so processing can be faster inside DataStage.
by ArndW
Fri Jun 23, 2006 2:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: writing log file after aborting job
Replies: 2
Views: 937

One of the options when writing to a sequential file is "Cleanup on Failure", and it defaults to true. If you change that value, the file won't be removed upon job failure.
by ArndW
Fri Jun 23, 2006 2:44 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Job Control fatal error (-14)
Replies: 25
Views: 13418

Klaus,

thanks for that information & greetings right back! I'll pass on the information here to get the customer to request that patch; they are suffering from that issue and their workarounds are not permanent solutions.

Thanks,
by ArndW
Fri Jun 23, 2006 1:38 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DB2/UDB Load Issues
Replies: 4
Views: 1857

Does that mean that the other methods do work and that load is the only one with this error?
by ArndW
Fri Jun 23, 2006 1:37 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Unable to execute dsjob -- module libvmdsapi.so missing
Replies: 5
Views: 1594

Are you sure that the user has Read access to that object? You can login to UNIX using that userid and attach to the $DSHOME directory and execute ". ./dsenv" {the initial . is important} and then issue the command "bin/dssh" to enter the TCL prompt. If the dsenv contains an error you will get the s...
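Before going as far as the dssh/TCL test, the read-access question itself can be checked from the shell. A small sketch with a stand-in file (the real target would be the libvmdsapi.so shipped under the DataStage engine directory):

```shell
# Stand-in for the shared library whose permissions are in question.
f=/tmp/libvmdsapi_demo.so
touch "$f"
chmod 600 "$f"          # owner read/write only; group and others get nothing

ls -l "$f"              # inspect the mode bits directly

# [ -r ] asks: can *this* user read the file? Run it as the dsjob user.
if [ -r "$f" ]; then echo "readable by this user"; else echo "not readable"; fi
```

Logged in as the failing userid, a "not readable" result here (against the real library) would confirm a permissions problem rather than an environment one.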
by ArndW
Fri Jun 23, 2006 1:34 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Modifying protected jobs
Replies: 4
Views: 776

I am assuming that this new project was not created or flagged by the administrator as "protected". If you loaded your jobs into the project using a .dsx export file then I would recommend going into that file with an editor and looking for lines that read Readonly "1" and changing them to...
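The excerpt is cut off before giving the replacement value; "0" as the writable setting is my assumption. A hedged sketch of the edit with sed, run against a two-line stand-in fragment rather than a real export (always work on a copy of your .dsx file):

```shell
# Stand-in .dsx fragment containing the flag described in the post.
printf 'BEGIN DSJOB\n   Readonly "1"\nEND DSJOB\n' > /tmp/demo_export.dsx

# Flip every Readonly "1" to Readonly "0" (assumed writable value),
# writing to a copy so the original export stays untouched.
sed 's/Readonly "1"/Readonly "0"/g' /tmp/demo_export.dsx > /tmp/demo_export_writable.dsx

grep 'Readonly' /tmp/demo_export_writable.dsx   # confirm the change took
```

Re-importing the edited copy would then load the jobs without the read-only flag set.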
by ArndW
Fri Jun 23, 2006 1:27 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Upgrade from DataStage 7.5 to DataStage 7.5.1.A
Replies: 8
Views: 3314

Copying the hashed files and other objects from one directory to another and/or from one machine to another is not always feasible. The most common problem that occurs is that any hashed file with a secondary index defined will need to be copied to exactly the same location since the indices are acc...
by ArndW
Fri Jun 23, 2006 1:17 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Unable to export DS Project
Replies: 6
Views: 1883

The fact that those two main files seem to be OK and not corrupted is a good thing! Go ahead and do the DS.TOOLS reindex when nobody else is in DataStage and chances are very good that your error will disappear.
by ArndW
Thu Jun 22, 2006 4:26 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Renaming Files in DS
Replies: 19
Views: 3451

Edward_m,

yes, if the startloop activity will accept @FM as a delimiter. I don't have access to a DataStage system at present so cannot check that.
by ArndW
Thu Jun 22, 2006 1:08 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Renaming Files in DS
Replies: 19
Views: 3451

It is taking the CONVERT function as a literal, and the "(" is triggering the error since that is not part of a valid path. Does the start loop activity accept @FM as a separator? If yes, then remove the CONVERT function. If no, then you need another stage where you do the CONVERT() function prior t...
by ArndW
Thu Jun 22, 2006 12:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ASCII format showing some low characters
Replies: 22
Views: 5709

If you are generating the file in an EE job then this is the best place to get rid of them. You can do a CONVERT(CHAR(000),' ',In.ColumnName) in an EE Transformer stage (the syntax is the same as that for a server job).
by ArndW
Thu Jun 22, 2006 12:11 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: ETL Structure
Replies: 12
Views: 3479

The logical model and physical table and column definitions are only one part of what you need in order to implement an ETL process. You also need to know your physical (and logical) source definitions for the data that will go into your target system {I'm avoiding using the term Warehouse or Mart h...
by ArndW
Thu Jun 22, 2006 11:33 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ASCII format showing some low characters
Replies: 22
Views: 5709

You stated that you are creating this file. How (a DataStage EE Job, a script)? You can use some of the Transform stage functions in either EE or Server to manipulate characters. I think some of us are confused because you have said that you create the file but you want to remove "low characters" wi...
by ArndW
Thu Jun 22, 2006 10:52 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Job Control fatal error (-14)
Replies: 25
Views: 13418

Hello again Klaus, are you certain about that patch? I am at a site where they had this problem and I was told here that there was no workaround for this issue; there were some configuration changes done to allow more processes in total but that the 60 second limit was hardcoded in several places an...