Search found 15603 matches
- Fri Jun 23, 2006 3:23 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Unable to execute dsjob -- module libvmdsapi.so missing
- Replies: 5
- Views: 1594
- Fri Jun 23, 2006 3:21 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Extraction Job which one is faster?
- Replies: 3
- Views: 1357
- Fri Jun 23, 2006 2:48 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: writing log file after aborting job
- Replies: 2
- Views: 937
- Fri Jun 23, 2006 2:44 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Job Control fatal error (-14)
- Replies: 25
- Views: 13418
- Fri Jun 23, 2006 1:38 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: DB2/UDB Load Issues
- Replies: 4
- Views: 1857
- Fri Jun 23, 2006 1:37 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Unable to execute dsjob -- module libvmdsapi.so missing
- Replies: 5
- Views: 1594
Are you sure that the user has Read access to that object? You can log in to UNIX as that userid, change to the $DSHOME directory, and execute ". ./dsenv" {the initial . is important}, then issue the command "bin/dssh" to enter the TCL prompt. If the dsenv contains an error you will get the s...
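The leading "." in ". ./dsenv" matters because it *sources* the environment file into the current shell instead of running it in a subshell, so the exported variables (library paths and so on) persist for the commands that follow. A minimal sketch of the distinction, using a throwaway stand-in file rather than the real dsenv:

```shell
# Hypothetical stand-in for dsenv: exports one variable, the way the
# real dsenv exports DSHOME, LD_LIBRARY_PATH, etc.
echo 'export DEMO_VAR=set_by_dsenv' > /tmp/dsenv_demo
chmod +x /tmp/dsenv_demo

# Executing it runs a subshell; the variable does NOT persist here.
/tmp/dsenv_demo

# Sourcing it (the leading dot) runs it in the current shell, so the
# variable is available afterwards.
. /tmp/dsenv_demo
echo "$DEMO_VAR"
```

Skipping the dot (or a broken dsenv) leaves the shared-library path unset, which is a common cause of missing-module errors like the libvmdsapi.so one in this topic.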
- Fri Jun 23, 2006 1:34 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Modifying protected jobs
- Replies: 4
- Views: 776
- Fri Jun 23, 2006 1:27 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Upgrade from DataStage 7.5 to DataStage 7.5.1.A
- Replies: 8
- Views: 3314
Copying the hashed files and other objects from one directory to another and/or from one machine to another is not always feasible. The most common problem is that any hashed file with a secondary index defined must be copied to exactly the same location, since the indices are acc...
- Fri Jun 23, 2006 1:17 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Unable to export DS Project
- Replies: 6
- Views: 1883
- Thu Jun 22, 2006 4:26 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Renaming Files in DS
- Replies: 19
- Views: 3451
- Thu Jun 22, 2006 1:08 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Renaming Files in DS
- Replies: 19
- Views: 3451
It is taking the CONVERT function as a literal, and the "(" is triggering the error since that is not part of a valid path. Does the start loop activity accept @FM as a separator? If yes, then remove the CONVERT function. If no, then you need another stage where you do the CONVERT() function prior t...
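The suggested workaround (do the CONVERT before the loop so the loop activity only sees an @FM-delimited list) can be sketched outside DataStage BASIC. This is a shell analogue, not DataStage syntax; it assumes @FM is the ASCII 254 field mark, with tr's octal escape \376 standing in for it:

```shell
# Analogue of CONVERT(",", @FM, list): swap one delimiter for another
# before the loop, so the loop itself only splits on @FM.
LIST=$(printf 'job1,job2,job3' | tr ',' '\376')

# Iterate over the @FM-delimited values, as a loop activity would.
echo "$LIST" | tr '\376' '\n' | while read item; do
  echo "processing $item"
done
```

The point of the design is the same as in the post: the delimiter conversion happens in one place, upstream, and the looping construct is handed a list in the one separator format it understands.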
- Thu Jun 22, 2006 12:13 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ASCII format showing some low characters
- Replies: 22
- Views: 5709
- Thu Jun 22, 2006 12:11 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: ETL Structure
- Replies: 12
- Views: 3479
The logical model and physical table and column definitions are only one part of what you need in order to implement an ETL process. You also need to know your physical (and logical) source definitions for the data that will go into your target system {I'm avoiding using the term Warehouse or Mart h...
- Thu Jun 22, 2006 11:33 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ASCII format showing some low characters
- Replies: 22
- Views: 5709
You stated that you are creating this file. How (a DataStage EE Job, a script)? You can use some of the Transform stage functions in either EE or Server to manipulate characters. I think some of us are confused because you have said that you create the file but you want to remove "low characters" wi...
- Thu Jun 22, 2006 10:52 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Job Control fatal error (-14)
- Replies: 25
- Views: 13418
Hello again Klaus, are you certain about that patch? I am at a site where they had this problem and I was told here that there was no workaround for this issue; there were some configuration changes done to allow more processes in total but that the 60 second limit was hardcoded in several places an...