Search found 53125 matches
- Wed Aug 20, 2003 4:19 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Importing table definitions
- Replies: 4
- Views: 781
When you appreciate that ODBC is, more than anything, a set of standards, you begin to accept what it does. If you connect to Oracle via ODBC, then it is the ODBC driver that converts from the exposed data type (e.g. Decimal or TimeStamp, which ODBC-compliant drivers use) to the actual data type (e....
- Wed Aug 20, 2003 1:44 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Dict building
- Replies: 6
- Views: 1475
- Tue Aug 19, 2003 6:39 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: why use DataStage
- Replies: 18
- Views: 1300
I second what Kim and Ken had to say, but would like to address your original question (why use DataStage) from a slightly different perspective, that of maintenance into the future. Because DataStage automatically manages the metadata, and includes the ability to import "table definitions" from tex...
- Tue Aug 19, 2003 6:28 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Please Help: Running a job cyclically till success
- Replies: 2
- Views: 476
I cannot see in this code where the value of COL1 used in generating the SELECT statement comes from. This will cause an unassigned-variable condition, which will propagate. You also seem confused about capturing the output. When you run SQLFetch, the value (the count of rows) will be assigned to th...
- Tue Aug 19, 2003 6:08 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Dict building
- Replies: 6
- Views: 1475
Given that you did not specify a separator character, the default is a text mark (accessible through the system variable @TM). The expression to construct the key value is therefore: keyfield1 : @TM : keyfield2 : @TM : keyfield3 You also need three I-type expressions in the file dictionary to specify the...
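A minimal sketch of that key expression in use, in DataStage (UniVerse) BASIC. The file variable and field names here are assumptions for illustration, not names from the original post:

```
* Hypothetical sketch: build a multi-part record key whose parts are
* separated by text marks, then read that record from a hashed file
* (hashedfile is assumed to have been opened with OPEN earlier).
key = keyfield1 : @TM : keyfield2 : @TM : keyfield3
READ record FROM hashedfile, key THEN
   * Record found - process it here.
END ELSE
   * No record exists with this key.
END
```

The same `field : @TM : field` expression is what the I-type dictionary entries would decompose back into individual key parts.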
- Tue Aug 19, 2003 6:01 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Reading Job Parameters from a common parameter file
- Replies: 4
- Views: 1235
Yeah, you could, but why would you? The sequential I/O statements in DataStage BASIC are easy to understand, and sample code has been posted right on this site, so you don't have to go outside your DataStage environment. If you run a shell script you still have to parse its output, not to mention ca...
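Those sequential I/O statements can be sketched as follows; the file path, routine name, and name=value layout are assumptions for illustration:

```
* Hypothetical sketch: read name=value pairs from a parameter file
* using DataStage BASIC sequential I/O, with no external shell script.
OPENSEQ "/path/to/params.txt" TO paramfile ELSE
   Call DSLogFatal("Cannot open parameter file", "GetParams")
END
Loop
   READSEQ line FROM paramfile ELSE Exit   ;* Exit loop at end of file
   name = Field(line, "=", 1)
   value = Field(line, "=", 2)
   * ... match name against a job parameter and use value ...
Repeat
CLOSESEQ paramfile
```

Everything stays inside the DataStage environment, so there is no shell output to capture and parse.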
- Tue Aug 19, 2003 5:50 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Performance Issue
- Replies: 5
- Views: 782
You can isolate the cause to some extent by making a copy of the job that writes to a sequential file rather than to Oracle. Any difference in throughput is down either to Oracle or the interface to it (stage type and/or client software). Also be aware of anything that may be happening in the Oracle...
- Tue Aug 19, 2003 4:03 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Promotion of Jobs into test and production
- Replies: 8
- Views: 1945
After you deploy a job into test or production, you can change the default values for parameters in Director. This is the approach I would recommend, irrespective of what stage type you are using. Make the DSN, user, password, and so on job parameters and set suitable - perhaps different - default v...
- Tue Aug 19, 2003 3:58 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Sizing of Hash File
- Replies: 4
- Views: 552
Note that Distributed files cannot take advantage of the hashed file cache. A Distributed file (a collection of hashed files whose metadata is identically defined) cannot use the cache because of the overhead of calculating the partitioning algorithm for each access. But don't le...
- Mon Aug 18, 2003 8:49 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Buffering values in Stage Variables
- Replies: 12
- Views: 2653
- Mon Aug 18, 2003 8:46 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Sizing of Hash File
- Replies: 4
- Views: 552
Hashed files can be created with 64-bit addressing, which gives a theoretical upper limit of 9 million TB for the maximum size of a hashed file. Provided your operating system supports 64-bit addressing you can make use of this feature. The Hashed File Calculator (in the unsupported utilities on you...
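As a sketch, converting an existing hashed file to 64-bit addressing can be done from the UniVerse command shell (TCL); the file name here is hypothetical, and the 64BIT keyword is an assumption that should be checked against your release's RESIZE documentation:

```
RESIZE MyHashedFile * * * 64BIT
```

The three asterisks keep the file's existing type, modulus, and separation, changing only the addressing.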
- Sun Aug 17, 2003 9:51 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Error when reading a file
- Replies: 2
- Views: 1086
That the error is being generated from the dsipcopen() function suggests either that you've got an IPC stage in a server job (and it's that stage that's having the problem), or that you're using, and something has gone awry with, row propagation. If you let us know which it is, maybe we can diagnose...
- Sat Aug 16, 2003 5:24 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Web Version for DataStage
- Replies: 16
- Views: 1352
Despite the fact that you're planning to lodge your work on SourceForge, I doubt that you'll get much help without some hope of financial reward for the helper(s). You should also check out Ascential's web site, to learn about Web Services (in release 7, due any day now). I have not been able to re...
- Sat Aug 16, 2003 5:16 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: sequential reading from hash file
- Replies: 3
- Views: 765
A slightly technical addendum to Kim's post. Kim's example code uses what is called "the default Select List", or Select List number 0. I have found on occasion that this clashes with DataStage's own use of this Select List, particularly if running from the Debugger. Therefore I always use one of th...
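A minimal sketch of using a numbered select list (9 here, chosen arbitrarily) instead of default list 0, so the selection cannot clash with DataStage's own use of list 0. The file name is an assumption for illustration:

```
* Hypothetical sketch: select record IDs from a hashed file into
* numbered select list 9, then walk the list with READNEXT ... FROM 9.
OPEN "MyHashedFile" TO filevar ELSE STOP "Cannot open MyHashedFile"
SELECT filevar TO 9
Loop
   READNEXT id FROM 9 ELSE Exit   ;* Exit loop when list 9 is exhausted
   READ record FROM filevar, id THEN
      * ... process record ...
   END
Repeat
```

Both SELECT and READNEXT must name the same list number; omitting `FROM 9` on the READNEXT would silently read from default list 0 again.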
- Sat Aug 16, 2003 5:04 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Sensing for a particular field value in source
- Replies: 13
- Views: 1264
Roy, my reading of the original post is that he wanted some condition to apply before kicking off the ETL job. To be sure, the determination could be performed by an earlier DataStage job, all under the control of a job sequence or job control routine, but the technique of determining that the requis...