Search found 53125 matches
- Wed Jan 28, 2004 3:31 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Where are the jobs, routines, table definitions etc.!
- Replies: 7
- Views: 2444
Ascential regard the structure of the repository tables as intellectual property and have not published them. The commands that you can enter at the Command window in the Administrator client are, for the most part, UniVerse commands. Manuals for these can be downloaded from IBM's web site. However,...
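For instance, the ordinary UniVerse verbs work at that Command window. The repository file names below (DS_JOBS is the usual example) are well known from forum posts rather than published documentation, so treat the exact layout and output as version-dependent:

```
COUNT DS_JOBS
LIST DS_JOBS
```

COUNT reports how many records (jobs) the file holds; LIST shows their record keys, which for DS_JOBS are the job names.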
- Wed Jan 28, 2004 3:25 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: How to run multiple instance using DSRunJob
- Replies: 2
- Views: 2510
- Wed Jan 28, 2004 3:23 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Aggregating using Cosort Plug In
- Replies: 1
- Views: 948
- Wed Jan 28, 2004 3:21 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Distributed Hashed Files
- Replies: 6
- Views: 1544
A major disadvantage of distributed files, from DataStage's point of view, is that they do not benefit from memory caching, which means that your I/O will potentially be 1000 times slower or worse. (Of course, unless you have a very large system, you're not going to cache 3.6GB hashed files anyway.) A...
- Wed Jan 28, 2004 3:14 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Find Last Day of a Month
- Replies: 3
- Views: 1579
Re: Find Last Day of a Month
Can someone tell me how you can find the last day of a month? I have been trying to use the ConvertMonth function for testing purposes by using ConvertMonth("2004/12","L") but I keep getting an error message indicating invalid Month.TAG. Can you explain what the Month.TAG is? I don't think I have t...
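Outside DataStage BASIC, the same "last day of the month" result is easy to sketch in Python using the standard-library calendar module (the function name and date formats here are my own choices, not anything from DataStage):

```python
import calendar
from datetime import date

def last_day_of_month(year: int, month: int) -> date:
    """Return the last calendar day of the given month."""
    # monthrange returns (weekday_of_first_day, days_in_month)
    days_in_month = calendar.monthrange(year, month)[1]
    return date(year, month, days_in_month)

print(last_day_of_month(2004, 12))  # -> 2004-12-31
print(last_day_of_month(2004, 2))   # -> 2004-02-29 (leap year)
```

The leap-year arithmetic is handled by monthrange, so February comes out correctly in both leap and non-leap years.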
- Wed Jan 28, 2004 3:11 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Loading a set of more than 100 Pipe delimited text files
- Replies: 6
- Views: 4172
Because the record layout is different, you're going to need 100 separate DataStage jobs. Your only other possibility is to create one generic job that reads one "column" per row and uses the world's most horrible parsing to handle the 100 different record layouts. Using the KISS principle, do the 100 ...
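The "one generic job" alternative can be sketched like this in Python: read each whole row as a single raw "column", then split it according to a per-file layout table. The file names and column lists are hypothetical stand-ins for the 100 real layouts:

```python
# Hypothetical per-file layouts: file name -> ordered column names.
LAYOUTS = {
    "customers.txt": ["id", "name", "region"],
    "orders.txt":    ["id", "customer_id", "amount", "placed_on"],
}

def parse_file(name, raw_lines):
    """Read whole rows as one raw 'column', then split per the file's layout."""
    columns = LAYOUTS[name]
    for line in raw_lines:
        fields = line.rstrip("\n").split("|")
        if len(fields) != len(columns):
            raise ValueError(f"{name}: expected {len(columns)} fields, got {len(fields)}")
        yield dict(zip(columns, fields))

rows = list(parse_file("customers.txt", ["1|Ann|EU\n", "2|Bo|US\n"]))
print(rows[0])  # {'id': '1', 'name': 'Ann', 'region': 'EU'}
```

Even in this toy form you can see why the poster recommends KISS: the layout table and its validation grow with every new file, whereas 100 simple jobs each stay trivial.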
- Wed Jan 28, 2004 12:05 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Any other option to minimise look ups
- Replies: 4
- Views: 1117
Pre-load the lookup tables to memory. You can't do this with ODBC; you will need to use hashed files. Pre-allocate disk for the hashed files (use the MINIMUM.MODULUS option; to get a value, use the Hashed File Calculator on your DataStage CD). Loading the hashed files won't take long (minutes). Plus you ...
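The Hashed File Calculator derives a MINIMUM.MODULUS value from row counts and sizes. A rough back-of-envelope version looks like this; note that the 2048-byte group size and 80% load factor are common rules of thumb, not published constants, so this is an estimate only:

```python
import math

def minimum_modulus(row_count: int, avg_row_bytes: int,
                    group_bytes: int = 2048, load_factor: float = 0.8) -> int:
    """Rough pre-allocation estimate: how many groups are needed to hold
    the data at the target load factor. Rule-of-thumb only."""
    data_bytes = row_count * avg_row_bytes
    return max(1, math.ceil(data_bytes / (group_bytes * load_factor)))

# One million rows averaging 120 bytes each:
print(minimum_modulus(1_000_000, 120))  # -> 73243
```

Pre-allocating to something near this figure avoids the file repeatedly splitting groups while it loads, which is where most of the load time would otherwise go.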
- Tue Jan 27, 2004 10:52 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: OpenSeq and ReadSeq not working in routine
- Replies: 11
- Views: 6359
However, why not do this with a DataStage job? Read the 13 columns and output one, containing line feeds and repeats of columns 1 and 2. This technique has been discussed in the past; do a search on the Forum. It's exceedingly fast. He's doing the reverse: 1 row becomes 11. So, a splitter-style tra...
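The splitter-style transform being described can be sketched in a few lines of Python: one wide row carrying two key columns plus N value columns becomes N narrow rows, each repeating the keys. The column meanings here are assumed for illustration:

```python
def split_row(row):
    """Splitter-style transform: one wide row (2 keys + N values)
    becomes N narrow rows, each repeating the keys."""
    key1, key2, *values = row
    return [(key1, key2, v) for v in values]

wide = ("ORD-1", "2004-01-28", "a", "b", "c")
for narrow in split_row(wide):
    print(narrow)  # one output row per value, keys repeated
```

In a Server job the same effect is usually achieved in a Transformer by writing the keys plus a line feed between each value into a single output column, exactly as the post suggests.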
- Tue Jan 27, 2004 10:47 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: OpenSeq and ReadSeq not working in routine
- Replies: 11
- Views: 6359
Also, isn't there an issue with using GOSUB syntax within a Routine? I was under the impression that was a no-no, as you tended to exit the routine when the first 'return' was encountered. Or that the presence of multiple 'return' statements irritated the compiler... It is a known glitch in the ...
- Tue Jan 27, 2004 10:41 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Housekeeping hash files created in Account
- Replies: 8
- Views: 2683
The list you get via "import hashed file definitions" intentionally filters out those hashed files used for DataStage repository, so you're safe to proceed on that basis. All of the repository hashed files have names beginning with "DS_" or "RT_", or are "VOC" or "VOCLIB". If you make sure that your...
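That naming rule is simple enough to express as a filter. A minimal Python sketch, using only the prefixes and names given in the post (the candidate file names are hypothetical):

```python
REPOSITORY_PREFIXES = ("DS_", "RT_")
REPOSITORY_NAMES = {"VOC", "VOCLIB"}

def is_repository_file(name: str) -> bool:
    """True for hashed files that belong to the DataStage repository."""
    return name in REPOSITORY_NAMES or name.startswith(REPOSITORY_PREFIXES)

candidates = ["DS_JOBS", "RT_LOG123", "VOC", "MyLookup", "CUST_DIM"]
print([n for n in candidates if not is_repository_file(n)])
# -> ['MyLookup', 'CUST_DIM']  (safe housekeeping candidates)
```

Anything the filter keeps is a user-created hashed file and fair game for housekeeping; anything it rejects belongs to the repository and must be left alone.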
- Tue Jan 27, 2004 10:36 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: OpenSeq and ReadSeq not working in routine
- Replies: 11
- Views: 6359
You've written an infinite loop containing ReadSeq statements. That is, there is no exit from this loop. Try
Loop
While ReadSeq rfileline from rfilevar
   * statements
Repeat
However, why not do this with a DataStage job? Read the 13 columns and output one, containing line feeds and repeats of columns ...
- Tue Jan 27, 2004 10:29 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: bulk loading fixed width seq file into oracle table
- Replies: 2
- Views: 951
Generating the DAT file with a Sequential File stage will be much faster. Since you're not automatically invoking the bulk loader (sqlldr), you can have your CTL file pre-written - maybe adapt the one that DataStage has produced for you. Or you can have a job design that uses an ORABULK stage - to w...
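A pre-written control file for a fixed-width DAT file might look like the sketch below. The table and column names, file name, and field positions are hypothetical; the syntax (POSITION ranges, INTEGER EXTERNAL, CHAR) is standard SQL*Loader, but check the one DataStage generated for you for the exact shape your data needs:

```
LOAD DATA
INFILE 'extract.dat'
APPEND
INTO TABLE target_table
( cust_id    POSITION(1:8)    INTEGER EXTERNAL,
  cust_name  POSITION(9:38)   CHAR,
  region     POSITION(39:40)  CHAR
)
```

With the CTL file fixed in advance, the job only has to regenerate the DAT file each run and invoke sqlldr against it.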
- Tue Jan 27, 2004 10:22 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Peoplesoft agreement with Ascential
- Replies: 10
- Views: 2424
What is the relationship between DataStage and IBM? IBM gave Ascential a lot of money for their database. Ascential went, "Woohoo! We love you! Use our DataStage toys!" IBM gets richer off Ascential's toys. -T.J. P.S. I'm sure I missed a few details here and there. Informix purchased Ardent for US$800M. IBM ...
- Tue Jan 27, 2004 10:18 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Basic Transformer in PX job?
- Replies: 3
- Views: 1517
There is a small performance overhead in using any additional software, so it follows that there will be a small overhead in loading the BASIC engine. That's where the main overhead will be - actually loading the engine. There's a small extra overhead in process management, but PX is doing lots of th...
- Tue Jan 27, 2004 10:12 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Automated Restart?
- Replies: 5
- Views: 1599
A job is not permitted to attach itself (see help on DSAttachJob) and therefore cannot re-start itself. To effect a conditional re-start, the best solution (imho, of course) is to have detection of the file handled by a job control routine (or job sequence); it then triggers the actual job, which del...
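The controller pattern being described - a separate routine that watches for the trigger file, runs the real job, and re-runs it on failure - can be sketched language-neutrally in Python. The detect and run_job callables are stand-ins for the real file polling and DSRunJob-style calls, which live in the job control routine, not in the job itself:

```python
import time

def run_with_restart(detect, run_job, max_attempts=3, poll_seconds=0.0):
    """Job-control sketch: wait until detect() sees the trigger file,
    then run the job; on failure, re-run up to max_attempts times."""
    while not detect():
        time.sleep(poll_seconds)          # poll for the trigger file
    for attempt in range(1, max_attempts + 1):
        if run_job():                     # True = job finished OK
            return attempt                # report which attempt succeeded
    raise RuntimeError(f"job still failing after {max_attempts} attempts")

# Simulated run: file is present, job fails once then succeeds.
outcomes = iter([False, True])
print(run_with_restart(lambda: True, lambda: next(outcomes)))  # -> 2
```

Keeping the restart logic in the controller is what makes it work: the controlled job stays a plain, re-runnable job, and the controller decides whether and when to run it again.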