Search found 7201 matches

by admin
Wed Mar 27, 2002 9:59 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

Thanks Galemmo "Galemmo,Nicholas,GLEN DALE,IS" To: "datastage-users@oliver.com" Subject: RE: Tuning Hash File Creation Techniques 03/27/2002 01:52 PM Please respond to datastage-users Set your group size to 2 on the options screen. That gives a 4096 byte bucket. Divide 800 into 4096 (are you sure yo...
by admin
Wed Mar 27, 2002 9:57 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

On your DataStage CD is an unsupported utility to calculate the parameters of hashed files. You provide the average record size, estimated number of records, and expected key pattern and it generates an appropriate CREATE.FILE or mkdbfile statement. Ray Wurlod Trainer - Asia Pacific Region IBM Infor...
by admin
Wed Mar 27, 2002 9:57 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

Thank you. I have actually split the file into 4 files now. Thanks Gopalakrishnan Ganesan Rewards CDM Team 503-225-6023 "Galemmo,Nicholas,GLEN DALE,IS" To: "datastage-users@oliver.com" Subject: RE: Tuning Hash File Creation Techniques 03/27/2002 01:52 PM Please respond to datastage-users Set your group s...
by admin
Wed Mar 27, 2002 9:52 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

Set your group size to 2 on the options screen. That gives a 4096 byte bucket. Divide 800 into 4096 (are you sure you need to store all that in the hash?) giving you 5. Divide 5 into 20,000,000 giving you 4,000,000. Set your minimum modulus to 5,000,000 or so.... -----Original Message----- From: gxg...
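The arithmetic in the post above can be sketched as a small calculator. This is only a sketch of the sizing rule as described in the thread; the function and parameter names are illustrative, not part of DataStage itself:

```python
# Rough hashed-file sizing, following the arithmetic in the post above.
# DataStage exposes these values as "group size" and "minimum modulus"
# on the hashed-file options screen; everything else here is illustrative.

def suggest_min_modulus(avg_record_bytes, total_records, group_size=2, pad=1.25):
    """Estimate a minimum modulus (number of groups/buckets)."""
    bucket_bytes = 2048 * group_size                 # group size 1 = 2048, 2 = 4096
    records_per_bucket = bucket_bytes // avg_record_bytes
    buckets_needed = -(-total_records // records_per_bucket)  # ceiling division
    return int(buckets_needed * pad)                 # pad to leave headroom

# 800-byte records, 20 million rows, group size 2:
print(suggest_min_modulus(800, 20_000_000))          # → 5000000
```

With the padding factor this reproduces the "5,000,000 or so" figure from the post: 4096 / 800 gives 5 records per bucket, 20,000,000 / 5 gives 4,000,000 buckets, plus headroom.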
by admin
Wed Mar 27, 2002 9:41 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

Hi Galemmo, I am not able to understand what you mean. Could you please clarify? My average record size is 800 bytes, and the number of rows is 20 million. What should my parameters be? I do not understand this statement: "Take the average size of your hash data (the data is stored as a variable array) and div...
by admin
Wed Mar 27, 2002 9:36 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

Hi Rick, 1. There is only one key for the hash file. 2. The hash file has 15 columns (approx. 800 bytes maximum record length). Should I try splitting into multiple hash files? Are there parameters to tune? 3. The array size I use is 20000. Thanks "Rick R. Schirm" To: Subject: RE: Tuning Hash File C...
by admin
Wed Mar 27, 2002 9:33 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

When you select Create File, click on Options. Increase the minimum modulus to an estimate of the number of hash buckets you expect to create. Group size is the size of a bucket: 1 = 2048 bytes and 2 = 4096 bytes. Take the average size of your hash data (the data is stored as a variable array) and di...
by admin
Wed Mar 27, 2002 9:23 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

More information is needed to help with this. 1. What do the keys look like for this file? 2. What is the size of the records being inserted? Rick Schirm -----Original Message----- From: gxganes@regence.com [mailto:gxganes@regence.com] Sent: Wednesday, March 27, 2002 3:20 PM To: datastage-users@oliv...
by admin
Wed Mar 27, 2002 9:20 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Tuning Hash File Creation Techniques
Replies: 16
Views: 2643

Tuning Hash File Creation Techniques

Hi, I am creating a huge hash file with 20 million records. The hash file starts at 3000 rows a second, and as time progresses the performance comes down. Can anyone help me with tuning this? Thanks Gopal ...
by admin
Wed Mar 27, 2002 4:31 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Data Stage related issues
Replies: 2
Views: 498

The answer to your first question: read the binary file like any other file using the Sequential File stage, setting it as a fixed-length record with the correct record size. Then convert each record, or each field in each record, into ASCII format and use it for further processing. Palanis...
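The approach described (fixed-length binary records, converted to text before further processing) can be sketched outside DataStage. The file layout and field names below are hypothetical, purely to illustrate the technique:

```python
import struct

# Hypothetical fixed layout: each record is 24 bytes --
# a 10-byte ID, a 4-byte big-endian integer, and a 10-byte name field.
RECORD_SIZE = 24
LAYOUT = ">10si10s"  # struct format string; '>' = big-endian, no padding

def read_fixed_records(path):
    """Yield decoded records from a fixed-length binary file."""
    with open(path, "rb") as f:
        while chunk := f.read(RECORD_SIZE):
            if len(chunk) < RECORD_SIZE:
                break  # ignore a trailing partial record
            rec_id, qty, name = struct.unpack(LAYOUT, chunk)
            # Convert byte fields to text (ASCII) for downstream processing
            yield rec_id.decode("ascii").strip(), qty, name.decode("ascii").strip()
```

The key point matches the advice: the record size must be known and fixed so the reader can slice the stream correctly before any field-level conversion happens.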
by admin
Wed Mar 27, 2002 8:25 am
Forum: Archive of DataStage Users@Oliver.com
Topic: Data Stage related issues
Replies: 2
Views: 498

Data Stage related issues

Hi guys, I have two questions that I cannot answer. 1. How can DataStage read from a binary file? 2. I have two jobs that read from Oracle tables and write to hashed files, with direct mapping. They were working well, but after changing some parameters in the hashed files, they became faste...
by admin
Tue Mar 26, 2002 5:39 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Data not getting past Transformer stage
Replies: 5
Views: 1036

Nicholas, Giovanni

Thanks for the input. It turned out that it was the constraint. The columns in both the first and second constraints needed to match. The first constraint had an extra column:

> AND EW.curve_shift_value = Pano.CURVE_SHIFT_VALUE

I must be blind.

thanks

John B
by admin
Tue Mar 26, 2002 4:36 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Data not getting past Transformer stage
Replies: 5
Views: 1036

Well John, I'm confident that everything is perfect. To be sure of that, I suggest you add another output stage (sequential file) that accepts all the records that aren't inserted in the stage before (so the only definition must be Reject Row = Yes for this link). I'm quite confident that you will f...
by admin
Tue Mar 26, 2002 4:32 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Data not getting past Transformer stage
Replies: 5
Views: 1036

Do any of your data columns contain NULL? If any value is null, the entire result of the constraint condition is null and will not be output. It will not output to EITHER link. I suggest a reject link to capture those so you can look at the data. -----Original Message----- From: Bidondo, John [mailt...
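The NULL behaviour described above mirrors SQL-style three-valued logic: a comparison involving NULL is neither true nor false, so the row satisfies no constraint and reaches no output link. A small Python sketch, using `None` to stand in for NULL; the column and link names are illustrative only:

```python
# In DataStage constraints (as in SQL), comparing anything to NULL yields
# "unknown", not False -- so the row can fail EVERY output link's constraint.
# None stands in for NULL; row/field/link names here are illustrative.

def null_safe_eq(a, b):
    """Return True, False, or None (unknown), like a SQL '=' comparison."""
    if a is None or b is None:
        return None
    return a == b

def route(row):
    match = null_safe_eq(row.get("curve_shift_value"), row.get("CURVE_SHIFT_VALUE"))
    if match is True:
        return "link1"
    if match is False:
        return "link2"
    return "reject"  # unknown: the row goes to neither output link

print(route({"curve_shift_value": 1, "CURVE_SHIFT_VALUE": None}))  # → reject
```

This is why the reject link suggested above is useful: without it, rows with NULL in a constrained column silently disappear from both outputs.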
by admin
Tue Mar 26, 2002 4:23 pm
Forum: Archive of DataStage Users@Oliver.com
Topic: Data not getting past Transformer stage
Replies: 5
Views: 1036

Giovanni, The process inside the Transformer stage is this: 1. Get input data from the ODBC stage and Hash stage. 2. Compare data from the ODBC and Hash stages using constraints. 3. Send output data to output sequential file 1 or sequential file 2. I'm pretty confident that it's not the constraints; however, I've ...