Search found 143 matches

by Aruna Gutti
Thu Apr 26, 2012 1:16 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex flat file with multiple records
Replies: 10
Views: 6866

I am not sure whether the Version 7 parallel CFF stage supports reading multiple record types; I have only worked with multiple record formats in Version 8. On our Version 7 installation we used server jobs for CFF multiple-record layouts.
by Aruna Gutti
Thu Apr 26, 2012 1:12 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error in loading to table
Replies: 9
Views: 6199

Hi Deepu, If the upsert is taking too long you can try this: if you have more inserts than updates, use the insert option to insert all rows and create a reject link to catch the rows that need updating (and vice versa). This design worked for me when I was using the ODBC Enterprise stage. Hope this helps. Aruna.
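A rough sketch of that design (stage and link names are just for illustration):

    source --> ODBC Enterprise (Insert) --reject link--> ODBC Enterprise (Update)

Rows that fail the insert (i.e. keys that already exist) flow down the reject link and are applied as updates.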
by Aruna Gutti
Thu Apr 26, 2012 11:52 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata Connector issue
Replies: 15
Views: 11807

We are also using Teradata 13.10. If I remember correctly, when we installed Teradata we copied the entries the Teradata install added to /etc/profile into dsenv, and commented those entries out in /etc/profile. The entries in red are the ones we commented out in /etc/profile and we...
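For illustration only, the dsenv entries would look something like the following; the actual variable names and paths are whatever your Teradata install wrote to /etc/profile, so treat these as placeholders:

    # Copied from /etc/profile (added by the Teradata client install)
    export COPLIB=/usr/lib                              # CLIv2 libraries (placeholder path)
    export COPERR=/usr/lib                              # CLIv2 error messages (placeholder path)
    export TWB_ROOT=/opt/teradata/client/13.10/tbuild   # TPT root (placeholder path)
    export PATH=$PATH:$TWB_ROOT/bin
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$TWB_ROOT/lib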
by Aruna Gutti
Thu Apr 26, 2012 11:35 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex flat file with multiple records
Replies: 10
Views: 6866

Ria, How are you trying to read the file in the parallel job? Is it through View Data? Did you check the record layout? It is very easy to work with the CFF stage in a server job, because a server job doesn't complain about data types. I also read somewhere that the CFF stage was initially a server stage and was later b...
by Aruna Gutti
Fri Dec 16, 2011 11:39 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: CFF Multiple Record Types
Replies: 13
Views: 10642

Hi Devo, No, it won't work. The reject mode only works for a CFF stage with a single record definition. With multiple record definitions you get the following error: "Record format type=implicit: cannot save rejected records". In fact we opened a PMR with IBM and an enhancement request was created for this issue...
by Aruna Gutti
Fri Dec 16, 2011 11:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata 6706 error
Replies: 5
Views: 13253

Hi Sravya, What version of Teradata are you using? We set up our Teradata Connector stage with the following options, and the special characters you showed in your post get loaded through DataStage:

Variant: 12
Transaction Mode: ANSI
Client Character Set: LATIN1_0A
Automap Characterset coding: Yes

We conf...
by Aruna Gutti
Fri Dec 16, 2011 11:11 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat file- Multiple Record Types
Replies: 19
Views: 23230

Your definition should match your input file. It is not about whether you can add filler to make the record lengths the same, but about what exactly is being sent to you from the mainframe for each type of record. You can request the mainframe group to send you a screenshot of each record type, or they can browse the ...
by Aruna Gutti
Thu Dec 15, 2011 7:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat file- Multiple Record Types
Replies: 19
Views: 23230

The record format type 'implicit' is generated when you define multiple records on the Records tab. Could you create the job below to debug your record definition? Each CFF stage should have a single record definition. Some of my jobs in production have this design, and surprisingly it runs faster than...
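(The job referred to is not shown in this search excerpt. Given "each CFF stage should have a single record definition", one plausible debug layout, with hypothetical record names, is one CFF stage per record type:)

    CFF stage (HEADER definition only) --> Peek
    CFF stage (DETAIL definition only) --> Peek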
by Aruna Gutti
Tue Dec 13, 2011 8:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat file- Multiple Record Types
Replies: 19
Views: 23230

These are my settings for a multiple-record CFF stage:

    record {record_format={type=implicit}, delim=none, quote=none, binary, ebcdic, native_endian, charset='ISO-8859-1', round=round_inf, nofix_zero}

I never tried a CFF extract job without a transformer stage, as the data coming from the mainframe usually ...
by Aruna Gutti
Mon Dec 12, 2011 6:23 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat file- Multiple Record Types
Replies: 19
Views: 23230

Also, could you please make sure the record definition is correct for DETAIL?
The error looks like a file definition issue to me. As you are dealing with a fixed-length CFF file, make sure your COBOL file definition shows the correct record length for each type of record.
by Aruna Gutti
Mon Dec 12, 2011 6:18 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat file- Multiple Record Types
Replies: 19
Views: 23230

Hi,

What is your job design? I usually have one input CFF stage with multiple output links, each going to a Transformer stage and then to an output file.
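For illustration (the record type names are hypothetical):

    CFF stage --(HEADER link)--> Transformer --> Sequential File
              --(DETAIL link)--> Transformer --> Sequential File
              --(TRAILER link)-> Transformer --> Sequential File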

Aruna.
by Aruna Gutti
Mon Dec 12, 2011 2:42 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex Flat file- Multiple Record Types
Replies: 19
Views: 23230

What is the error message you are getting when trying to run the job with multiple record types?

Also, I am not sure whether it makes a difference, but I usually enclose my Record ID value in double quotes instead of single quotes.
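For example (the field name is hypothetical): REC_TYPE = "D" rather than REC_TYPE = 'D'.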

Aruna.
by Aruna Gutti
Wed Nov 16, 2011 11:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Buffer size
Replies: 2
Views: 2957

This is from an IBM technote on the maximum buffer size; hope this helps in your case. Problem (Abstract): When using the Load operator in TPT, the maximum buffer size allowed is 64260 KB, as referenced in the Teradata Parallel Transporter Guide (on page 137). What this means is that even if, for example, 1000000 KB is spec...
by Aruna Gutti
Wed Nov 09, 2011 2:18 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Best way to Read N number of tables and load into 1 table
Replies: 13
Views: 5734

How about passing the table name as a parameter within a sequence job? Roughly (a sketch of the loop follows this list):

1. Job 1 -- read the control table and create a delimited sequential file listing all the tables to be loaded.
2. Begin the loop.
3. Parse the file created in step 1 for the (next) source table name with a User Variables activity stage.
4. Using the source t...
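(A sketch of the sequence, with hypothetical activity names:)

    Job_CreateTableList --> StartLoop --> UserVariables_GetTableName --> Job_LoadTable --> EndLoop
                                ^----------------------(loop back for next table)----------+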
by Aruna Gutti
Thu Oct 20, 2011 7:13 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Strange Input buffer overrun problem
Replies: 4
Views: 4855

This issue is resolved. We tracked it down to a parallel process that was copying the file while the job was running.