Search found 143 matches
- Thu Apr 26, 2012 1:16 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex flat file with multiple records
- Replies: 10
- Views: 6881
- Thu Apr 26, 2012 1:12 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error in loading to table
- Replies: 9
- Views: 6250
- Thu Apr 26, 2012 11:52 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata Connector issue
- Replies: 15
- Views: 11907
We are also using Teradata 13.10. If I remember correctly, when we installed Teradata we copied the entries the Teradata install added to /etc/profile into dsenv, and commented those entries out in /etc/profile. The entries in red are the ones we commented out in /etc/profile and we...
- Thu Apr 26, 2012 11:35 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex flat file with multiple records
- Replies: 10
- Views: 6881
Ria, how are you trying to read the file in the parallel job? Is it through View Data? Did you check the record layout? It is very easy to work with the CFF stage in a server job, because a server job doesn't complain about data types. I also read somewhere that the CFF stage was initially a server stage and later b...
- Fri Dec 16, 2011 11:39 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: CFF Multiple Record Types
- Replies: 13
- Views: 10680
Hi Devo, no, it won't work. Reject mode only works for a CFF stage with a single record definition. For a multiple-record definition you get the following error: "Record format type=implicit: cannot save rejected records". In fact we opened a PMR with IBM, and an enhancement request was created for this is...
- Fri Dec 16, 2011 11:28 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata 6706 error
- Replies: 5
- Views: 13350
Hi Sravya, what version of Teradata are you using? We set up our Teradata Connector stage with the following options, and the special characters you showed in your post load through DataStage: Variant: 12; Transaction Mode: ANSI; Client Character Set: LATIN1_0A; Automap Characterset coding: Yes. We conf...
- Fri Dec 16, 2011 11:11 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex Flat file- Multiple Record Types
- Replies: 19
- Views: 23390
Your definition should match your input file. It is not about whether you can add filler to make the record lengths the same, but about what exactly is being sent to you from the Mainframe for each record type. You can ask the Mainframe group to send you a screenshot of each record type, or they can browse the ...
- Thu Dec 15, 2011 7:44 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex Flat file- Multiple Record Types
- Replies: 19
- Views: 23390
Record format type=implicit is generated when you define multiple records in the Records tab. Could you create the job below to debug your record definition? All CFF stages should have a single record definition. Some of my jobs in production have this design, and surprisingly it runs faster than...
- Tue Dec 13, 2011 8:41 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex Flat file- Multiple Record Types
- Replies: 19
- Views: 23390
These are my settings for a multiple-record CFF stage: record {record_format={type=implicit}, delim=none, quote=none, binary, ebcdic, native_endian, charset='ISO-8859-1', round=round_inf, nofix_zero}. I never tried a CFF extract job without a Transformer stage, as the data coming from the Mainframe usually ...
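The settings above describe an implicit (undelimited) binary format: fixed-length EBCDIC records with no delimiters or quoting. As an illustration only, not DataStage itself, here is a minimal Python sketch of reading such a stream; the record length and the cp037 EBCDIC code page are assumptions for the example (the post's DataStage charset is ISO-8859-1):

```python
import io

RECORD_LEN = 20  # hypothetical fixed record length in bytes


def read_records(stream, record_len=RECORD_LEN):
    """Yield decoded records from an implicit (undelimited) binary stream,
    one fixed-length chunk at a time."""
    while True:
        chunk = stream.read(record_len)
        if not chunk:
            break
        if len(chunk) < record_len:
            # With record_format=implicit, a partial trailing chunk means
            # the file length is not a multiple of the record length.
            raise ValueError("short record: bad record length or truncated file")
        # cp037 is a common US EBCDIC code page, used here purely for the demo.
        yield chunk.decode("cp037")


# Two 20-byte EBCDIC records back to back, no delimiter between them.
data = "HELLO".ljust(20).encode("cp037") + "WORLD".ljust(20).encode("cp037")
records = list(read_records(io.BytesIO(data)))
```

Defining the wrong record length here fails loudly, which mirrors why the poster stresses getting the exact layout from the Mainframe side.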
- Mon Dec 12, 2011 6:23 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex Flat file- Multiple Record Types
- Replies: 19
- Views: 23390
- Mon Dec 12, 2011 6:18 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex Flat file- Multiple Record Types
- Replies: 19
- Views: 23390
- Mon Dec 12, 2011 2:42 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex Flat file- Multiple Record Types
- Replies: 19
- Views: 23390
- Wed Nov 16, 2011 11:24 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Buffer size
- Replies: 2
- Views: 2979
This is from an IBM technote on the maximum buffer size; hope this helps in your case. Problem (Abstract): When using the Load operator in TPT, the maximum buffer size allowed is 64260 KB, as referenced in the Teradata Parallel Transporter Guide (page 137). This means that even if, for example, 1000000 KB is spec...
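In other words, the technote describes TPT silently capping the Load operator's buffer at the documented ceiling regardless of what is requested. A minimal sketch of that clamping behavior (the function name is illustrative, not a TPT API):

```python
TPT_MAX_BUFFER_KB = 64260  # ceiling documented in the Teradata Parallel Transporter Guide


def effective_buffer_kb(requested_kb: int) -> int:
    """Return the buffer size TPT would actually use:
    the requested size, capped at the documented maximum."""
    return min(requested_kb, TPT_MAX_BUFFER_KB)


# A request of 1000000 KB, as in the technote's example, is capped to 64260 KB.
capped = effective_buffer_kb(1000000)
```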
- Wed Nov 09, 2011 2:18 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Best way to Read N number of tables and load into 1 table
- Replies: 13
- Views: 5764
How about passing the table name as a parameter within a sequence job?
1. Job 1 -- read the control table and create a delimited sequential file listing all the tables to be loaded.
2. Begin the loop.
3. Parse the file created in step 1 for the (next) source table name with a User Variables activity stage.
4. Using the source t...
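The loop above can be sketched in plain Python to show the control flow; this is an analogy only, since in DataStage it would be built from StartLoop/EndLoop, User Variables, and Job Activity stages. The file layout and function names are assumptions for the example:

```python
import csv
import io


def load_all_tables(control_file, run_load_job):
    """Step 1 produced a delimited file of table names.
    Loop over it, invoking the load job once per table,
    with the table name passed as its parameter."""
    for row in csv.reader(control_file):
        for table_name in row:
            if table_name.strip():
                run_load_job(table_name.strip())


# Stand-in for the generic load job: just record which table it was called for.
loaded = []
control = io.StringIO("CUSTOMERS,ORDERS\nINVENTORY\n")
load_all_tables(control, loaded.append)
```

The key design point is the same as in the post: one generic load job, parameterized by table name, driven by an external control list rather than N hard-coded jobs.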
- Thu Oct 20, 2011 7:13 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Strange Input buffer overrun problem
- Replies: 4
- Views: 4889