Thank you for the suggestion!
The first field (4 bytes, binary) holds the total length of the record.
Regarding binary FTP with EBCDIC: does it also work for numeric and COMP fields when parsing through the CFF stage?
Thank you!
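For reference, here is a minimal sketch of why binary transfer matters here: COMP fields are big-endian binary integers, so only the CHAR fields need an EBCDIC decode, and a text-mode FTP transfer would corrupt the binary bytes. The record layout below is hypothetical, and `cp037` is one common EBCDIC code page (yours may differ).

```python
import struct

# Hypothetical 12-byte record: a 4-byte binary length (COMP) followed
# by an 8-character EBCDIC CHAR field. Build it the way the mainframe
# would send it over a binary FTP transfer.
record = struct.pack(">i", 12) + "CUST0001".encode("cp037")

rec_len = struct.unpack(">i", record[:4])[0]   # COMP field: plain big-endian binary
cust_id = record[4:12].decode("cp037")         # CHAR field: needs the EBCDIC decode

print(rec_len, cust_id)  # 12 CUST0001
```

The CFF stage does the equivalent internally when the column is defined as COMP: it reads the bytes as binary rather than running them through the character-set conversion.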
Search found 59 matches
- Tue Jul 31, 2018 9:45 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: FTP Enterprise stage to read Mainframe variable blk dataset
- Replies: 4
- Views: 3312
- Tue Jul 31, 2018 12:24 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: FTP Enterprise stage to read Mainframe variable blk dataset
- Replies: 4
- Views: 3312
FTP Enterprise stage to read Mainframe variable blk dataset
Hi, I am trying to read data from a mainframe variable block (VB) dataset with a complex structure (multiple record types) using the FTP Enterprise stage. Mainframe dataset properties: RECFM = VB, LRECL = 35, DCB = none. Layout (FLD / LEN / FORMAT): REC-LEN 04 BINARY; SEG-TYPE 04 CHAR; ROOT-SEG-KEY 15 CHAR; LINE-NO 03 BINA...
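As a sketch of the parsing problem in this post: each record starts with a binary REC-LEN field, so a byte stream can be split into records by walking those lengths. This assumes REC-LEN includes its own 4 bytes (verify against your dataset); the record contents below are hypothetical.

```python
import struct

def split_records(data: bytes):
    """Split a byte stream into variable-length records, assuming each
    record begins with a 4-byte big-endian length that includes itself
    (mirroring the REC-LEN field in the layout above)."""
    records, pos = [], 0
    while pos < len(data):
        (rec_len,) = struct.unpack(">i", data[pos:pos + 4])
        records.append(data[pos + 4:pos + rec_len])
        pos += rec_len
    return records

# Two hypothetical records: a 4-byte SEG-TYPE tag plus a payload.
raw = (struct.pack(">i", 10) + b"ROOT" + b"AB"
       + struct.pack(">i", 12) + b"LINE" + b"1234")
print([r[:4] for r in split_records(raw)])  # [b'ROOT', b'LINE']
```

With multiple record types, the SEG-TYPE tag at the front of each record is what a CFF stage (or a downstream Transformer) would branch on.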
- Mon Aug 26, 2013 1:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Datastage processing to Drop both duplicates.
- Replies: 1
- Views: 1536
Datastage processing to Drop both duplicates.
I am looking for the simplest way to drop all duplicates from a sequential file. Input sequential file: sno,sname 1,A 2,B 3,C 1,D 5,X 2,E. Desired output: sno,sname 3,C 5,X. The requirement is to skip processing of the sno = 1 and 2 records because the key sno has duplicates in the input. Appreciate all the help ...
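The logic being asked for (drop *both* copies of a duplicate key, not keep one) boils down to counting occurrences per key and keeping only keys seen exactly once. In DataStage that would typically be an Aggregator counting per sno plus a filter; here is the same idea as a small Python sketch using the sample data from the post:

```python
from collections import Counter

rows = [("1", "A"), ("2", "B"), ("3", "C"), ("1", "D"), ("5", "X"), ("2", "E")]

# Count occurrences of the key (sno), then keep only rows whose key
# appears exactly once -- both copies of a duplicate are dropped.
counts = Counter(sno for sno, _ in rows)
unique_rows = [(sno, sname) for sno, sname in rows if counts[sno] == 1]

print(unique_rows)  # [('3', 'C'), ('5', 'X')]
```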
- Wed Aug 10, 2011 9:49 am
- Forum: General
- Topic: Notification activity -hyperlink
- Replies: 3
- Views: 2323
- Wed Aug 10, 2011 9:44 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata connector issue while loading the data
- Replies: 5
- Views: 5329
Re: Teradata connector issue while loading the data
Is your input 17 million records? If yes, it's not too big to cause any load issues. We ran into issues when loading billions of records in one stretch, so the approach we used was to split the data into smaller chunks (~800 million per execution) and loop through to process all the data. H...
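The chunking approach described above can be sketched as a simple range splitter; the row counts are illustrative, and in practice each (start, end) range would drive one execution of the load job:

```python
def load_in_chunks(total_rows: int, chunk_size: int):
    """Yield (start, end) row ranges so one huge load can be split
    into several smaller executions, as described above."""
    start = 0
    while start < total_rows:
        end = min(start + chunk_size, total_rows)
        yield (start, end)
        start = end

# e.g. 2.4 billion rows in ~800-million-row executions -> 3 chunks
chunks = list(load_in_chunks(2_400_000_000, 800_000_000))
print(len(chunks))  # 3
```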
- Fri Aug 05, 2011 9:53 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata connector issue while loading the data
- Replies: 5
- Views: 5329
Re: Teradata connector issue while loading the data
Manu, I need more details to be able to help you. Are you using any load utility (access method Bulk?)? Are you using parallel sync? If yes, what parameters are used? What config are you running this on (grid or cluster)? The solution will vary depending on the answers to the above questions. It may be...
- Fri Aug 05, 2011 9:44 am
- Forum: General
- Topic: Notification activity -hyperlink
- Replies: 3
- Views: 2323
Notification activity -hyperlink
Hi,
I am wondering if there is a way to send a hyperlink to a file in the email body using the Notification activity in a job sequence. It would be nice for the user to click the hyperlink in the email body to get the report file instead of copy-pasting the path into the browser/explorer address bar.
Thanks,
Chandra.
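Whether the Notification activity can emit HTML depends on your setup; if it cannot, one workaround is sending the mail from a script called by the sequence. Here is a minimal sketch of building such a message (the server name and file path are hypothetical):

```python
from email.message import EmailMessage

# Build an email whose HTML part contains a clickable link to the
# report file, with a plain-text fallback for clients without HTML.
msg = EmailMessage()
msg["Subject"] = "Daily report"
msg.set_content("Report: file://server/reports/daily.csv")  # plain-text fallback
msg.add_alternative(
    '<p>Report: <a href="file://server/reports/daily.csv">daily.csv</a></p>',
    subtype="html",
)

print(msg["Subject"])  # Daily report
```

Sending it would then be a standard `smtplib.SMTP(...).send_message(msg)` call; note that some mail clients block `file://` links by policy, so a link to an HTTP-served location may be more reliable.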
- Mon Mar 07, 2011 12:48 pm
- Forum: General
- Topic: Reset the loop when processing data
- Replies: 3
- Views: 2895
Re: Reset the loop when processing data
Thanks for the reply. But what if the condition is not satisfied? The loop has to keep running for all its iterations. I only want to reset the loop counter back to 1 (the start of the loop) if the data files exist... In other words, I want to capture how many times the loop ran with no files, and I do not need to incr...
- Mon Mar 07, 2011 8:41 am
- Forum: General
- Topic: Reset the loop when processing data
- Replies: 3
- Views: 2895
Reset the loop when processing data
Hi, I have a job sequence design that runs in a loop x times with two branches. Branch 1: if there are no files to process, it sleeps and keeps running in the loop. Branch 2: if there are files to process, it resets the loop limit so the counter restarts from 1 and runs for x times. Basically I want to run t...
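The two-branch loop described above can be sketched as follows; `files_ready` is a stand-in for the "files to process" check, and the reset simply restarts the counter so a fresh full cycle begins whenever files appear:

```python
def run_sequence(max_iterations: int, files_ready):
    """Sketch of the loop above: run up to max_iterations, restarting
    the counter from 1 whenever files appear, and count the iterations
    that found no files."""
    counter, empty_runs = 1, 0
    while counter <= max_iterations:
        if files_ready():
            counter = 1          # reset: start a fresh full cycle
            # ... process the files here ...
        else:
            empty_runs += 1      # track iterations with no files
        counter += 1
    return empty_runs

# Simulate: files appear once on the second check, then never again.
events = iter([False, True, False, False, False])
print(run_sequence(3, lambda: next(events, False)))  # 3
```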
- Mon Sep 20, 2010 1:21 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata Connector TPT- Sync Timeout errors
- Replies: 2
- Views: 5255
Thank you for your valuable inputs. My job is tuned to perform optimally. The sync-timeout issue happens more often when there is a huge amount of data to process, and almost every time partition 0 does not reach the appropriate status. I tried with a max sync timeout of 600. Is there any other so...
- Fri Sep 03, 2010 2:57 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata Connector TPT- Sync Timeout errors
- Replies: 2
- Views: 5255
Teradata Connector TPT- Sync Timeout errors
I am using the Teradata connector with Bulk/Update and Parallel Synchronization on an 8-node grid config. The job sometimes fails with the following error: "Sync timeout of 200 seconds expired while waiting for the other instances to reach state 9,000 (CC_TeraAdapter::waitForState, file CC_TeraAd...
- Thu Sep 02, 2010 12:21 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata Connector TPT Bulk performance
- Replies: 1
- Views: 3150
Teradata Connector TPT Bulk performance
I am working on tuning the performance of a job that processes 2-3 billion records. The job design is: read from a sequential file [using a file pattern, as there will be many files], transform the data using a Transformer, and load to a Teradata target table using the Teradata connector with Bulk/Update and Paralle...
- Sat Jul 17, 2010 1:36 pm
- Forum: General
- Topic: Datastage Multiple instance job to run 'n' times using parm
- Replies: 16
- Views: 26047
Datastage Multiple instance job to run 'n' times using parm
Hi, I have a DataStage multiple-instance job named 'JobA' with a parameter 'ParmA'. I want to run the job as many times as the value of ParmA. For example, if ParmA = 2, JobA has to run twice, with instance id 1 and instance id 2. If ParmA = 5, the job should run 5...
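One common approach is a driver script that invokes `dsjob -run` once per instance, appending the invocation id to the job name (`JobA.1`, `JobA.2`, ...). A sketch, with the project name hypothetical and the exact dsjob flags something to verify against your install:

```python
import subprocess

def run_instances(project: str, job: str, n: int, dry_run: bool = True):
    """Build (and optionally launch) one dsjob invocation per instance,
    using job.<i> as the invocation id for a multi-instance job."""
    cmds = []
    for i in range(1, n + 1):
        cmd = ["dsjob", "-run", project, f"{job}.{i}"]
        cmds.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # actually start the instance
    return cmds

print(run_instances("MyProject", "JobA", 2))
# [['dsjob', '-run', 'MyProject', 'JobA.1'], ['dsjob', '-run', 'MyProject', 'JobA.2']]
```

In a sequence this could also be done with a StartLoop/EndLoop pair from 1 to ParmA, passing the loop counter as the invocation id.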
- Wed Apr 07, 2010 12:50 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Teradata connector Vs Teradata enterprise stage
- Replies: 1
- Views: 3939
Teradata connector Vs Teradata enterprise stage
I have a job to read from Teradata and write into a dataset. This job performs much better if I use the Enterprise stage instead of the Connector stage. I am not using the bulk option. But the Enterprise stage gets 8 processes, whereas the Connector stage uses only one process. Both jobs (connector and ent...
- Mon Mar 22, 2010 12:22 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reading files using file pattern -Is there a max limit?
- Replies: 14
- Views: 13314