EBCDIC TO EBCDIC

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

irajasekharhexa
Premium Member
Posts: 82
Joined: Fri Jun 03, 2005 5:23 am
Location: Bangalore
Contact:

EBCDIC TO EBCDIC

Post by irajasekharhexa »

As part of extract and load we have two types of loading:

1. Source is EBCDIC and target is also EBCDIC.
2. Source is ASCII and target is ASCII.

Can anybody who has worked on COBOL files, and loading onto COBOL files, throw some light on extracting and loading data to and from an EBCDIC file?

I heard that if the COBOL file is flattened and used as a source, that makes it easy. Is any software/utility available to flatten complex flat files?


Regds
Rajasekhar
Aruna Gutti
Premium Member
Posts: 145
Joined: Fri Sep 21, 2007 9:35 am
Location: Boston

Post by Aruna Gutti »

Hi Rajasekhar,

You can flatten the COBOL FD using the CFF stage. Have you tried that, and are you having issues with it?

Loading from EBCDIC to EBCDIC and ASCII to ASCII should be the options you select within the CFF stage.

Regards,

Aruna.
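Outside DataStage, the byte-level conversion behind those CFF options can be illustrated with Python's built-in EBCDIC codecs. This is only a sketch: cp037 is the US EBCDIC code page, and the actual code page on your mainframe may differ (e.g. cp500).

```python
# Sketch: EBCDIC <-> ASCII conversion using Python's built-in codecs.
# cp037 is US EBCDIC; your mainframe may use a different code page.

text = "HELLO, WORLD"

# Text -> EBCDIC bytes (what a "write as EBCDIC" option produces)
ebcdic_bytes = text.encode("cp037")

# EBCDIC bytes -> text (what a "read as EBCDIC" option does)
decoded = ebcdic_bytes.decode("cp037")

print(ebcdic_bytes.hex())
print(decoded)   # round-trips back to the original text

# An EBCDIC-to-EBCDIC load needs no conversion at all: bytes pass through.
assert decoded == text
```

The same round trip is why EBCDIC-to-EBCDIC and ASCII-to-ASCII jobs are the simple cases: no character-set translation happens at all.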
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

If the record schema is simple, the Sequential File stage also has the ability to "read as EBCDIC" or "write as EBCDIC".
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
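For a simple fixed-width schema, "read as EBCDIC" in the Sequential File stage amounts to slicing fixed-length records and decoding each character field. A minimal sketch, with a hypothetical two-field layout (the field names and widths are invented for illustration):

```python
# Sketch: reading fixed-length EBCDIC records, as the Sequential File
# stage does for a simple schema. Hypothetical record layout:
#   CUST-NAME  PIC X(10)
#   CITY       PIC X(8)
RECORD_LEN = 18

def parse_record(raw: bytes) -> dict:
    """Decode one 18-byte EBCDIC record into its two character fields."""
    return {
        "cust_name": raw[0:10].decode("cp037").rstrip(),
        "city": raw[10:18].decode("cp037").rstrip(),
    }

# Build a sample record in EBCDIC, then parse it back.
sample = "SMITH".ljust(10).encode("cp037") + "BOSTON".ljust(8).encode("cp037")
print(parse_record(sample))   # {'cust_name': 'SMITH', 'city': 'BOSTON'}
```

Anything beyond this (OCCURS, REDEFINES, COMP-3 fields) is where the CFF stage earns its keep.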
irajasekharhexa
Premium Member
Posts: 82
Joined: Fri Jun 03, 2005 5:23 am
Location: Bangalore
Contact:

Post by irajasekharhexa »

ray.wurlod wrote: If the record schema is simple, the Sequential File stage also has the ability to "read as EBCDIC" or "write as EBCDIC".
Ray & Aruna,
Thanks for the inputs. As of now we are about to move to the Build phase.
As soon as I do some experiments with COBOL files, I'll probably come up with more info on this in another couple of days.
Rajasekhar
irajasekharhexa
Premium Member
Posts: 82
Joined: Fri Jun 03, 2005 5:23 am
Location: Bangalore
Contact:

Re: Thanks

Post by irajasekharhexa »

irajasekharhexa wrote:
ray.wurlod wrote:If the record schema is simple, the Sequential File stage also has the ability to "read as EBCDIC" or "write as EBCDIC".
Ray & Aruna,
Thanks for the inputs. As of now we are about to move to the Build phase.
As soon as I do some experiments with COBOL files, I'll probably come up with more info on this in another couple of days.
I imported the COBOL file definitions, but now I need to create test data for further processing, and I am not sure how to do this.
Can anyone shed some light, please?
Rajasekhar
Aruna Gutti
Premium Member
Posts: 145
Joined: Fri Sep 21, 2007 9:35 am
Location: Boston

Post by Aruna Gutti »

Hi Rajasekhar,

The test data should come from the source.

For example:

If your source COBOL flat file comes from the mainframe, you need to get the file from the mainframe as the input source via the FTP plug-in or the CFF stage.

This is the best way of testing, rather than creating a file manually, because it avoids any incompatibility issues.


Good Luck,

Aruna.
irajasekharhexa
Premium Member
Posts: 82
Joined: Fri Jun 03, 2005 5:23 am
Location: Bangalore
Contact:

Post by irajasekharhexa »

Aruna Gutti wrote: Hi Rajasekhar,

The test data should come from the source.

For example:

If your source COBOL flat file comes from the mainframe, you need to get the file from the mainframe as the input source via the FTP plug-in or the CFF stage.

This is the best way of testing, rather than creating a file manually, because it avoids any incompatibility issues.


Good Luck,

Aruna.


Friends,

If we need to create test data for COBOL files or PL/1 files manually, how do we do that?

Some of the files we imported in PL/1 (as we were supposed to get the copybooks, and the client has given them in PL/1 format) via the Sequential stage. But again, test data creation is the issue. Can we import the file in PL/1 and use the mainframe data for testing?

As there is some delay in getting the test data, we should create a minimum quantity of test data to go ahead with the build.

Can I expect someone to give a sample COBOL file that has some test data?

Your help will be highly appreciated.
Rajasekhar
Aruna Gutti
Premium Member
Posts: 145
Joined: Fri Sep 21, 2007 9:35 am
Location: Boston

Post by Aruna Gutti »

What kind of data is your file expected to contain? If it is not packed decimal or signed numeric, you can just create a text file and use it as input. Make sure the test file columns map correctly to your copybook.

Otherwise, if you have access to the mainframe, you can create data using tools like File-AID.
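The caution about packed decimal is the key point: a COMP-3 field stores two digits per byte plus a sign nibble, so it cannot be typed into a text file by hand. If a small amount of test data must be hand-built anyway, the packed field can be generated programmatically. A sketch in Python (the copybook layout here is invented for illustration):

```python
# Sketch: hand-building a COBOL test record that contains a packed-decimal
# (COMP-3) field, which cannot be created in a plain text editor.
# Hypothetical copybook:
#   CUST-NAME  PIC X(10)
#   AMOUNT     PIC S9(5) COMP-3   (3 bytes: 5 digits + sign nibble)

def pack_comp3(value: int, digits: int) -> bytes:
    """Pack an integer as COMP-3: one digit per nibble, sign nibble last
    (0xC for positive, 0xD for negative)."""
    sign = 0xD if value < 0 else 0xC
    text = str(abs(value)).rjust(digits, "0")
    nibbles = [int(ch) for ch in text] + [sign]
    if len(nibbles) % 2:               # pad to a whole number of bytes
        nibbles.insert(0, 0)
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

# One 13-byte record: EBCDIC name + packed amount.
record = "SMITH".ljust(10).encode("cp037") + pack_comp3(12345, 5)
print(record.hex())   # last three bytes are 12 34 5c

with open("testdata.dat", "wb") as f:  # fixed-length records, no newlines
    f.write(record)
```

Whether the sign nibble and code page match what your job expects still has to be verified against the real copybook, which is why pulling even a few real records from the mainframe remains the safer option.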