
STRING to Packed hexadecimal

Posted: Fri Mar 09, 2018 3:45 pm
by jagan02
Hi DataStage Gurus,

I am trying to convert TEXT = '0123456' to the packed decimal (PD) hexadecimal value '01 23 45 6C', where C is the sign nibble.

The destination data type is PIC S9(6) COMP-3, to be sent over to the mainframe as a data set.

I tried the possible ways below but was not able to achieve it with either of them (I get altogether different results):
1) CFF in parallel job with Record options->Data format -> Binary.
2) ICONV function with both 'MP' and 'MY' in server job.
I am testing these results (CFF and ICONV) by transmitting the file to the mainframe and checking whether the conversion was successful.

Please advise me on how to achieve this conversion in DataStage.
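
In other words, the target is ordinary packed decimal: one digit per nibble, with the sign (C for positive, D for negative) in the final nibble. Purely as an illustration of the expected 4-byte result (the array name below is made up):

Code: Select all

/* digits 0,1,2,3,4,5,6 followed by the sign nibble C */
unsigned char expected[4] = { 0x01, 0x23, 0x45, 0x6C };  /* trailing C = positive sign */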

Posted: Mon Mar 12, 2018 11:01 am
by FranklinE
I don't have a text to packed decimal example, which surprised me a bit. I do have a PD to text example.

Be sure the out-link column is properly defined as signed packed decimal. In a transformer use StringToDecimal, and see what happens.

I would expect it to work. It works perfectly in reverse.

Posted: Tue Mar 13, 2018 8:09 am
by jagan02
Thanks for the response, FranklinE.
Yes, I tried everything you said - signed packed decimal, StringToDecimal, and a few others - but I am not getting the value I am expecting.

Input = 123456
Current Output = 01 7B C5 93
But expected Output is = 01 23 45 6C

I believe something is going on with decimal rounding; DataStage seems to treat each of these digits individually and tries to round them up.
Please suggest optimal values for the decimal environment variables below.
APT_DECIMAL_INTERM_PRECISION
APT_DECIMAL_INTERM_SCALE
APT_DECIMAL_INTERM_ROUNDMODE
DS_USECOLPREC_FOR_DECIMAL_KEY

Posted: Tue Mar 13, 2018 10:13 am
by FranklinE
There's a way to be sure by doing several steps instead of a direct transformation.

From input: Concatenate a leading sign to the Char "123456" into a Char stage variable.

StringToDecimal from stage variable to a signed stage variable defined as Decimal.

DecimalToDecimal from second sv to the COMP-3 column.

Check the endian settings in CFF. Sometimes it defaults to big-endian.

Finally: your current output indicates that the column in CFF is not actually defined as packed decimal, but as binary integer.

Keep at it, experiment.

Re: STRING to Packed hexadecimal

Posted: Tue Mar 13, 2018 10:16 am
by chulett
To Franklin's point:
jagan02 wrote: 1) CFF in parallel job with Record options -> Data format -> Binary.
Have you actually tried specifying it as Packed Decimal? If so, can you post those results, please?

Posted: Tue Mar 13, 2018 11:59 am
by jagan02
@FranklinE - no luck - getting the same results.

@FranklinE and chulett - thanks for the response.
In CFF there is no specific option to mark the column/record as 'Packed', other than defining the column itself as COMP-3.
Also, the text below from IBM's CFF documentation indicates that setting the record option Data format to 'Binary' will perform the packed decimal conversion.

I have 'Byte order' set to Native endian; the other two (big and little) don't help either.

Data format: Specifies the data representation format of a column. Select one of the following
options:
Binary: Field values are represented in binary format, and decimals are represented in packed decimal format. This option is the default.
Text: Fields are represented as text-based data, and decimals are represented in string format.

Posted: Tue Mar 13, 2018 10:29 pm
by ray.wurlod
If you use a BASIC Transformer the following expression would suffice.

Code: Select all

Fmt(InLink.ColName : (If InLink.ColName < 0 Then "D" Else "C"), "L## ## ## ##")

Posted: Wed Mar 14, 2018 9:58 am
by UCDI
Looks like a job for an external C or VB etc. routine. This is like 5 lines of C, and it's not terribly bad in VB (slightly more wordy, but only slightly). I dislike the VB transformer as it can damage some data (specifically, it mangles timestamps with microseconds) and I think it is not parallel-friendly, but I have used it and it's fast and efficient for some tasks. VB is OK at text processing, but it's not really a bits-and-bytes language (it can do it, but it's grumpy about it).
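
As a rough illustration of what those few lines of C might look like - just a sketch under my own assumptions (function name, fixed 4-byte PIC S9(6) output, positive sign), not anything taken from the actual job:

Code: Select all

#include <stdio.h>
#include <string.h>

/* Pack a string of decimal digits into COMP-3 (packed decimal):
   digits are right-aligned one per nibble, zero-padded on the left,
   with the sign nibble (0xC positive, 0xD negative) in the low
   nibble of the last byte. out must hold (digits/2)+1 bytes. */
static void pack_comp3(const char *digits, unsigned char *out,
                       int out_len, int negative)
{
    int pos = out_len * 2 - 2;                    /* last data nibble */
    memset(out, 0, out_len);
    out[out_len - 1] = negative ? 0x0D : 0x0C;    /* sign nibble      */

    for (int i = (int)strlen(digits) - 1; i >= 0 && pos >= 0; i--, pos--) {
        unsigned char d = (unsigned char)(digits[i] - '0');
        if (pos % 2 == 0)
            out[pos / 2] |= (unsigned char)(d << 4);   /* high nibble */
        else
            out[pos / 2] |= d;                         /* low nibble  */
    }
}

int main(void)
{
    unsigned char buf[4];                 /* PIC S9(6) COMP-3 -> 4 bytes */
    pack_comp3("123456", buf, 4, 0);
    for (int i = 0; i < 4; i++)
        printf("%02X ", buf[i]);          /* prints: 01 23 45 6C */
    printf("\n");
    return 0;
}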

I think you could do it any of many ways in DataStage, but they all seem a little clunky... integer of (left of right of left of 2 digits) would work, and then shift and shuffle the bytes as needed; e.g. a stage variable for each byte explicitly pulled out, then rebuild them byte-wise in the assignment of the transform, for one approach.

If your CPU is in the Intel family, it can endian-flip a register directly, without additional code, at amazing speed - if you want to go REALLY low level and stick a line of assembly in your code (which again points to C, which supports doing that). I don't think the RISC chips have this instruction.
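
A minimal sketch of that flip in C (GCC and Clang expose it as the __builtin_bswap32 builtin, which compiles down to a single BSWAP on x86):

Code: Select all

#include <stdint.h>

/* Reverse the byte order of a 32-bit value; on x86 this is one BSWAP instruction. */
uint32_t flip_endian(uint32_t v) { return __builtin_bswap32(v); }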

Posted: Wed Mar 14, 2018 10:08 am
by FranklinE
Respectfully, solutions outside the CFF and Transformer stages and their default functions should be considered a last resort. The entire point of CFF is to make working with EBCDIC formats an out-of-the-box capability.

Your amount column attributes should look like the following on the Records tab of the CFF stage:

Sign indicator = Signed
Usage = COMP-3

SQL type = Decimal
Storage length = [bytes of packed data, in your example 4 - see the sketch below]
Picture = PIC S9(6) COMP-3. [from your example]

The Record options tab should have the following:

Data format: Binary
Character set: EBCDIC

If any of these minimally critical attributes do not match your column, make them match and retry.
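
For reference, a minimal sketch of the sizing rule behind that storage length, assuming the usual COMP-3 layout of two digits per byte with the sign in the final nibble:

Code: Select all

/* COMP-3 storage length in bytes: digits/2 + 1 (integer division).
   PIC S9(6) COMP-3 -> 6/2 + 1 = 4 bytes, as in the example above. */
int comp3_bytes(int digits) { return digits / 2 + 1; }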

Posted: Wed Mar 14, 2018 10:16 am
by FranklinE
Additional thoughts:

You don't need to deliver the file to the mainframe to verify it. Use View Data in CFF Output tab, which will carry the column and file attributes you need. If it looks right in View Data, then you may have a problem with the method you use to put the file on the mainframe.

How did you construct the table definition in CFF? Did you do it manually? If so, you may have a typo there. If you used the Import Wizard for CFD, post the CFD file you used, and that might help me find your issue.

Posted: Thu Mar 15, 2018 8:53 am
by jagan02
Thanks for the assistance, FranklinE.
Yes, I am using the Import wizard to translate the COBOL copybook to a CFD. Please find the CFD layout below.

record
{record_format={type=implicit}, delim=none, quote=none, binary, big_endian, charset='UTF-8', round=round_inf, fix_zero}
(
col1:string[2];
col2:string[1];
col3:string[8];
col4:string[1];
col5:string[1];
col6:string[2];
col7:string[1];
col8:string[1];
col9:decimal[6,0] {packed};
.
.
.
)

Posted: Thu Mar 15, 2018 9:13 am
by FranklinE
The first bug: col9 should have the signed attribute.

Can you post the CFD file you used in the Wizard?

Edit: "Charset should be EBCDIC" - strike this.

Sample line for one of my files:

Code: Select all

{record_length=200, delim=none, quote=none, binary, ebcdic, native_endian, charset='ISO8859-1', round=round_inf, nofix_zero}
I don't know that UTF-8 supports EBCDIC. If you have ISO8859-1 available, you may need to switch to that. If you don't have the ebcdic attribute, you're not going to get a good output.

Posted: Thu Mar 15, 2018 9:24 am
by jagan02
Thanks for the options, FranklinE. I finally got it working - I was missing Character set = EBCDIC, and while transferring to the mainframe I had forgotten to transfer in binary mode.

Posted: Tue Mar 20, 2018 8:32 am
by UCDI
That would do it. Sorry for any confusion.

@ Franklin, I agree for the most part. External code is not totally my last resort, but it is close. There are some things that you CAN do in DS that are just too convoluted to maintain and deal with. I think this one was me not understanding what the OP actually wanted/needed :)

Posted: Tue Mar 20, 2018 8:59 am
by FranklinE
Jagan had trouble composing his questions, and there's no fault to him for that. Nine-tenths of support is getting the right questions asked, and I've been on both sides of that challenge many times.