Dynamic array notation in sequences

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

IASAQ
Premium Member
Posts: 31
Joined: Wed May 04, 2016 11:07 am
Location: Montréal

Dynamic array notation in sequences

Post by IASAQ »

User chulett wrote this at one point:
Another option, since the results are returned in a dynamic array, is to use array notation. That can help simplify things if you ever need to return something other than the first item in a list of returned values.

Execute_Command_1.$CommandOutput<1>
Can somebody explain the limitations of using this construct in sequences? The only place this seems to work for me in a sequence is in triggers, and even there it's weird. For example:

Code: Select all

Execute_Command_1.$CommandOutput<1>= "N"      will work but
Execute_Command_1.$CommandOutput<1> = "N"     will not work.
If I want to initialize a user variable in a sequence using the construct chulett mentioned, it doesn't work: I get an "Expected expression" error in the User Variables Activity window. Any idea why?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

That chulett guy hardly ever knows what the heck he is talking about. :wink:

I recall them working fine in triggers but as you've found you have to be precise in how you code them. I would stick with no spaces:

Code: Select all

Execute_Command_1.$CommandOutput<1>="N"
I learned about them from studying and ultimately modifying the crap out of Ken Bland's awesome Job Control Utilities code (way) back in the day, and that's where I primarily used them: in "job control" code.

Keep in mind that it is dynamic array notation, so you can't use it to "initialize a variable". You would first need to declare something to be a dynamic array, and then you could use the notation to initialize or manipulate the elements in the array. Sorry, I don't recall off the top of my head how you declare one; I'll see if I can dig it up later from my archives if one of the Old Guard doesn't come along today and help out.
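From memory, a minimal BASIC sketch of the idea, the kind of thing that belongs in a routine or job control code rather than in a sequence stage (the variable names are made up):

Code: Select all

* Sketch: build and read a dynamic array in a BASIC routine.
EmailList = ""                        ;* initialize an empty variable
EmailList<1> = "alice@example.com"    ;* field 1 (fields are delimited by @FM)
EmailList<2> = "bob@example.com"      ;* field 2
FirstAddr = EmailList<1>              ;* read an element back out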

I'm not sure how useful they would be in a User Variables stage or if they are even applicable there. Save them for a custom routine you call from one, perhaps, or for triggers. Or some super fancy job control code. 8)
-craig

"You can never have too many knives" -- Logan Nine Fingers
IASAQ
Premium Member
Posts: 31
Joined: Wed May 04, 2016 11:07 am
Location: Montréal

Post by IASAQ »

It all goes back to my first inquiry on passing info from job to job in a sequence. I find the whole process extremely convoluted with DataStage:

1. Write a file with the info needed for the next job.
2. Read the file with an Execute Command stage.
3. Convert the @FM field marks in $CommandOutput to a usable delimiter.
4. Use the Field function to get the value needed for the next job in the sequence.

All this to be able to use variables dynamically in a job sequence.
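Concretely, the expression I end up with on the sequence side looks something like this (stage name illustrative): convert the field marks to a delimiter, then pull the value out by position:

Code: Select all

Field(Convert(@FM, ";", Execute_Command_1.$CommandOutput), ";", 1)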

Thanks chulett. You pretty much confirmed my suspicion that the dynamic array construct you posted a while ago was meant for custom routines.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Okay, in that context...

There's no reason to go through all of those steps for "dynamic" parameters, be it in a sequence or otherwise. That is a very old school approach, something we had no choice but to do Back In The Day as there was no other mechanism. Well, other than USERSTATUS which I think we've already talked about.

Nowadays you should be looking into Parameter Sets, specifically with the concept of Value Files they added a number of years ago. Learning those would go a long way towards simplifying what I assume you need to do with parameters. If you want to have a conversation on that, I would suggest starting a new post. Or going back to your old one? Did we have this conversation already? :wink:
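For anyone following along, a value file is just a plain text file of name=value pairs, one per line, kept with the Parameter Set under the project's ParameterSets directory. Something like this (names and values invented):

Code: Select all

TargetDB=DEVDB
SourceDir=/data/dev/in
EmailList=alice@example.com;bob@example.com

You then pick the file at run time, for example with something along the lines of dsjob -run -param psMyParams=dev_values, and every parameter in the set takes its value from that file.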
-craig

"You can never have too many knives" -- Logan Nine Fingers
IASAQ
Premium Member
Posts: 31
Joined: Wed May 04, 2016 11:07 am
Location: Montréal

Post by IASAQ »

chulett wrote:Nowadays you should be looking into Parameter Sets, specifically with the concept of Value Files they added a number of years ago. Did we have this conversation already? :wink:
Indeed we did. But here's an example where I believe parameter sets and value files won't work: Notification Activity stages.

The recipient's email address is not static: it's obtained from a Job Activity stage prior to the Notification Activity stage. So in order to supply the proper address, I had to do the steps mentioned in my post above and ended up with something like this in the recipients email address field:

Code: Select all

Field(#eMail.v_mail_adr#, ";", #init_mail_adr.v_idx_email#)
I don't think the parameter set/value file combo would have worked in this instance. But I may be wrong, of course.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Okay, fair enough... what exactly is the job in the Job Activity stage doing to generate the email address list? Guessing it gets them from a table somewhere and then writes a delimited list out to a file? Clarify that for us please and we'll see if we can improve this part of the process for you.
-craig

"You can never have too many knives" -- Logan Nine Fingers
IASAQ
Premium Member
Posts: 31
Joined: Wed May 04, 2016 11:07 am
Location: Montréal

Post by IASAQ »

Yes, the info is taken from a table; then, in the job, I append two files with it:

The first file is a parameter set value file and the second contains only the values, all done in part via the Transformer stage loop condition.

So my PS value file looks like this:

name1=value1
name2=value2
name3=value3

and my "user status" file looks like this:

value1
value2
value3

The parameter set value file could be redundant, but I keep it in case it's required later on in the job sequence.

My "user status" file is what I read in the job sequence for triggers and variables.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Okay. Just to level set, I assume the job that produces those files is a Parallel job. If so, are you at one of those shops where Server jobs are forbidden? You could easily use one to read the table, build a 'ready to go' delimited string, and actually use the true USERSTATUS area to pass that along to anything downstream of the job. Automagically. With no shenanigans.
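If memory serves, the classic pattern is a tiny Server transform routine wrapping the DSSetUserStatus call (the routine name here is mine):

Code: Select all

* Transform routine SetUserStatus(Arg1).
* Sets the job's user status and passes the value through so the
* routine can be called from a Transformer derivation.
Call DSSetUserStatus(Arg1)
Ans = Arg1

Downstream in the sequence you would then read it as Job_Activity_Name.$UserStatus (activity name illustrative).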

Is that a possibility?

:?: Question for the home audience: am I correctly remembering that you need a Server job to write to USERSTATUS? I'm thinking a Parallel job has no such structure available... or could you, if you added a BASIC Transformer to the PX job? Someone throw me a bone here...

Thanks.
-craig

"You can never have too many knives" -- Logan Nine Fingers
IASAQ
Premium Member
Premium Member
Posts: 31
Joined: Wed May 04, 2016 11:07 am
Location: Montréal

Post by IASAQ »

It is certainly a possibility and, in fact, when I started working with the product and looking for a solution to this problem, the Server job "userstatus" was the answer.

However, reading further on this forum and also in official documentation on the product, I read that mixing Server jobs with Parallel jobs was not considered good practice and that parameter sets were the answer. So I've been trying to go with the recommended practice, with the gymnastics it entails.

One thing I can see from reading this forum is that handling variables dynamically with DataStage seems to be a problem many users face.

Edit: To answer the question at the end of your last post, yes, you need a Server job for USERSTATUS. Or a BASIC Transformer in a parallel job, which is not recommended either.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Sorting isn't recommended either (since it consumes resources).

Unless you need it.


I have no problem at all mixing job types or using the BASIC Transformer stage, in those circumstances where it's warranted.
An admonition that originally came with a caveat appears to have metamorphosed into some kind of absolute.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Thanks Ray, I was really hoping someone else would pop in here and help out. Ray, for one, is always championing using the right tool for the right job, and that means Server jobs when there's really no need for all the overhead of a Parallel one to do something pretty straightforward, like (as a timely example) sourcing from a table and writing the results to a file.

I'm not sure where you would have read that "mixing" them was not a good practice, that or using a BASIC Transformer in a Parallel job. IMHO it's not about them being "not recommended"; it's more about understanding how to leverage them properly and how they co-exist, especially when you throw in the wrinkle of a cluster- or grid-based topology. Otherwise, in the absence of a cluster/grid, I'm not really sure what the issue might be; perhaps, as noted, a caveat has somehow morphed into some kind of absolute.
IASAQ wrote:So I've been trying to go with the recommended practice, with the gymnastics it entails.
It's statements like this that concern me. Nothing in the way of a recommended or Best Practice should include "gymnastics" nor does it here. I seem to recall you were thrown directly into the deep end of the pool? Meaning, self-learning from the documentation, experimentation and silly places like this? If that is the case, hopefully there is a chance for a class or two in your future, that or some consulting help on site. Should go a long way to help out with your gymnastics career. :wink:

Anyway, not sure how far you want to take this. Any interest in pursuing the Server job and USERSTATUS solution? Or have we beaten this poor little pony enough already?
-craig

"You can never have too many knives" -- Logan Nine Fingers
IASAQ
Premium Member
Posts: 31
Joined: Wed May 04, 2016 11:07 am
Location: Montréal

Post by IASAQ »

chulett wrote:I seem to recall you were thrown directly into the deep end of the pool? Meaning, self-learning from the documentation, experimentation and silly places like this?

Anyway, not sure how far you want to take this. Any interest in pursuing the Server job and USERSTATUS solution? Or have we beaten this poor little pony enough already?
Yeah, pretty much on your first point. As for the last sentence, yes, I'm quite done with that poor little pony. I'll mark this thread as resolved.
Teej
Participant
Posts: 677
Joined: Fri Aug 08, 2003 9:26 am
Location: USA

Post by Teej »

Unfortunately, the Basic Transformer is not something we really advocate these days, thanks to our MPP efforts. For example, your job could not be migrated to an environment using Hadoop YARN.

Set up a stand-alone server job if necessary. Avoid cross-pollinating Server BASIC within the Parallel framework.
Developer of DataStage Parallel Engine (Orchestrate).
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Teej's "we" is not my "we", and we will agree to disagree on that. I would agree that the BASIC Transformer is contra-indicated for huge data volumes, but would counter that with the observation that the majority of DataStage jobs that I have encountered process only small to medium data volumes, so that the limitations of the BASIC Transformer stage's execution environment are still acceptable.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Teej
Participant
Posts: 677
Joined: Fri Aug 08, 2003 9:26 am
Location: USA

Post by Teej »

We = IIS development team.

In 11.3 (if not 9.1), we deprecated the Basic Transformer stage's appearance in the default palette. It may be eliminated in the future; some legacy stages have indeed been eliminated in 11.5, and customers using those stages will need to go up to 11.3 and replace them with the new stages before upgrading to 11.5 and above. This can be quite a painful process for some customers.

So take it as a strong hint not to use the Basic Transformer within parallel jobs for any new development. If you wish to use DataStage BASIC, please build Server jobs.

Please note, the vast majority of the functionality within the Basic Transformer was recreated from scratch within the Parallel engine's general Transformer stage. If there is functionality you need that's missing, please reach out to Support and/or open a Request for Enhancement.

-T.J.
Developer of DataStage Parallel Engine (Orchestrate).