Accepting SOAP over HTTP requests into DataStage

Dedicated to DataStage and DataStage TX editions featuring IBM® Service-Oriented Architectures.

Moderators: chulett, rschirm

getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Accepting SOAP over HTTP requests into DataStage

Post by getsatish_gk »

Hi all,

A DataStage job is deployed and exposed as a web service, to be invoked by an external web service client on demand.
I have a few questions...

1) Which DataStage stage should be used to accept the SOAP over HTTP requests?
2) Say the input request contains 5 columns; should the DataStage job parameters then have 5 columns defined?
3) Since the SOAP requests will be an array/XML, which stage can be used to read the SOAP first, e.g. XML Input?
getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Post by getsatish_gk »

Putting it simply...

Are job parameters the only way to accept data from SOAP requests?

I am struggling to get structured data into DataStage, so for the time being I am not 'grouping into structure' at the Server console and am accepting the data successfully (as job parameters).

So the challenge is: if we expose the DataStage job to accept SOAP over HTTP requests 'grouped into structure', how do we accept the data?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Not sure what you mean by "accept" here. Reading your question at face value I would have suggested the ISD Input stage, and don't understand how job parameters come into the picture at all.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Post by getsatish_gk »

This is where the buck stops... If I use the ISD Input stage then the job becomes "always on". The requirement is to have the job invoked by external web services on demand..
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I thought "always on" was an option when you deployed the job rather than something you are forced into... however it's been a long dang time since I've played with them.

Anyway, why not have it always on? It can still be invoked "on demand" and you don't have to wait for the job to start and stop each time. Is there something about the job design that precludes that?
-craig

"You can never have too many knives" -- Logan Nine Fingers
getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Post by getsatish_gk »

Before deploying, I tried the options under 'provider properties' at the Server console, i.e.:
1) Active job instances or JDBC connections: min=1 (default), max=1
2) Idle time: min=60 min; max=0 sec (no changes)
3) Activation threshold: service requests=1
4) Request limit=1

Let me know if I missed any other setting.... All I need is to run "on demand" while still using the ISD Input stage.
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

You can't. If you want always on...use the ISDInput....if you want "start on demand" (meaning ...start the job, for EACH request coming in, and only when the request arrives), then use some other Stage as the starting point.

....in that case, where do you get the data from? Well, it can come from a database or sequential file source...or, what it sounds like you would like to do, is pass the whole "chunk" of data from the client.

You can do it, but it takes a technique. It won't be as simple as asking ISD to construct the array and proper WSDL for you [as it does when you use ISDinput for "always on"].

One technique, as we discussed above, is to have a dummy input, like a sequential stage that just reads a single row from a dummy file, and then in a downstream transformer, have a Job Parameter as the whole derivation...into a large varchar. Then, downstream from that, break it out any way you want, using pivot, xml, etc. ...all depending on how you want to construct that "array". Of course, the client tooling will have to package that "array" into a single row call, stuffing the "array" into a single textual value for passing as your "Job Parameter". The WSDL will NOT reflect that you are using arrays, but it's still do-able.
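Here is a rough sketch of that packaging, just to make the idea concrete. Everything in it (the delimiters, the sample values, even using Python for the client side) is illustrative only; your client tooling and downstream stages can use whatever encoding you like, as long as both sides agree on it.

Code:
# Illustration only: flatten an "array" of rows into one text value that the
# client passes as the single Job Parameter, and the reverse split that the
# downstream stages (transformer / pivot / xml) effectively perform.
# Delimiters and sample values are arbitrary; nothing here is mandated by ISD.

ROW_SEP = "|"   # separates rows inside the single parameter value
COL_SEP = ","   # separates columns inside a row

def pack_rows(rows):
    """Client side: build the single large varchar for the Job Parameter."""
    return ROW_SEP.join(COL_SEP.join(str(col) for col in row) for row in rows)

def unpack_rows(value):
    """Job side: what breaking the value back out into rows amounts to."""
    return [row.split(COL_SEP) for row in value.split(ROW_SEP) if row]

rows = [["1001", "Smith", "NY"], ["1002", "Jones", "CO"]]
param_value = pack_rows(rows)    # "1001,Smith,NY|1002,Jones,CO"
print(unpack_rows(param_value))  # back to the original rows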

How often are you calling this Operation? Doing this makes sense when you call it very seldom (a few times a day only). Otherwise, use ISDinput...the response time will be many times faster.

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Post by getsatish_gk »

Thanks a lot Ernie, but the issue if I don't use the ISD Input stage is that I can't see the "Accept Array" option while grouping into structure.

I am able to accept one record (comma delimited) using a Sequential File stage / Row Generator in the job design ---> there I have the option to accept comma-delimited data.

I am now really feeling the need for premium content access... but I just need the right stage to start with (to accept either an array of data or comma-delimited data).
getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Post by getsatish_gk »

Updated in a similar topic at the PX forum...

Finally, I am able to accept an array of data from web services through the ISD Input stage, with the below provider settings at the IIS console so that the job runs only on demand:
Active job instances = 1 (max)
Service requests and limit = 1 (max)

thus restricting the job from running always.

But there is a weird job status in the log even though the job completed successfully..

Finished Job abcd.1
Attempting to Cleanup after ABORT raised in job abcd.1
Job abcd.1 aborted.

Let me know how to get rid of the last two log entries..
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You posted your "weird job status" question in three posts when it only needed to be in one. I've removed it from the other two... we can talk about it here.
-craig

"You can never have too many knives" -- Logan Nine Fingers
getsatish_gk
Participant
Posts: 104
Joined: Sat Dec 24, 2005 1:26 am
Location: Bengaluru

Post by getsatish_gk »

Thanks chulett..
I'm having two issues:
1) The job log shows the below after the job finished:
Attempting to Cleanup after ABORT raised in job abcd.1 Job abcd.1 aborted.

2) In the Designer, the performance statistics don't show.
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

Jobs deployed under ISD as services are "multiple instance" and will be started and stopped continually, depending on traffic coming into your application. There are no particular stats in the Designer that you can expect to use...it's not a "batch" job in the common sense. Monitoring of Web Services (ISD) traffic is a whole other subject...

...each instance, when finishing, shouldn't abort, but that might just be a clean-up artifact, and might depend on what you are doing in the Job and its Stages.... is your client tool receiving the data as expected?

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

...by the way....what kinds of SOAP clients are calling your ISD services?
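For what it's worth, even a small hand-rolled script can drive such a service. Something like the sketch below would do it, where the endpoint URL, SOAPAction, namespace, operation and element names are all invented for the example and would need to come from the WSDL that ISD actually generates for your deployed service:

Code:
# Illustration only: the simplest kind of hand-rolled SOAP 1.1 client.
# Endpoint, SOAPAction, namespace, operation and element names are made up --
# take the real ones from your service's WSDL.
import requests

ENDPOINT = "http://iisserver:9080/wisd/MyApp/MyService"  # hypothetical URL

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:svc="http://example.com/MyService">
  <soapenv:Body>
    <svc:processRows>
      <svc:inputData>1001,Smith,NY|1002,Jones,CO</svc:inputData>
    </svc:processRows>
  </soapenv:Body>
</soapenv:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "processRows"},   # also hypothetical
    timeout=60,
)
print(response.status_code)
print(response.text)  # the SOAP response envelope returned by the service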

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>