Hi,
I am using a Web Services Transformer stage, and it cannot handle even 4,000 rows of data; the job aborts. It runs fine with around 1,000 rows.
I cannot decide whether to create multiple jobs or send the data in chunks.
Please share your thoughts on this.
Web Services Transformer stage unable to handle large data volumes
Large quantities of data over SOAP are not a good idea, regardless of the client tooling...
...but you might want to find out whether the provider of the service offers a chunking option; some do. I recall writing a job to call SF.com (before we had a pack for it), and they had a nice option along these lines: you ask for a "batch" of rows by passing a max-rows parameter, and if the available rows exceed the batch size, the response includes the total row count, so you can go back and request subsequent "page numbers".
If you have such options, you can control the volume entirely.
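The paging pattern described above can be sketched as a simple loop. This is a hypothetical illustration, not SF.com's actual API: `fetch_page` stands in for whatever web-service call your provider exposes, and the `(rows, total)` return shape is an assumption.

```python
# Hypothetical sketch of the "batch of rows" pattern: request at most
# max_rows per call, then keep paging until the total row count is reached.

def fetch_page(page, max_rows):
    # Stub simulating a service that holds 10 rows in total and returns
    # (rows_for_this_page, total_row_count). A real job would make the
    # SOAP call here instead.
    total = 10
    start = page * max_rows
    rows = list(range(start, min(start + max_rows, total)))
    return rows, total

def fetch_all(max_rows):
    """Collect every row by asking for one page at a time."""
    all_rows = []
    page = 0
    while True:
        rows, total = fetch_page(page, max_rows)
        all_rows.extend(rows)
        # Stop once we have everything, or the service returns an empty page.
        if len(all_rows) >= total or not rows:
            break
        page += 1
    return all_rows
```

With a cap like `max_rows=1000`, each individual call stays within the volume the stage handles comfortably, while the loop still retrieves the full result set.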
Also, right-click the WSTransformer and open it in "grid style". I'm not sure which release you are on, but hopefully you will see a Java options property. There you can increase the heap size for the JVM running under the covers (I always have to re-check the web and my Java references for the exact syntax, but the maximum-heap option is "-Xmx", e.g. "-Xmx1024m"). That may be the issue too, although at some point even a larger heap will run out.
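For reference, the standard JVM heap options set the initial and maximum heap size; a typical entry in that Java options property might look like the following (the sizes here are illustrative examples, not a recommendation):

```
-Xms256m -Xmx1024m
```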
Ernie
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
If that doesn't help, then it might be something else. Try other services, try other data; I've done 10 and 20 MB payloads in the past without a problem.
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>