Any limit on target tables in a job

Post questions here relating to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

chpraveen.msc
Participant
Posts: 26
Joined: Tue Nov 08, 2005 5:36 am

Any limit on target tables in a job

Post by chpraveen.msc »

Hi All,
I have designed a parallel job that loads data into 15 Oracle target tables. The data is pulled from 15 ODBC source stages (the source is a mainframe DB), and some basic transformation is done in a Transformer stage. Performance-wise, will there be any impact on the DS server and the resources it consumes to run this job? The data volume is quite high (in the millions of rows).
Is there any limit on the number of source stages and target stages per job? Is there any ideal solution apart from splitting it into sub-jobs?

Thanks in Advance.
prabu
Participant
Posts: 146
Joined: Fri Oct 22, 2004 9:12 am

Re: Any limit on target tables in a job

Post by prabu »

chpraveen.msc wrote: Hi All,

Is there any limit on the number of source stages and target stages per job? Is there any ideal solution apart from splitting it into sub-jobs?

Thanks in Advance.
IMHO, there is NO limit to the number of target tables used in a single job.

That said, you should still consider splitting the load into multiple jobs.
Consider the following reasons for doing so:

a) modularization
b) better handling of dependent jobs
c) exception/failure handling
d) taking advantage of parallel processes from the OS (see the sketch after this list)
e) eliminating thread/process management at the DS server level
f) eliminating resource allocation at the DS server level for the different streams within the same job
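To illustrate point d), here is a minimal sketch of driving the split-out load jobs in parallel from the OS using the standard dsjob command-line client. The project name "MyProject" and the Load_TableNN job names are hypothetical placeholders; it assumes dsjob is on the PATH and that -run -jobstatus waits for the job and returns its status as the exit code.

```python
# Minimal sketch: launch the split-out DataStage load jobs in parallel
# from the OS via the dsjob command-line client.
# Assumptions: "MyProject" and the Load_TableNN job names are hypothetical;
# the jobs must be compiled and runnable, and dsjob on the PATH.
import subprocess
from concurrent.futures import ThreadPoolExecutor

PROJECT = "MyProject"  # hypothetical project name
JOBS = [f"Load_Table{n:02d}" for n in range(1, 16)]  # one job per target table

def run_job(job):
    """Run one job and wait for it; with -jobstatus the exit code
    reflects the job's finishing status."""
    result = subprocess.run(["dsjob", "-run", "-jobstatus", PROJECT, job])
    return job, result.returncode

# Launch all 15 loads concurrently and report each job's outcome.
with ThreadPoolExecutor(max_workers=15) as pool:
    for job, rc in pool.map(run_job, JOBS):
        status = "OK" if rc == 0 else f"FAILED (rc={rc})"
        print(f"{job}: {status}")
```

Because each table load is its own job, a failure in one stream no longer touches the others, which also addresses points b) and c) above.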


regards,
Prabu
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

High volume and more stages will always be a challenge to performance.
Is it all one-to-one, or is the data joined together and later diversified?
If it is one source to one target table, it is better to have a separate job for each, for many beneficial reasons. You can run the jobs in parallel as well.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Bad design. Load in different jobs. You are loading 15 tables with a data size of ~30 million rows. Restartability will be a problem: if a record fails in any one of the tables and the job aborts, it is going to affect the other loads as well.
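As a rough illustration of the restartability concern, here is a sketch of a driver that records which table loads have completed, so a rerun after an abort repeats only the failed loads. The checkpoint file, project name, and job names are hypothetical, not part of DataStage itself; only the dsjob -run -jobstatus call is the standard CLI invocation.

```python
# Sketch: restartable driver for per-table load jobs. A checkpoint file
# records completed loads, so a rerun skips them and retries only failures.
# Assumptions: "MyProject", the Load_TableNN job names, and loads.done
# are hypothetical placeholders.
import subprocess
from pathlib import Path

PROJECT = "MyProject"
JOBS = [f"Load_Table{n:02d}" for n in range(1, 16)]
CHECKPOINT = Path("loads.done")  # hypothetical checkpoint file

# Load the names of the jobs that already finished successfully.
done = set(CHECKPOINT.read_text().split()) if CHECKPOINT.exists() else set()

for job in JOBS:
    if job in done:
        print(f"{job}: already loaded, skipping")
        continue
    rc = subprocess.run(["dsjob", "-run", "-jobstatus", PROJECT, job]).returncode
    if rc == 0:
        # Record success so a restart does not reload this table.
        with CHECKPOINT.open("a") as f:
            f.write(job + "\n")
    else:
        print(f"{job}: failed (rc={rc}); fix and rerun this script")
```

With one monolithic job, none of this is possible: a single bad record aborts all 15 loads at once.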
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.