How to auto-generate metadata?

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
varaprasad
Premium Member
Posts: 34
Joined: Fri May 16, 2008 6:24 am

How to auto-generate metadata?

Post by varaprasad »

Hi, my client has a requirement that whenever a source table definition changes (when columns are added or deleted), the change should be picked up automatically by the DS job (they are using PX) and reflected in the target tables. This should happen without any explicit changes to the job/code.
Is there any way to do this? Please suggest.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

I think they need the RMM stage.

To quote from that classic film The Castle - "tell 'em they're dreaming".
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
throbinson
Charter Member
Posts: 299
Joined: Wed Nov 13, 2002 5:38 pm
Location: USA

Post by throbinson »

To quote Walt Disney, "Sometimes dreams do come true".
But for DataStage this dream can quickly turn into a nightmare. It depends on the ease of obtaining the changed metadata. Is it from a DBMS? How is the target metadata maintained and updated?
varaprasad
Premium Member
Posts: 34
Joined: Fri May 16, 2008 6:24 am

Post by varaprasad »

throbinson wrote:To quote Walt Disney, "Sometimes dreams do come true".
But for DataStage this dream can quickly turn into a nightmare. It depends on the ease of obtaining the changed metadata. Is it from a DBMS? How is the target metadata maintained and updated?
The source data is on ORACLE & DB2 and is loaded into flat files on the target server. What they want is to avoid any manual intervention to change the DataStage code.
I just want to know if there's any way to do this using schema files.
OddJob
Participant
Posts: 163
Joined: Tue Feb 28, 2006 5:00 am
Location: Sheffield, UK

Post by OddJob »

Combining schema files and Runtime Column Propagation (RCP) could solve your issue.
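For reference, a DataStage schema file is just a plain-text file in record syntax. A minimal sketch (the column names and types are illustrative, not taken from the poster's tables):

```
record (
  EMP_ID: int32;
  EMP_NAME: nullable string[50];
  HIRE_DATE: timestamp;
)
```

With RCP enabled, a Sequential File stage can pick its column definitions up from a file like this at runtime via the Schema File property, instead of from columns hard-coded in the job design.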
singhald
Participant
Posts: 180
Joined: Tue Aug 23, 2005 2:50 am
Location: Bangalore
Contact:

Post by singhald »

Hi Ray,

What is the RMM stage?
Regards,
Deepak Singhal
Everything is okay in the end. If it's not okay, then it's not the end.
throbinson
Charter Member
Posts: 299
Joined: Wed Nov 13, 2002 5:38 pm
Location: USA

Post by throbinson »

You could build a DataStage job to query the Oracle system tables for the source table. When a change is detected, the job would create the proper schema files. Another DataStage job, RCP enabled, would read the source Oracle table and write out the fields via a Sequential File stage using the just-generated schema file. Easy to design, far more difficult to reliably build and implement.
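A rough sketch of the schema-generation step in shell. The table, the column list, and the Oracle-to-DataStage type mapping below are all assumptions for illustration; a real implementation would query ALL_TAB_COLUMNS via sqlplus and handle precision/scale, nullability, and many more types:

```shell
#!/bin/sh
# Sketch: generate a DataStage schema file from Oracle dictionary output.
# In real use the column list would come from sqlplus, e.g.:
#   SELECT column_name, data_type, data_length FROM all_tab_columns
#   WHERE table_name = 'EMP' ORDER BY column_id;
# Here a heredoc stands in for that query's output (name|type|length).

map_type() {
  # Map an Oracle type to an approximate DataStage schema type.
  case "$1" in
    VARCHAR2|CHAR) echo "string[$2]" ;;
    NUMBER)        echo "decimal[38,10]" ;;  # precision/scale assumed
    DATE)          echo "timestamp" ;;
    *)             echo "string[max=255]" ;; # fallback, assumption
  esac
}

echo "record ("
while IFS='|' read -r name type len; do
  echo "  $name: $(map_type "$type" "$len");"
done <<'EOF'
EMP_ID|NUMBER|22
EMP_NAME|VARCHAR2|50
HIRE_DATE|DATE|7
EOF
echo ")"
```

Running this against the sample column list prints a `record ( ... )` block that an RCP-enabled job can use as its schema file; the fragile part, as noted, is keeping the type mapping and change detection reliable.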
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

singhald wrote:what is that RMM stage ?
An old joke - the Read My Mind stage.
-craig

"You can never have too many knives" -- Logan Nine Fingers
dsusr
Premium Member
Posts: 104
Joined: Sat Sep 03, 2005 11:30 pm

Post by dsusr »

OddJob wrote:Combining schema files and Runtime Column Propagation (RCP) could solve your issue.
You would need some solid shell scripting to pull this off....

First, read the schema from Oracle and generate the schema files, which are then used by a generic DataStage job with RCP enabled. All of this works fine if it's just direct loading or extraction.

But if there are transformations involved, you would also need to maintain rules for each table and try to generate the transformation files at runtime for the DataStage job to use.
Post Reply