I have to load data from Oracle (source) to Teradata (target). The jobs should be dynamic/flexible and pick up the metadata information at run time, i.e. if a change has been made to the data model it shouldn't affect the DataStage job design. Say Col A was previously varchar 5 and now the length has increased to varchar 10, or an additional column has been added to the Oracle table and we would like to pass it along to the target table. Is there any way to pick up the metadata at runtime and make the DataStage code flexible?
We are using DataStage 7.5.2 PX.
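To illustrate the kind of run-time metadata handling we have in mind (outside DataStage itself, which offers features such as Runtime Column Propagation and schema files for this), here is a minimal Python sketch: fetch the source column definitions at run time (e.g. from Oracle's ALL_TAB_COLUMNS) and regenerate the target DDL from them. The table name, column list, and type map below are illustrative assumptions, not our actual model.

```python
# Illustrative sketch: build Teradata DDL from Oracle column metadata
# discovered at run time. In practice the `meta` list would be fetched
# from ALL_TAB_COLUMNS; here it is hard-coded for demonstration.

ORACLE_TO_TERADATA = {
    # crude, partial type map -- an assumption, not a complete mapping
    "VARCHAR2": lambda prec, scale: f"VARCHAR({prec})",
    "NUMBER":   lambda prec, scale: (f"DECIMAL({prec},{scale})"
                                     if scale else f"DECIMAL({prec})"),
    "DATE":     lambda prec, scale: "TIMESTAMP(0)",
}

def teradata_ddl(table, columns):
    """columns: list of (name, oracle_type, precision, scale) tuples."""
    cols = []
    for name, otype, prec, scale in columns:
        try:
            ttype = ORACLE_TO_TERADATA[otype](prec, scale)
        except KeyError:
            raise ValueError(f"unmapped Oracle type: {otype}")
        cols.append(f"    {name} {ttype}")
    return f"CREATE TABLE {table} (\n" + ",\n".join(cols) + "\n)"

# If Col A grows from VARCHAR2(5) to VARCHAR2(10), rerunning this against
# the refreshed metadata reflects the new length with no redesign.
meta = [("COL_A", "VARCHAR2", 10, None), ("AMOUNT", "NUMBER", 12, 2)]
print(teradata_ddl("TGT.MYTABLE", meta))
```

The same idea applies to a newly added column: it simply appears in the metadata list and flows into the generated DDL.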
Picking metadata at run time
Moderators: chulett, rschirm, roy
ray.wurlod
- Participant
- Posts: 54595
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Seriously, how often are table definitions changed in a real production environment?
The usual (professional) approach is that such changes are planned, and advised to everyone involved, including ETL developers so that they can assess the impact. A proper software development lifecycle should apply to ETL programming just as much as to any other form of programming.
What happens (for example) in the Business Intelligence tool that touches this amoeba table? Is that expected to be just as dynamic?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
