Loading Nested Tables

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

jerome_rajan
Premium Member
Posts: 376
Joined: Sat Jan 07, 2012 12:25 pm
Location: Piscataway

Loading Nested Tables

Post by jerome_rajan »

Hi,

I'm looking for advice on whether a Server job or a Parallel job is better suited to efficiently loading an Oracle nested table.

Any pointers to possible design approaches would be very helpful. Thank you
Jerome
Data Integration Consultant at AWS
Connect With Me On LinkedIn

Life is really simple, but we insist on making it complicated.
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

It would be better if you explained the scenario.

Otherwise, if you are calling for blind votes, I would vote for parallel.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

They are tables that hold tables - basically sub-tables / multivalued data. For the 'efficient' part, I would think we'd need to know the volumes involved.
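To make the "tables that hold tables" idea concrete, here is a minimal sketch of how such a structure is declared in Oracle. All object, type, and table names are hypothetical, invented for illustration:

```sql
-- A hypothetical schema: an order header with a nested table of line items.
CREATE TYPE line_item_t AS OBJECT (
  item_no   NUMBER,
  item_desc VARCHAR2(50)
);
/
CREATE TYPE line_item_tab_t AS TABLE OF line_item_t;
/
CREATE TABLE orders (
  order_id   NUMBER PRIMARY KEY,
  line_items line_item_tab_t
)
NESTED TABLE line_items STORE AS line_items_nt;
```

Each row of `orders` carries its own collection of `line_item_t` rows, physically stored in the `line_items_nt` storage table.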
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Are there SQL extensions in Oracle for accessing the nested tables? Something like the UNNEST keyword in UniVerse?
If so, you can probably supply user-defined SQL to take advantage of the nested table functionality.
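Oracle's rough equivalent is the `TABLE()` collection expression, which flattens each parent's nested rows alongside the parent. A sketch of the kind of user-defined SELECT that could be supplied, assuming a hypothetical `orders` table with a nested table column `line_items`:

```sql
-- TABLE() 'unnests' the collection so each child row
-- appears joined to its parent row.
SELECT o.order_id,
       li.item_no,
       li.item_desc
FROM   orders o,
       TABLE(o.line_items) li;
```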
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
jerome_rajan
Premium Member
Posts: 376
Joined: Sat Jan 07, 2012 12:25 pm
Location: Piscataway

Post by jerome_rajan »

We have internally decided to use parallel jobs. The only reason the 'parallel or server' question came up is that all of the existing ETL was built using server jobs, and the support team was skeptical because of their lack of experience with parallel jobs. We've now convinced the client of the many advantages that the parallel infrastructure brings with it.

My question now is more about how to read from a source file and load the data into a nested table. A sample of the data is



The first 3 characters of each record are the record type. We are looking to have a nested table of D10 records and another of D20 records, with H10 being the parent.
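For illustration only, a hypothetical file of the shape described (record type in the first 3 characters, all field values invented) might look like:

```text
H10 2016001 CUSTOMER-A
D10 ITEM-001 005
D10 ITEM-002 003
D20 PAY-CHK 250.00
H10 2016002 CUSTOMER-B
D20 PAY-CSH 100.00
```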

--------------------------
Craig,
Can you please move the thread to the parallel forum?
Last edited by jerome_rajan on Fri Jul 29, 2016 4:44 am, edited 1 time in total.
Jerome
Data Integration Consultant at AWS
Connect With Me On LinkedIn

Life is really simple, but we insist on making it complicated.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Treat this as a parent/child problem. Insert the parent records into the base table and the child records into the nested table. You may like to use a Distributed Transaction stage.
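In Oracle terms, the parent/child split can map onto two kinds of INSERT: one for the base-table row, and one targeting that row's nested table via `INSERT INTO TABLE(...)`. The sketch below assumes a hypothetical parent table `orders` with a nested table column `line_items` of collection type `line_item_tab_t`; all names are invented:

```sql
-- 1. Insert the parent (H10-style) record with an empty collection.
INSERT INTO orders (order_id, line_items)
VALUES (1001, line_item_tab_t());

-- 2. Insert child (D10/D20-style) records into that parent's
--    nested table, addressed through TABLE() on a subquery.
INSERT INTO TABLE (SELECT o.line_items
                   FROM   orders o
                   WHERE  o.order_id = 1001)
VALUES (1, 'WIDGET');
```

The custom SQL on the child link's connector could follow the second pattern, keyed by the parent identifier.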
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
jerome_rajan
Premium Member
Posts: 376
Joined: Sat Jan 07, 2012 12:25 pm
Location: Piscataway

Post by jerome_rajan »

Thanks for the guidance, Ray. I'll try this approach and post tangible updates, if any.
Jerome
Data Integration Consultant at AWS
Connect With Me On LinkedIn

Life is really simple, but we insist on making it complicated.
jerome_rajan
Premium Member
Posts: 376
Joined: Sat Jan 07, 2012 12:25 pm
Location: Piscataway

Post by jerome_rajan »

Stupid beginner questions since this is my first time with the DT stage.

1. Can a Distributed Transaction stage be used without an MQ Connector?
2. What purpose would the DT stage serve in my case, as opposed to a standard Oracle Connector?
Jerome
Data Integration Consultant at AWS
Connect With Me On LinkedIn

Life is really simple, but we insist on making it complicated.
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

Sorry for not being descriptive, and yes, I wanted to know about the data volumes before recommending one of them.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
jerome_rajan
Premium Member
Posts: 376
Joined: Sat Jan 07, 2012 12:25 pm
Location: Piscataway

Post by jerome_rajan »

I wasn't able to use the DT stage due to a lack of sufficient knowledge, but I managed to implement the load using a sequence of Oracle Connector stages with the parent/child approach. Thank you for your timely assistance.
Jerome
Data Integration Consultant at AWS
Connect With Me On LinkedIn

Life is really simple, but we insist on making it complicated.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Done.

For the record, Oracle has a TABLE() collection expression to access and 'unnest' these, but for loading you use normal INSERT statements. Well, mostly normal, as you do have to worry about setting up the constructor properly. :wink:
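The constructor point is worth a sketch: when the parent and its children are supplied in a single INSERT, the nested table value is built by calling the collection type's constructor, with one element constructor per child row. Names below are hypothetical, assuming an `orders` table with a `line_items` column of type `line_item_tab_t` over element type `line_item_t`:

```sql
-- A single INSERT supplying the nested table through its type
-- constructor: one line_item_t(...) call per child row, wrapped
-- in the collection constructor line_item_tab_t(...).
INSERT INTO orders (order_id, line_items)
VALUES (
  1002,
  line_item_tab_t(
    line_item_t(1, 'WIDGET'),
    line_item_t(2, 'GADGET')
  )
);
```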

Good job.
-craig

"You can never have too many knives" -- Logan Nine Fingers