Lockout when updating rows using DB2 Connector

manuel.gomez
Premium Member
Posts: 291
Joined: Wed Sep 26, 2007 11:23 am
Location: Madrid, Spain

Lockout when updating rows using DB2 Connector

Post by manuel.gomez »

Hello guys,

I am having problems when updating rows using the DB2 Connector.

If I execute the job with a configuration file that has more than one node, we get as many tasks in the database as partitions we are running with, and they all lock each other. Eventually the job aborts after a timeout.

When executing with just one partition, the job finishes successfully. I guess I must be doing something wrong, as I don't believe this connector is meant to be used only with a "one-node" configuration.

Can you guys help?
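(For anyone hitting the same symptom: one way to see exactly what the parallel tasks are waiting on is to look at the DB2 lock monitors while the job is hanging. A minimal sketch, assuming DB2 9.x and a database called MYDB; the admin view and its columns vary by version, so treat this as a starting point rather than the exact commands:

    -- from a DB2 command line, while the job is blocked
    db2 connect to MYDB
    db2 "SELECT * FROM SYSIBMADM.LOCKWAITS"

    -- or dump the lock list directly with db2pd
    db2pd -db MYDB -locks

If every waiting agent is requesting a lock on the same table, that points at the partitioning or at lock escalation rather than at the Connector itself.)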
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

How are you partitioning your data? Hash partitioned over the key fields?
-craig

"You can never have too many knives" -- Logan Nine Fingers
manuel.gomez
Premium Member
Posts: 291
Joined: Wed Sep 26, 2007 11:23 am
Location: Madrid, Spain

Post by manuel.gomez »

chulett wrote:How are you partitioning your data? Hash partitioned over the key fields? ...
I just left Auto partitioning.

I will try the method you just suggested.
manuel.gomez
Premium Member
Posts: 291
Joined: Wed Sep 26, 2007 11:23 am
Location: Madrid, Spain

Post by manuel.gomez »

Mmmmm... I tried explicit Hash partitioning over the key fields (did not perform a sort), but got the same result: two tasks locking each other.

:(
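(If the rows really are hashed so that no two nodes ever update the same key, the remaining contention is often not row-against-row at all but page- or table-level locking, or lock escalation when the lock list fills up. A minimal sketch of things worth checking on the DB2 side, assuming a table MYSCHEMA.MYTABLE and a database MYDB; all names are placeholders:

    -- make sure the table is using row-level locking
    db2 "ALTER TABLE MYSCHEMA.MYTABLE LOCKSIZE ROW"

    -- review the lock-related database configuration; small LOCKLIST/MAXLOCKS
    -- values make DB2 escalate row locks to table locks under load
    db2 get db cfg for MYDB | grep -i lock

Escalation events are also reported in db2diag.log, and an escalated table lock will serialise the partitions no matter how the DataStage side is partitioned.)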
manuel.gomez
Premium Member
Posts: 291
Joined: Wed Sep 26, 2007 11:23 am
Location: Madrid, Spain

Post by manuel.gomez »

I tried with the transaction and array size set to 1, and it works, but very slowly.

I also tried a "Bulk Load", but as the $APT_CONFIG_FILE variable is included in my parameter set with all the database connection parameters, I just execute this single job with one node (the values are taken from the prior sequence).

Can the DB2 Enterprise stage be used in sequential mode? Any other options?

Thanks a lot
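(One common workaround when $APT_CONFIG_FILE comes from a shared parameter set is to point just this job at a separate single-node configuration file. A minimal sketch of what such a file looks like; the fastname and resource paths are placeholders and must match your own engine installation:

    {
        node "node1"
        {
            fastname "your_engine_host"
            pools ""
            resource disk "/opt/IBM/InformationServer/Server/Datasets" {pools ""}
            resource scratchdisk "/opt/IBM/InformationServer/Server/Scratch" {pools ""}
        }
    }

Supplying that file as the job-level value of $APT_CONFIG_FILE forces the whole job, including the Connector, onto one node.)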
Matt.C
Participant
Posts: 1
Joined: Sun Dec 19, 2010 4:05 pm

Re: Lockout when updating rows using DB2 Connector

Post by Matt.C »

We are experiencing similar problems with the DB2 Connector stage. We found that applying an index on the DB2 table based on the update criteria and applying Hash partitioning worked.
Alternatively, we went into the Advanced tab and changed the Execution mode to Sequential. (NB: when you open the connector properties, the default view is based on the Input link; clicking on the database icon at the top left will bring up the tab you need to change.)

Matt
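(For the indexing suggestion, the idea is simply that the index covers the columns used in the update criteria, so each UPDATE locates its row by key instead of scanning and locking far more rows than it changes. A hedged sketch with placeholder names:

    -- MYSCHEMA.MYTABLE and the key columns are placeholders
    db2 "CREATE INDEX MYSCHEMA.IX_MYTABLE_KEY ON MYSCHEMA.MYTABLE (KEY_COL1, KEY_COL2)"
    db2 "RUNSTATS ON TABLE MYSCHEMA.MYTABLE AND INDEXES ALL"

Without such an index, the Connector's UPDATE statements tend to scan the table on every row they process, which is usually what makes the parallel tasks collide.)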
greggknight
Premium Member
Posts: 120
Joined: Thu Oct 28, 2004 4:24 pm

Post by greggknight »

I had the same issue on SQL Server.
I added the option maxdup = 0.
This option limited the SQL to a single process instead of 10.
Maybe there is a similar option in DB2.
"Don't let the bull between you and the fence"

Thanks
Gregg J Knight

"Never Never Never Quit"
Winston Churchill
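(The SQL Server option being recalled here is presumably the MAXDOP query hint, which caps the degree of parallelism for a single statement; a hedged sketch with placeholder table and column names:

    -- T-SQL: restrict one statement to a single scheduler
    UPDATE dbo.MyTable
       SET some_col = 'new value'
     WHERE key_col = 42
    OPTION (MAXDOP 1);

The closest DB2 analogues are the DFT_DEGREE database configuration parameter and the CURRENT DEGREE special register, though for this thread the partitioning and locking settings discussed above are the more likely fix.)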
lpadrta
Premium Member
Posts: 38
Joined: Thu Nov 07, 2002 7:39 am
Location: Jacksonville, FL

Post by lpadrta »

Have you installed the DataStage 8.5 Fix Pack 1? The doc makes it sound like it addresses some issues with the DB2 Connector.

http://publib.boulder.ibm.com/infocente ... 85fp1.html