Search found 42189 matches
- Tue Jun 17, 2014 11:49 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: MQHRF2 header write using mq connector
- Replies: 5
- Views: 2117
- Tue Jun 17, 2014 11:46 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Huge amount of processes generated
- Replies: 6
- Views: 2589
- Tue Jun 17, 2014 9:59 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Huge amount of processes generated
- Replies: 6
- Views: 2589
- Tue Jun 17, 2014 7:01 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Huge amount of processes generated
- Replies: 6
- Views: 2589
- Tue Jun 17, 2014 6:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: GET N rows in output based on counts
- Replies: 11
- Views: 3541
- Mon Jun 16, 2014 6:00 pm
- Forum: General
- Topic: ps -ef command error
- Replies: 5
- Views: 2075
You could also look into what I seem to recall a Sequence job does in a similar situation: use DSWaitForJob() but pass it a list of job handles to wait for. And it does. However, I'm not aware of an equivalent from the command line; you'd need to make use of a routine and DSRunJob() to initiate them...
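From the command line, one way to approximate that "wait for a list of jobs" behaviour is a small wrapper that starts every job first and only then waits for each to exit. A minimal Python sketch of the pattern; the real entries would be `dsjob -run -jobstatus` invocations with your project and job names (hypothetical here), so a harmless stand-in command is used to keep the sketch runnable anywhere:

```python
import subprocess
import sys

def run_jobs_and_wait(commands):
    """Start every command, then block until all have exited.
    Returns the exit codes in the same order as the commands."""
    procs = [subprocess.Popen(cmd) for cmd in commands]  # launch all first
    return [p.wait() for p in procs]                     # then wait for each

# In real use each entry would look something like:
#   ["dsjob", "-run", "-jobstatus", "MyProject", "JobA"]
# (project and job names above are hypothetical). Stand-in commands:
stand_in = [[sys.executable, "-c", "pass"] for _ in range(3)]
codes = run_jobs_and_wait(stand_in)
print(codes)  # [0, 0, 0]
```

The key design point is launching all the `Popen` objects before calling `wait()` on any of them, so the jobs actually run in parallel rather than one after another.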
- Mon Jun 16, 2014 3:32 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: GET N rows in output based on counts
- Replies: 11
- Views: 3541
- Mon Jun 16, 2014 2:53 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: GET N rows in output based on counts
- Replies: 11
- Views: 3541
- Mon Jun 16, 2014 2:47 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ODBC - Transaction Handling
- Replies: 4
- Views: 1828
It's not going to work that way: the deletes will be committed and then your inserts will start. They both need to be 'inside' the stage to be in the same transaction, and that requires two links. The delete link needs to be ordered to run first and only needs to send one row to trigger the deletes. Which O...
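The underlying requirement, the delete and the insert committing as one unit, is easiest to see outside the stage. A Python/sqlite3 sketch with a hypothetical table, illustrating the general delete-then-insert-in-one-transaction pattern rather than the ODBC Connector's actual mechanics:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'stale')")
conn.commit()

# Delete first, then insert, inside ONE transaction: if anything fails
# before the commit, both operations roll back together.
with conn:  # sqlite3 commits on clean exit, rolls back on exception
    conn.execute("DELETE FROM target")                       # the 'delete link'
    conn.execute("INSERT INTO target VALUES (1, 'fresh')")   # the 'insert link'

rows = conn.execute("SELECT val FROM target").fetchall()
print(rows)  # [('fresh',)]
```

Splitting the delete into its own committed statement would reintroduce exactly the problem described above: a window where the deletes are permanent but the inserts have not happened.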
- Mon Jun 16, 2014 12:50 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SQL error "-246" in DB2 Insert
- Replies: 3
- Views: 1869
- Mon Jun 16, 2014 12:48 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ODBC - Transaction Handling
- Replies: 4
- Views: 1828
- Mon Jun 16, 2014 11:21 am
- Forum: General
- Topic: Shell script to start a job using dsjob
- Replies: 13
- Views: 37036
- Mon Jun 16, 2014 9:03 am
- Forum: General
- Topic: command line trigger for job
- Replies: 11
- Views: 4186
That seems... fine... if all it is doing is making sure things are ready to go, as you have no actual RUN mechanism in there. Have you tested it with a non-existent job? I'm asking because I looked for 'ERROR' in the jobinfo output rather than any kind of status value. And just a side note: if this ends up ...
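That "look for 'ERROR' in the output" check is just text parsing, and it's a bit more robust combined with the command's exit status. A Python sketch of the check, fed sample strings rather than a live `dsjob -jobinfo` call; the sample wording below is illustrative, not dsjob's exact output:

```python
def job_lookup_failed(exit_code, output):
    """True if the dsjob call failed outright, or if its output
    reports an error (e.g. the job does not exist)."""
    return exit_code != 0 or "ERROR" in output.upper()

# Illustrative samples -- real dsjob output wording may differ.
print(job_lookup_failed(0, "Job Status : RUN OK (1)"))         # False
print(job_lookup_failed(1, "Status code = -1"))                # True
print(job_lookup_failed(0, "ERROR: Job NoSuchJob not found"))  # True
```

Checking the exit status as well catches the case where the command fails before printing anything recognisable.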
- Mon Jun 16, 2014 8:49 am
- Forum: General
- Topic: Change a default value by script
- Replies: 11
- Views: 2658
You can edit / change a job export through whatever mechanism you have the skills for and are comfortable with. As to the compile question: yes, but only on the client side.
- Mon Jun 16, 2014 6:52 am
- Forum: General
- Topic: Change a default value by script
- Replies: 11
- Views: 2658
A better answer would be not to rely on default values; it sounds like a switch to leveraging parameter value files may be in order. There's nothing inherently different about editing a dsx vs. an isx: both are doable and carry the same level of risk. You keep saying 'routine', which has a very spec...
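Editing an exported job really is just text processing. A Python sketch that swaps a parameter's default in a dsx-style export; note the snippet format and parameter name below are invented for illustration, a real .dsx export's grammar is more involved:

```python
import re

# Invented, simplified fragment -- NOT the real DSX grammar.
export = '''\
   Name "TargetDB"
   Default "DEV_DB"
'''

# Replace the Default value that immediately follows the named parameter.
patched = re.sub(
    r'(Name "TargetDB"\s*\n\s*Default ")[^"]*(")',
    r'\1PROD_DB\2',
    export,
)
print("PROD_DB" in patched)  # True
```

Anchoring the substitution on the parameter name, rather than replacing every `Default` in the file, is what keeps a scripted edit like this from silently changing other parameters.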