Search found 15603 matches
- Wed Sep 05, 2012 11:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: PadString is constrained to the input column length
- Replies: 4
- Views: 1625
- Wed Sep 05, 2012 10:40 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: RCP and performing a database update
- Replies: 7
- Views: 8845
I've managed to solve the problem posted, and it wasn't quite as onerous as I'd expected. The solution is to use a Generic stage and supply the parameter value in the statement there. 1. Write the output to the DB stage (whichever one you choose) using RCP but explicitly declaring the key column in an u...
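The statement itself is truncated above, but the approach described (RCP for the non-key columns, with the key column named explicitly in a parameterized update) might look roughly like the following user-defined SQL in a parallel database stage. This is a hypothetical sketch, not the poster's actual statement: `#TableName#` and `#KeyColumn#` stand in for the job parameters mentioned in the thread, and `STATUS` is an invented example column.

```sql
-- Hypothetical user-defined UPDATE for a parallel DB stage.
-- #TableName# and #KeyColumn# are assumed job parameters;
-- ORCHESTRATE.col references the incoming link column of that name.
-- With RCP enabled, columns not named here still propagate on the link.
UPDATE #TableName#
SET    STATUS = ORCHESTRATE.STATUS
WHERE  #KeyColumn# = ORCHESTRATE.#KeyColumn#
```

Job-parameter substitution (`#...#`) happens textually before the statement is compiled, which is what would let one job body service tables with different key column names.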
- Wed Sep 05, 2012 10:35 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: what are the advantages of datastage 8.5 server jobs
- Replies: 5
- Views: 2246
- Wed Sep 05, 2012 9:33 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DS Function to reformat SSN
- Replies: 6
- Views: 2116
- Wed Sep 05, 2012 8:40 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: PadString is constrained to the input column length
- Replies: 4
- Views: 1625
- Wed Sep 05, 2012 7:20 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: RCP and performing a database update
- Replies: 7
- Views: 8845
After posting my question I thought a bit more about the problem. With kwwilliams' method I would need to store the record columns somewhere and pass them to the job, not a task I'd like to perform. I think that I might be able to do this using the Generic stage and issuing the keys explicitly and...
- Wed Sep 05, 2012 5:01 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Hash File Size
- Replies: 6
- Views: 2663
- Wed Sep 05, 2012 4:23 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error in build output schema from DataSetDef
- Replies: 17
- Views: 11711
- Wed Sep 05, 2012 1:30 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: RCP and performing a database update
- Replies: 7
- Views: 8845
RCP and performing a database update
I'm writing a generic job that takes a table name and key column name as input parameters. The tables to process all have different key column names. The job reads a table, performs some processing and then should write the updated record back to the table. I can get this job to function correctly w...
- Wed Sep 05, 2012 1:21 am
- Forum: General
- Topic: Realistic upper limit quantity of jobs in one Project
- Replies: 7
- Views: 1928
- Wed Sep 05, 2012 1:17 am
- Forum: General
- Topic: DataStage project creation error
- Replies: 23
- Views: 7285
- Wed Sep 05, 2012 1:15 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Hash File Size
- Replies: 6
- Views: 2663
- Sat Sep 01, 2012 9:39 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Out Of Memory in ODBC Enterprise Stage
- Replies: 5
- Views: 1901
- Fri Aug 31, 2012 10:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Out Of Memory in ODBC Enterprise Stage
- Replies: 5
- Views: 1901
- Fri Aug 31, 2012 7:14 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Out Of Memory in ODBC Enterprise Stage
- Replies: 5
- Views: 1901
Is this the read or the write stage? Are you using any custom SQL? How many rows or megabytes of data are in the file, and how long after the job starts does the failure occur? If it's the read stage, make a copy of your job with just the read and a Peek stage and see if the problem persists.