Search found 53125 matches
- Mon Apr 19, 2004 4:24 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: complex job Example
- Replies: 2
- Views: 994
- Sat Apr 17, 2004 4:55 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: COMP-3 conversion in Datastage - Strategies
- Replies: 5
- Views: 3853
- Fri Apr 16, 2004 3:26 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Strip non-numeric characters
- Replies: 4
- Views: 1607
- Fri Apr 16, 2004 3:24 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Please help us figure out this DSR_JOB(Action=5) error!!!
- Replies: 3
- Views: 1632
Objects used by DataStage clients to communicate with the server are exposed by an ActiveX control. If an OLE Automation error occurs, the current connections can no longer be used; you need to open new instances of the clients. DSR_JOB is one of a number of "helper subroutines" on the ser...
- Fri Apr 16, 2004 3:22 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Status = 96 (Crashed)
- Replies: 5
- Views: 1319
- Fri Apr 16, 2004 3:21 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: EBCDIC on windows
- Replies: 7
- Views: 2686
Without line terminators this is tricky. I'd be inclined to pre-process the file in a before-job subroutine, pretty much along the lines I indicated, but using ReadBlk to read 540 bytes, then 540 bytes, then 3 bytes, then as many "record"-sized chunks as necessary. And the ASCII function to convert t...
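The block-by-block pre-processing described above can be sketched outside DataStage as well. Here is a minimal Python illustration of the same logic: read 540 bytes, 540 bytes, 3 bytes, then fixed-size record chunks, converting each from EBCDIC to ASCII. The 540/540/3 sizes come from the post; the record length (100) and the EBCDIC code page (cp037) are assumptions for illustration only.

```python
# Illustrative sketch of the pre-processing described in the post:
# carve a terminator-less file into fixed-size blocks, then convert
# each block from EBCDIC to ASCII.
RECORD_SIZE = 100  # hypothetical record length; the real one depends on the file


def split_blocks(data: bytes, record_size: int = RECORD_SIZE):
    """Yield the two 540-byte blocks, the 3-byte block, then record-sized chunks."""
    pos = 0
    for size in (540, 540, 3):  # header sizes taken from the post
        yield data[pos:pos + size]
        pos += size
    while pos < len(data):
        yield data[pos:pos + record_size]
        pos += record_size


def ebcdic_to_ascii(chunk: bytes) -> str:
    # cp037 is one common EBCDIC code page; the actual file may use another
    return chunk.decode("cp037")
```

In a real job this work would live in a before-job subroutine using ReadBlk and ASCII(), as the post says; the Python version just makes the chunking arithmetic explicit.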
- Fri Apr 16, 2004 3:16 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Can i write to a hashed file from multiple links
- Replies: 1
- Views: 716
ALL writes to a hashed file are destructive overwrites, whether from one source or many. If the key being written already exists on the hashed file, then that row is replaced entirely with the columns of the new row. If this is what you want to happen then, yes, it is "safe". A hashed file behaves l...
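The destructive-overwrite semantics described above are the same as those of an ordinary key/value map, which a short Python analogy (not DataStage code) makes concrete: writing an existing key replaces the entire row, whichever link wrote it, and columns from the earlier row are not merged.

```python
# A hashed file behaves like a key/value map: writing a key that already
# exists replaces the whole row, regardless of which source wrote it.
hashed_file = {}


def write_row(key, row):
    """Destructive overwrite: the entire row stored under `key` is replaced."""
    hashed_file[key] = row


# Two "links" writing the same key: the second write wins entirely.
write_row("42", {"name": "old", "amount": 10})
write_row("42", {"name": "new"})
```

This is why writing from multiple links is "safe" only if whole-row replacement is the behavior you actually want.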
- Fri Apr 16, 2004 3:12 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Does ORABULK call sqlldr explicitly or implicitly?
- Replies: 1
- Views: 846
The ORABULK stage produces the control and data files for sqlldr. It does not have any implicit capability to invoke sqlldr. Most of the other bulk loader stage types do have this capability, but the ORABULK stage does not. It is for this reason that you need to use ExecSH (or ExecDOS if on Windows)...
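To make the division of labor concrete: ORABULK writes the control and data files, and you supply the sqlldr invocation yourself via ExecSH (or ExecDOS). A small Python helper sketches the kind of command line you would hand to that routine; the credentials and file names here are placeholders, while `userid=`, `control=`, and `log=` are standard sqlldr parameters.

```python
# Sketch of the command handed to ExecSH/ExecDOS once ORABULK has
# produced its control and data files. All values are placeholders.
def build_sqlldr_command(userid: str, control_file: str, log_file: str) -> str:
    """Assemble an sqlldr command line from its standard parameters."""
    return f"sqlldr userid={userid} control={control_file} log={log_file}"


cmd = build_sqlldr_command("scott/tiger@orcl", "load.ctl", "load.log")
```

The data file itself is named inside the control file, so the command line only needs to point sqlldr at the control file.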
- Thu Apr 15, 2004 4:03 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: EBCDIC on windows
- Replies: 7
- Views: 2686
Another possibility is that you read the sequential file as if it contains just one column. You can process that column with DataStage's ASCII() function in a Transformer stage's stage variable, which should yield the ASCII equivalent. You can then decompose the row using substring and vertical pivo...
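The single-column approach above can be illustrated outside DataStage too. This Python sketch approximates the ASCII() conversion by decoding an EBCDIC code page (cp037 is an assumption), then decomposes the row into fixed-width fields by substring; the field names and widths are hypothetical.

```python
# Illustrative version of the single-column technique: treat the whole
# row as one string, convert EBCDIC to ASCII, then carve out fixed-width
# fields with substrings.
def decompose(raw_row: bytes) -> dict:
    text = raw_row.decode("cp037")      # stand-in for the ASCII() function
    return {
        "id":   text[0:6].strip(),      # text[start:end] plays the substring role
        "name": text[6:26].strip(),
        "amt":  text[26:34].strip(),
    }
```

In the job itself, the conversion would sit in a Transformer stage variable and the decomposition in the output column derivations, as the post describes.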
- Thu Apr 15, 2004 3:58 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Missing records in Hash file stage
- Replies: 2
- Views: 784
Use the monitor, the performance statistics display in Designer, or the "active stage finishing" event in the job log to tell us how many rows were read from the hashed file stage. Also tell us whether there were any selection criteria used in the hashed file stage, and whether there were any constraints in a f...
- Thu Apr 15, 2004 3:55 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: hashcount error
- Replies: 9
- Views: 2596
Essentially, my recommendation was because it's a decided nuisance to have to reset jobs. I prefer that the controlling job retains control, that anything that needs to be reported can be reported, and perhaps even that other tasks can be executed to remedy a situation that might otherwise have generated a...
- Thu Apr 15, 2004 1:28 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: rows are not displayed on oracle when loaded using ORABULK
- Replies: 1
- Views: 720
What you are seeing in DataStage is the number of rows written to the data file for sqlldr. The ORABULK stage creates the control and data files. It does not, however, run sqlldr to get these rows into Oracle. You can invoke sqlldr from the After-Stage Subroutine section of the ORABULK stage. Choose...
- Wed Apr 14, 2004 4:42 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Sequence number
- Replies: 1
- Views: 673
- Wed Apr 14, 2004 4:35 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Status of Job Second time around
- Replies: 1
- Views: 657
Welcome aboard! If you're writing your own job control code, to determine the prior status of a job you: attach the job (DSAttachJob), then determine its previous status (DSGetJobInfo). There are 12 possible statuses of a job, constants for which are defined in JOBCONTROL.H in the DSINCLUDE directory. Sea...
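The two-step attach/inspect pattern above can be mirrored in a short Python sketch. The real calls (DSAttachJob, DSGetJobInfo) belong in DataStage BASIC job control code; the stub functions and the status codes below are illustrative stand-ins only, not the actual JOBCONTROL.H values.

```python
# Stand-ins for the DataStage job control pattern: attach the job, then
# query its previous status and decide what to do next. Names and codes
# here are hypothetical, not the real API.
FINISHED_OK, FINISHED_WARN, ABORTED = 1, 2, 3   # illustrative codes only


def ds_attach_job(name):
    """Stand-in for DSAttachJob: returns a handle to the named job."""
    return {"name": name, "status": FINISHED_WARN}


def ds_get_job_info(handle, info_type="JOBSTATUS"):
    """Stand-in for DSGetJobInfo: returns the requested job attribute."""
    return handle["status"]


handle = ds_attach_job("MyJob")
status = ds_get_job_info(handle)
needs_reset = status == ABORTED   # e.g. decide whether to reset before rerunning
```

The point of the pattern is that the controlling code branches on the returned status constant before deciding whether to reset, rerun, or report.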
- Wed Apr 14, 2004 4:28 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: LIST UV.ACCOUNT
- Replies: 11
- Views: 4792