Search found 15603 matches
- Wed Aug 02, 2006 4:43 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Timeout parameter
- Replies: 6
- Views: 6370
What Oracle connection method are you using? Is it ODBC or an Oracle stage? I think the timeout might occur when going through ODBC and not when using the built-in stages. I know I've had extremely long query times before without any timeouts, so this isn't an insurmountable problem. Perhap...
- Wed Aug 02, 2006 4:38 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error While running multiple instances
- Replies: 30
- Views: 11242
Krish, you can find out the size of your /tmp directory by issuing a "df -g /tmp". This will show (in gigabytes) how much space you have in total and how much is available. Your DataStage scratch directories are defined in your APT_CONFIG file, so I can't give you an exact command; you'll have to check that...
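A portable stand-in for that check ("df -g" is the AIX flag; on Linux the usual form is "df -h /tmp" or "df -BG /tmp") is a quick sketch with Python's standard library:

```python
import shutil

def space_gb(path="/tmp"):
    """Report total and free space, in gigabytes, for the
    filesystem holding `path` - the equivalent of `df -g`."""
    usage = shutil.disk_usage(path)
    gib = 1024 ** 3
    return usage.total / gib, usage.free / gib

total, free = space_gb("/tmp")
print(f"/tmp filesystem: {total:.1f} GB total, {free:.1f} GB free")
```

Note this only covers the filesystem size, not the scratch directories named in the configuration file; those still have to be read out of the APT_CONFIG file itself.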
- Wed Aug 02, 2006 4:22 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Jobs execution is very slow
- Replies: 1
- Views: 460
Your client connection to the server doesn't impact job runtime performance. If the source and target data for the jobs are on machines connected through a slow SSL link, then that might affect performance. You have just asked the automotive equivalent of "My car is running slower this week. Why?" ...
- Wed Aug 02, 2006 4:07 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Error While running multiple instances
- Replies: 30
- Views: 11242
- Wed Aug 02, 2006 4:06 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: ds_seqopen() - error in 'open()' on named pipe read links
- Replies: 8
- Views: 3714
The timeout setting in the Sequential File (pipe) stage should be left at the default of 60; I would avoid using 0. I don't know if that change will directly fix your job, but it should be made. The unhandled interrupt in ds_seqopen() might have something to do with the timing, or with having that pipe still a...
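A minimal sketch (POSIX only, with a made-up pipe name) of why that timeout matters: opening a named pipe for writing blocks until a reader attaches, and a non-blocking open fails immediately with ENXIO instead.

```python
import errno
import os
import tempfile

# Illustrative FIFO in a scratch directory - names are assumptions,
# not taken from the original job.
fifo_dir = tempfile.mkdtemp()
fifo_path = os.path.join(fifo_dir, "demo.pipe")
os.mkfifo(fifo_path)

no_reader_error = None
try:
    # A plain blocking os.open(fifo_path, os.O_WRONLY) would hang
    # right here, because no process has the read end open - which
    # is the situation a stage timeout guards against.
    fd = os.open(fifo_path, os.O_WRONLY | os.O_NONBLOCK)
    os.close(fd)
except OSError as e:
    no_reader_error = e.errno
finally:
    os.remove(fifo_path)
    os.rmdir(fifo_dir)

print(no_reader_error == errno.ENXIO)
```

With a timeout of 0 the stage has no bound on how long that blocking open can sit waiting for the other end.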
- Wed Aug 02, 2006 3:57 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Lookup table not returning rows
- Replies: 3
- Views: 691
If you are inserting rows and reading them back in the same job, make sure your commit size is 1 so that changes are immediately visible. If you still aren't getting matches, you will need to add some debugging information (or use the Designer's interactive debugger directly) to see exactly which lookups a...
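The commit-visibility point can be illustrated with SQLite standing in for the real database (table and column names are made up): a row inserted on one connection is invisible to a second connection until the first commits, which is what a commit size of 1 guarantees after every row.

```python
import os
import sqlite3
import tempfile

db = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(db)
writer.execute("CREATE TABLE lookup_t (k TEXT, v TEXT)")
writer.commit()

# Insert a row but do NOT commit yet (commit size > 1 behaves like this).
writer.execute("INSERT INTO lookup_t VALUES ('A', 'first')")

reader = sqlite3.connect(db)
before = reader.execute("SELECT COUNT(*) FROM lookup_t").fetchone()[0]

writer.commit()  # commit size of 1: commit after every row
after = reader.execute("SELECT COUNT(*) FROM lookup_t").fetchone()[0]

print(before, after)  # the row only appears to the reader after the commit
```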
- Wed Aug 02, 2006 1:55 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Constraint is not working in Transformer
- Replies: 6
- Views: 1520
Re: Constraint is not working in Transformer
...How to check for hidden characters? You need to tell us what you consider "hidden" characters to be. You can do a LEN(TRIM(In.Column)) on a CHAR(n) field to see how many non-padded characters are in the field, assuming you've left your padding as spaces. But once you have that value you will need...
- Wed Aug 02, 2006 1:46 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: ds_seqopen() - error in 'open()' on named pipe read links
- Replies: 8
- Views: 3714
- Wed Aug 02, 2006 1:40 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job aborting when running with large data sets
- Replies: 7
- Views: 2286
- Wed Aug 02, 2006 1:37 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DataStage Jobs failure due to Broken Pipe
- Replies: 5
- Views: 15316
All PX Jobs are Failing due to Broken pipe Error. Usually broken pipes are not the cause of problems but the most visible symptom. Something is causing your processes to fail; their side of the pipe is then closed down, and the process on the other side reports this as an error. There s...
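That symptom-not-cause ordering can be shown in a few lines: the write end of a pipe only fails *after* the reading side has gone away, so the broken-pipe error always points back at some earlier failure.

```python
import os

read_fd, write_fd = os.pipe()
os.close(read_fd)  # stand-in for a downstream process dying first

broken = None
try:
    os.write(write_fd, b"some rows")
except BrokenPipeError as e:
    # Only now does the writer see anything wrong - the real failure
    # already happened on the other end of the pipe.
    broken = e
finally:
    os.close(write_fd)

print(type(broken).__name__)
```

So the useful diagnostic is whatever killed the reading process, not the broken-pipe message itself.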
- Wed Aug 02, 2006 1:31 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Autoinstall DSjobs and folder
- Replies: 2
- Views: 541
- Wed Aug 02, 2006 1:27 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Hash File reading
- Replies: 1
- Views: 734
- Tue Aug 01, 2006 9:15 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: can you explain with this ArchiveFiles routine
- Replies: 5
- Views: 1591
- Tue Aug 01, 2006 7:41 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: CFF Record
- Replies: 6
- Views: 1423
- Tue Aug 01, 2006 7:38 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Insert the records in to the tables through Routines?
- Replies: 6
- Views: 1235
I still think it is a bad idea to do it the way you intend. How about doing it a bit differently: have your BASIC routines write the logging information to a text file. Then, either as an after-job subroutine or another call, start a simple DataStage job that reads this text file and then loads it ...
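The suggested split can be sketched like this (field layout, delimiter, and file name are assumptions for illustration, not from the original post): the routine side only appends delimited lines to a text file, and a separate load step reads them back as rows.

```python
import csv
import os
import tempfile

log_path = os.path.join(tempfile.mkdtemp(), "routine_log.txt")

def log_event(job, message):
    # What each routine would do: append one delimited line per event,
    # keeping the routine itself free of any database access.
    with open(log_path, "a", newline="") as f:
        csv.writer(f, delimiter="|").writerow([job, message])

log_event("LoadCustomers", "started")
log_event("LoadCustomers", "finished, 120 rows")

# What the after-job load step would do: read the file back as rows
# ready to be inserted into the logging table.
with open(log_path, newline="") as f:
    rows = list(csv.reader(f, delimiter="|"))

print(rows)
```

The design win is the same one the post argues for: the routines stay simple and fast, and all the table loading lives in one ordinary job that can be scheduled, monitored, and rerun.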