Search found 42189 matches

by chulett
Mon Jul 18, 2011 3:43 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle enterprise stage.
Replies: 6
Views: 2271

For insert only, the trick (I believe) is to use the "User-defined Update Only" option and then make your actual user-defined SQL an insert statement.
by chulett
Mon Jul 18, 2011 3:19 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Corrupted hashed files after server crash
Replies: 7
Views: 5219

Yes, I've seen it personally and it has been reported here as well by several others, I do believe. All it takes is the loss of the hidden .Type30 file inside the directory to cause this issue as the hashed file is no longer a dynamic one. It can then fall back on this funky (type 1? 19?) "hash...
by chulett
Mon Jul 18, 2011 1:45 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel job not able to compile
Replies: 5
Views: 2469

What exactly happens when you try to "compile"? Does the job have a Transformer in it by chance? If so, you'll need to have a supported C++ compiler installed and properly configured before that will work. For the view data problem, what stage are you using? What errors are you getting? Ha...
by chulett
Mon Jul 18, 2011 10:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Oracle enterprise stage.
Replies: 6
Views: 2271

No setting here - the first action must fail for the second to happen, so you'll need a unique index over your key(s) to keep the duplicate inserts from happening.
by chulett
Mon Jul 18, 2011 10:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Create Multiple years data from single record
Replies: 4
Views: 2538

Actually, the amount per month is 13.61333333333333... so you either need to alternate rounding up and then down across the months to even out the difference across the year or you need to lump the difference into the first or last month - 13.66 - otherwise you'll be short. Isn't this just two outpu...
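The "lump the difference into the last month" approach can be sketched like this (the annual total of 163.36 below is a hypothetical figure chosen so that the per-month amount is 13.6133..., matching the post; the function name is mine):

```python
from decimal import Decimal, ROUND_HALF_UP

def split_across_months(annual_total, months=12):
    """Divide an annual amount into monthly amounts that sum exactly to it.

    Each month gets the per-month amount rounded to cents; the rounding
    residual is absorbed by the final month so nothing is lost or gained.
    """
    total = Decimal(str(annual_total))
    per_month = (total / months).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
    amounts = [per_month] * (months - 1)
    amounts.append(total - per_month * (months - 1))  # last month absorbs the residual
    return amounts

amounts = split_across_months("163.36")
print(amounts[0], amounts[-1], sum(amounts))  # 13.61 13.65 163.36
```

Rounding every month to 13.61 and stopping there would leave the year's total short; the alternative mentioned above is to alternate rounding up and down across the months instead of adjusting only one of them.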
by chulett
Mon Jul 18, 2011 9:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change APT_DUMP_SCORE value at job level
Replies: 5
Views: 2398

DIVIHN wrote:I did it I have set APT_DUMP_SCORE at Job level but still the job using project level value.
You add it to the job and then override the value in the job itself to be True via the Director.
by chulett
Mon Jul 18, 2011 8:42 am
Forum: General
Topic: Finding the projects,jobs,logs in unix
Replies: 13
Views: 3717

What about them? Look inside the directories, there's nothing mysterious in there.
by chulett
Mon Jul 18, 2011 8:06 am
Forum: General
Topic: Finding the projects,jobs,logs in unix
Replies: 13
Views: 3717

You can't corrupt the job by messing with its log. As for your questions - try it, let us know... though I have no idea why you'd want to do that.
by chulett
Mon Jul 18, 2011 7:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: PX not running on Windows 2003
Replies: 2
Views: 1048

Did you install 7.5.2 or 7.5x2? The latter is the only pre-8.x release that will actually run PX jobs on Windows. Since it looks to be attempting to run them, I would guess the latter is actually correct.

Post your config file. Have you "tested" it using the built-in tester utility?
by chulett
Mon Jul 18, 2011 7:33 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Change APT_DUMP_SCORE value at job level
Replies: 5
Views: 2398

Same answer for any of the APT environment variables. There are several that you should be doing this with in every job so you have the option to change them at runtime if needed. Don't have a list off the top of my head any more but APT_DUMP_SCORE and APT_CONFIG_FILE come to mind...
by chulett
Mon Jul 18, 2011 7:28 am
Forum: General
Topic: Finding the projects,jobs,logs in unix
Replies: 13
Views: 3717

What kind of "impact" are you thinking would happen? Logs need to be cleared periodically, hence the Auto Purge as the primary mechanism for that. It supports two purge types, the same two types you can do manually from the Director - by days or by runs - three types if you count clearing ...
by chulett
Mon Jul 18, 2011 7:21 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Abnormal termination of stage
Replies: 4
Views: 2850

meet_deb85 wrote:BUt IF i am adding any other SQL IT is failing with warning no fatal..same as pasted earlier..
It's too long, you'll need to find another way to do this... a view comes to mind.
by chulett
Mon Jul 18, 2011 7:00 am
Forum: General
Topic: Finding the projects,jobs,logs in unix
Replies: 13
Views: 3717

Look inside them.

Yes, the "LOG" one is a dynamic hashed file holding the job's log and you don't want to be deleting any of them.
by chulett
Mon Jul 18, 2011 6:58 am
Forum: General
Topic: One more iteration in the loop activity
Replies: 2
Views: 1256

That's because of the trailing delimiter, it means there is another field coming. Remove it. You should have n-1 delimiters for n values in the list of values.
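The off-by-one is easy to demonstrate (Python used purely for illustration; the original context is a DataStage loop activity's list of values):

```python
# n values separated by n-1 delimiters: the loop sees exactly n items.
clean = "2008,2009,2010"
print(clean.split(","))     # ['2008', '2009', '2010']

# A trailing delimiter implies one more field is coming, so an empty
# extra item appears and the loop runs one iteration too many.
trailing = "2008,2009,2010,"
print(trailing.split(","))  # ['2008', '2009', '2010', '']
```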
by chulett
Sun Jul 17, 2011 9:32 pm
Forum: General
Topic: Restartability of User Variables
Replies: 4
Views: 1701

Split from this topic.