Search found 42189 matches

by chulett
Fri Sep 15, 2017 6:59 am
Forum: General
Topic: Timer already cancelled error while trying to login
Replies: 6
Views: 2704

Figured that would be the answer... or at least an answer to your resource issue.
by chulett
Thu Sep 14, 2017 1:59 pm
Forum: General
Topic: Timer already cancelled error while trying to login
Replies: 6
Views: 2704

That's a Java error and perhaps a bug, probably best to involve your official support provider.
by chulett
Thu Sep 14, 2017 11:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage Parallel schema file - indicate # of rows to skip
Replies: 8
Views: 4523

While Informatica has a "number of records to skip" for flat files, I don't recall DataStage having anything other than the "First line is column headers" true/false option. Perhaps a question for your official support provider?
by chulett
Thu Sep 14, 2017 7:42 am
Forum: General
Topic: Calling job from Java
Replies: 6
Views: 2548

In your shoes I would be opening a support case while waiting for additional input from the community...
by chulett
Wed Sep 13, 2017 3:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Handling Nulls for decimal fields in transformer stage
Replies: 2
Views: 2771

From what I recall, you would need to define them as string fields in the flat file and then format them accordingly.
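The read-as-string-then-convert idea can be sketched generically in Python; this is an illustration of the approach, not DataStage syntax, and the function name and default are made up for the example:

```python
from decimal import Decimal

def parse_decimal(field, default=None):
    """Treat an empty or whitespace-only flat-file field as NULL
    (returning `default`) instead of failing the decimal conversion."""
    field = field.strip()
    if not field:
        return default
    return Decimal(field)
```

The point is the same as in the transformer: the raw field is handled as text first, so a blank value can be caught and mapped to NULL (or a default) before any decimal formatting is attempted.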
by chulett
Wed Sep 13, 2017 11:36 am
Forum: General
Topic: ETL Job rowcount and other details
Replies: 8
Views: 4301

Sure but it's messy and painful... which is the whole reason that the DSODB was added to the product. You can leverage the API for this, either from BASIC routines or from the command line depending on where your skillset lies, equivalent functions exist in both. For BASIC routines, search the docum...
by chulett
Wed Sep 13, 2017 6:42 am
Forum: General
Topic: ETL Job rowcount and other details
Replies: 8
Views: 4301

I haven't been through all of the documentation (thanks Paul) but perhaps something here might help.
by chulett
Tue Sep 12, 2017 4:58 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Unicode with RCP?
Replies: 11
Views: 5954

It's got to be BYTE semantics or it would be working now, I'd wager.
by chulett
Tue Sep 12, 2017 1:29 pm
Forum: Information Analyzer (formerly ProfileStage)
Topic: Quality Analysis error
Replies: 9
Views: 7207

That's what happens when you try to negotiate with a machine. :wink:
by chulett
Mon Sep 11, 2017 10:04 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Unicode with RCP?
Replies: 11
Views: 5954

Have you tried... specifying "ustring" in the schema file? I'm sure there's more to it than that, but that's the first thing that comes to mind.
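For reference, a minimal schema-file fragment along those lines — field names and lengths here are purely illustrative, and whether this behaves correctly under RCP is exactly the open question in the thread:

```text
record
  {final_delim=end, delim=',', quote=double}
(
  cust_id: int32;
  cust_name: ustring[max=50];
)
```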
by chulett
Mon Sep 11, 2017 8:33 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading and writing data into csv files?
Replies: 14
Views: 6638

That seems oddly... restrictive... to me. :?

Let's see what others have to say.
by chulett
Mon Sep 11, 2017 7:15 am
Forum: General
Topic: Unable to view Ray's comment as a Premium Member
Replies: 5
Views: 1801

It is a bit odd but it has been with us since the beginning... since you're suddenly at a different domain and not logged in there, it blocks all premium content. I've been fixing them since I was granted the power to do so, never really thought to announce the issue until it came up here.
by chulett
Mon Sep 11, 2017 6:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading and writing data into csv files?
Replies: 14
Views: 6638

Not that I am aware of, each is treated as a separate 'table' in essence from what I recall.
by chulett
Sun Sep 10, 2017 8:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Populate values of next record in previous record
Replies: 4
Views: 2512

You can't do anything with the "previous" records unless we're talking a database as the target. I'm assuming we're talking about files here so the suggestion is to sort the data descending... i.e. process them in the opposite order posted so that the date in the current record can be used...
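The reverse-order trick described above can be sketched outside of DataStage: walk the records from last to first, carrying the value just seen, and each record picks up its "next" record's value. All names here are illustrative:

```python
def fill_from_next(records, key="date"):
    """Attach each record's value from the *next* record (in posted order)
    by iterating in reverse and carrying the previously seen value --
    the same idea as sorting descending in an ETL job."""
    carried = None
    for rec in reversed(records):
        rec["next_" + key] = carried  # None for the last record
        carried = rec[key]
    return records
```

In a job this maps to a descending sort followed by ordinary "remember the previous row" stage-variable logic, since the previous row in descending order is the next row in posted order.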