Search found 92 matches

by mikegohl
Tue Feb 21, 2012 4:43 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: TimestampOffsetBySeconds and TimestampOffsetByComponents
Replies: 2
Views: 3110

ray.wurlod wrote:I understand that DataStage regards 00:00:00 as if it were 24:00:00. So you'll need to test for the date. ...
That works. I check the time part, and when it is 00:00:00.000000 I subtract 86400.000001 instead.
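For comparison, the intended operation can be sketched in Python (illustrative only: Python's datetime has no 24:00:00 quirk, so a plain one-microsecond offset rolls back across midnight correctly, and the 86400.000001 branch is only needed inside DataStage):

```python
from datetime import datetime, timedelta

# Parse a timestamp in the same layout as the job's input string.
ts = datetime.strptime("2012-02-22 00:00:00.000000", "%Y-%m-%d %H:%M:%S.%f")

# One microsecond earlier; no special case needed at midnight here.
earlier = ts - timedelta(microseconds=1)
print(earlier)  # 2012-02-21 23:59:59.999999
```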
by mikegohl
Tue Feb 21, 2012 3:06 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: TimestampOffsetBySeconds and TimestampOffsetByComponents
Replies: 2
Views: 3110

TimestampOffsetBySeconds and TimestampOffsetByComponents

I have a requirement to subtract one microsecond (.000001 seconds) from my current timestamp. I have tried both the TimestampOffsetBySeconds and TimestampOffsetByComponents functions: TimestampOffsetByComponents(StringToTimestamp(DSLink4.timestampraw,"%yyyy-%mm-%dd %hh:%nn:%ss.6") ,0,0,0,0,0,-0.000001) Tim...
by mikegohl
Mon Aug 10, 2009 8:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Job Status 99
Replies: 18
Views: 11468

No,

I've moved to a new client. The last thing we did was change the auto-purge setting. The problem might have gone away.

Sorry!
by mikegohl
Fri Jun 12, 2009 4:14 pm
Forum: General
Topic: Error when running Data stage server job
Replies: 22
Views: 11941

Good catch. I should have read the entire post.
by mikegohl
Fri Jun 12, 2009 4:03 pm
Forum: General
Topic: Error when running Data stage server job
Replies: 22
Views: 11941

You will find it in the Transformer stage properties, on the reference link.
by mikegohl
Fri Jun 12, 2009 3:39 pm
Forum: General
Topic: Error when running Data stage server job
Replies: 22
Views: 11941

Do you have the "Reference link with multiple row result set" box checked?
by mikegohl
Thu Jun 11, 2009 10:38 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: getting count from seq file
Replies: 26
Views: 9938

I like to use an after-job subroutine and DSGetLinkInfo. I'm sure others will suggest using UNIX utilities.
by mikegohl
Thu Jun 04, 2009 3:43 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Job Status 99
Replies: 18
Views: 11468

Sorry for not getting back to you sooner. The tech supporting my case just keeps sending me more questions. I will update the thread if I ever get an answer.
by mikegohl
Tue Jun 02, 2009 3:11 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: do you usually keep partitioning on "Auto" in your
Replies: 5
Views: 3063

I would recommend that you read the Parallel Developer's Guide.
by mikegohl
Mon Jun 01, 2009 3:35 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Job Status 99
Replies: 18
Views: 11468

I have the exact same case open with IBM.
by mikegohl
Wed May 27, 2009 8:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: While inserting; deadlock is happening
Replies: 2
Views: 2256

It could be contention within your own process. Tell us a little about the design.
by mikegohl
Thu May 21, 2009 4:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Hash on Sort Stage
Replies: 1
Views: 1961

Do you know what the partition and join keys were when the datasets were written? You can partition all datasets by Seq from the start. That avoids repartitioning before the second join. You can still sort the data on Seq and Date.
by mikegohl
Wed May 13, 2009 3:42 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DSGetLinkInfo error
Replies: 2
Views: 4094

Try qualifying the stage name with the container name, or reference a link inside the container with the container name first.

Ans = DSGetLinkInfo (handleJob, ContainerName.StageName, LinkName, DSJ.LINKROWCOUNT)
by mikegohl
Tue May 12, 2009 9:51 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: update and insert
Replies: 4
Views: 2796

Somehow, the key has to allow for two records to be in the database at the same time. What makes the rows unique in the database?