Search found 67 matches

by venkates.dw
Thu Dec 20, 2012 9:56 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Internal Error:blockSizeActual >=v4BlockHeader::size():C:
Replies: 4
Views: 3674

If it were a block size issue, the rerun should also fail, but the rerun is completing successfully.
by venkates.dw
Thu Dec 20, 2012 9:55 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Internal Error:blockSizeActual >=v4BlockHeader::size():C:
Replies: 4
Views: 3674

We are extracting the XML messages from the database.

Also, when we rerun the job, it completes successfully.
by venkates.dw
Thu Dec 20, 2012 4:32 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Internal Error:blockSizeActual >=v4BlockHeader::size():C:
Replies: 4
Views: 3674

Internal Error:blockSizeActual >=v4BlockHeader::size():C:

Hi,

Could someone please assist me with the error below?

Internal Error: (blockSizeActual >= v4BlockHeader::size ()): datamgr/partition.C: 442 Traceback: Could not obtain stack trace; check that 'dbx' and 'sed' are installed and on your PATH
by venkates.dw
Tue Apr 03, 2012 10:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: CPU usage for the particular job
Replies: 3
Views: 2705

$APT_PLAYER_TIMING will display the timings on a per-node basis; correct me if I am wrong.
by venkates.dw
Tue Apr 03, 2012 9:45 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: CPU usage for the particular job
Replies: 3
Views: 2705

CPU usage for the particular job

Hi,

I am testing different scenarios, and the execution time varies for each scenario. I want to understand the root cause of this. Can you please let me know how to find out the CPU usage of a particular job?

Thanks.
by venkates.dw
Wed Mar 21, 2012 11:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to reduce the number of database connections for ETL job
Replies: 3
Views: 3030

How to reduce the number of database connections for ETL job

Hi, When I run the job, it opens many database connections which then go into sleeping mode. I assume that if we use two database stages (as source and target) with a 3-node configuration, it will open 6 database connections. Once it reads the data from the database it will not close the co...
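The expected connection count under that assumption is simply database stages times nodes; a trivial sketch (the stage and node counts come from the post, not from any actual job):

```python
# Hedged arithmetic sketch: one connection per database stage per node,
# as the poster assumes.
db_stages = 2    # source stage + target stage
nodes = 3        # 3-node configuration file
connections = db_stages * nodes
print(connections)  # 6
```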
by venkates.dw
Wed Jan 04, 2012 9:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

Thanks. I am able to store the value in the BigInt field with 5-digit milliseconds.
by venkates.dw
Wed Jan 04, 2012 8:58 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

Thanks. But it is loading into the MySQL database as "9223372036854775807" (an unexpected value). When populating a Varchar column the value loads correctly, but when populating the BigInt column an unexpected value appears.
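For what it's worth, 9223372036854775807 is exactly the signed 64-bit maximum (2^63 - 1), which suggests the database is clamping a value that does not fit in the BigInt column rather than storing garbage. A quick range check, assuming the 20-digit timestamp-with-microseconds value from this thread:

```python
# 20-digit value built from a timestamp with 6 microsecond digits.
value = 20120104163500189089

SIGNED_BIGINT_MAX = 2**63 - 1     # 9223372036854775807
UNSIGNED_BIGINT_MAX = 2**64 - 1   # 18446744073709551615

print(value > SIGNED_BIGINT_MAX)    # True: too big for a signed BIGINT
print(value > UNSIGNED_BIGINT_MAX)  # True: too big even for unsigned BIGINT
```

Truncating to milliseconds (17 digits) would bring the value back inside the signed BIGINT range.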
by venkates.dw
Wed Jan 04, 2012 4:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

Ok.

The CurrentTimestampMS() function returns the dateTime along with milliseconds (e.g. 2012-01-04 16:35:00.189089). I want to store this value as 20120104163500189089 in the database (the field datatype is BigInt). Please let me know how I can implement this.
by venkates.dw
Wed Jan 04, 2012 3:54 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

ray.wurlod wrote:Is the Scale value in the metadata set to 6? ...

For BigInt, do we need to mention the scale value?
by venkates.dw
Wed Jan 04, 2012 3:53 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

pandeesh wrote:Can you check whether your time stamp value has the milli seconds?

Yes, the timestamp value has the milliseconds.
by venkates.dw
Wed Jan 04, 2012 2:47 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

While using the Convert function, I am not getting the milliseconds in the result value.

For example: Convert('-:. ', '', '2012-01-04 15:43:09.989564'). I am getting the result as 20120104154309, but I need the result as 20120104154309989564. Please suggest.
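Stripping the separators only produces the full 20-digit result if the string still carries its microseconds when Convert sees it. A sketch in Python (not DataStage syntax; in a parallel job this would be a Transformer derivation):

```python
from datetime import datetime

# Sample timestamp from the post, including microseconds.
ts = datetime(2012, 1, 4, 15, 43, 9, 989564)

# Format with %f so the six microsecond digits survive, then drop
# every non-digit separator character ('-', ':', '.', ' ').
as_text = ts.strftime("%Y-%m-%d %H:%M:%S.%f")
digits = "".join(ch for ch in as_text if ch.isdigit())
print(digits)  # 20120104154309989564
```

If the result stops at 20120104154309, the string being converted has already lost its fractional seconds somewhere upstream of the Convert call.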
by venkates.dw
Wed Jan 04, 2012 11:03 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Convert the Timestamp value to BigInt
Replies: 13
Views: 10491

Convert the Timestamp value to BigInt

Hi,

Can anyone please let me know how to convert the Timestamp value to BigInt? The reason is that we are using MySQL as the database, but we need to store the milliseconds as well.

For example: 2012-01-04 11:12:13:001 to 20120104111213001

Thanks.
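With milliseconds only, the target value fits comfortably in a 64-bit integer. A sketch of the conversion in Python (not DataStage syntax), using the example value from the post:

```python
# Strip every separator character from the timestamp text.
ts_text = "2012-01-04 11:12:13:001"
digits = int("".join(ch for ch in ts_text if ch.isdigit()))
print(digits)               # 20120104111213001
print(digits <= 2**63 - 1)  # True: fits in a signed 64-bit BigInt
```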
by venkates.dw
Tue Dec 20, 2011 12:29 pm
Forum: General
Topic: Add two variables in loop
Replies: 8
Views: 3686

Add two variables in loop

Hi,

How can I perform the A = A + B manipulation in a loop? I have a requirement where I will generate multiple output files using the same job, and I need to take the count of each file, sum it up, and update it into the audit table.

Please let me know how I can implement the above scenario.
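In a job sequence this accumulation is typically a loop with a counter variable. As a language-neutral sketch (Python here, with made-up per-file counts standing in for the real file counts):

```python
# Hypothetical record counts, one per output file from each loop pass.
file_counts = [120, 450, 87]

total = 0                   # A, the running sum
for count in file_counts:   # B, this iteration's file count
    total = total + count   # A = A + B
print(total)  # 657
```

The final total is what would be written to the audit table after the loop ends.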