Search found 61 matches

by rsaliah
Mon Feb 16, 2009 8:14 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: ORA-00911 error
Replies: 17
Views: 6969

Check out the third column (ECOUPON_ID). It looks as if it starts life as a varchar, because your sample data shows double quotes, and ends up in a number column. If this is the case, make sure the quotes are taken care of and that nothing other than a number is being pushed to Oracle. Failing that t...
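The check being suggested can be sketched in Python (a hedged illustration, not the poster's actual job logic; the function name and the sample value are made up, only the column name ECOUPON_ID comes from the thread):

```python
def clean_numeric(value):
    """Strip surrounding double quotes and verify the result is numeric.

    Mirrors the advice in the post: a value like '"12345"' must lose its
    quotes, and carry nothing but digits, before Oracle will accept it
    in a NUMBER column.
    """
    stripped = value.strip().strip('"')
    if not stripped.lstrip("-").isdigit():
        raise ValueError(f"non-numeric ECOUPON_ID value: {value!r}")
    return int(stripped)

print(clean_numeric('"12345"'))  # → 12345
```

Anything that fails this kind of check is a candidate for the ORA-00911 "invalid character" error discussed in the thread.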
by rsaliah
Mon Feb 16, 2009 7:14 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: ORA-00911 error
Replies: 17
Views: 6969

OK - What type of stage is the source, and are you using OCI or ODBC to do the updates? Which are the key column(s) and which are the updates? Out of interest, what is the SQL type definition for column 3 on the stage used for the update?
by rsaliah
Mon Feb 16, 2009 6:48 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: ORA-00911 error
Replies: 17
Views: 6969

Looks like Oracle is complaining about invalid characters. Have you checked and validated your data against the stage column definitions? I'm guessing, because you haven't said, but it sounds like your data is read from a sequential file stage delimited by pipes. Have you specified that columns 3, 7 and 9 can be enclo...
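The parsing being described (pipe-delimited, with some columns enclosed in quotes) can be reproduced with Python's csv module; this is only an illustration of the principle, and the sample row is invented (the column numbers 3, 7 and 9 come from the post):

```python
import csv
import io

# A made-up pipe-delimited row with columns 3, 7 and 9 enclosed in
# double quotes, as the post asks about.
raw = 'A|B|"123"|D|E|F|"X"|H|"9.99"\n'

# Declaring the quote character strips the quotes during the read,
# which is what the Sequential File stage's "quoted" column setting does.
reader = csv.reader(io.StringIO(raw), delimiter="|", quotechar='"')
row = next(reader)
print(row[2])  # → 123
```

If the quotes are not declared, they travel downstream as data and an Oracle NUMBER column will reject them.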
by rsaliah
Tue Oct 30, 2007 9:48 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: how to convert a single row into multiple rows
Replies: 8
Views: 8506

Dude, you can pass your two columns through a Transform stage to create a multivalued column from the address field. You can do this by replacing the commas ',' with @FM ( change(Address,',',@FM) ). The output of this should then go to a hash file. When you output the hash file, just define the address...
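The effect of the change(Address,',',@FM) expression and the subsequent normalisation can be emulated in Python (a sketch only; the key "CUST1" and sample address are invented, and @FM is modelled as chr(254), the UniVerse field mark):

```python
# @FM is the UniVerse/DataStage field mark, chr(254).
FM = chr(254)

def to_multivalued(address):
    """Equivalent of change(Address, ',', @FM): commas become field marks."""
    return address.replace(",", FM)

def normalize(key, multivalued):
    """Reading the hash file with the column normalised yields one
    (key, value) row per field-mark-separated value."""
    return [(key, v) for v in multivalued.split(FM)]

mv = to_multivalued("12 High St,Flat 3,London")
print(normalize("CUST1", mv))
# → [('CUST1', '12 High St'), ('CUST1', 'Flat 3'), ('CUST1', 'London')]
```

This is the single-row-to-multiple-rows expansion the thread is after: one input row fans out to one output row per address component.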
by rsaliah
Tue Oct 30, 2007 9:13 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: UniData on Redhat
Replies: 1
Views: 1165

Guess there aren't that many Redhat installations pulling data from UniData out there. :(
by rsaliah
Tue Oct 30, 2007 9:03 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: How do i extract only data that is inside the commas
Replies: 3
Views: 1828

Re: How do i extract only data that is inside the commas

You can use a Sequential File stage and define the field separator as ',', or if you like you can parse the data through a Transform stage and use the FIELD function.
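The DataStage BASIC FIELD function mentioned here can be approximated in Python for anyone unfamiliar with it (an illustrative emulation, not the real implementation; the sample strings are made up):

```python
def field(string, delimiter, occurrence, num_fields=1):
    """Rough equivalent of DataStage BASIC FIELD(): return `num_fields`
    delimited substrings starting at the 1-based `occurrence`,
    e.g. FIELD("a,b,c", ",", 2) → "b".
    """
    parts = string.split(delimiter)
    picked = parts[occurrence - 1 : occurrence - 1 + num_fields]
    return delimiter.join(picked)

print(field("john,smith,london", ",", 2))  # → smith
```

So to pull out just the data between the commas, a derivation like FIELD(InputCol, ',', 2) selects the second comma-delimited piece.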
by rsaliah
Thu Oct 25, 2007 10:13 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: UniData on Redhat
Replies: 1
Views: 1165

UniData on Redhat

Guys, I'm working on migrating some DS projects from a SUN box to Redhat and have been having some trouble configuring access to some UniData servers for jobs using the UniData stage. Having messed around for a little while shipping ud52 files across from the SUN box with no joy, I noticed in the ins...
by rsaliah
Tue Feb 06, 2007 11:32 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Performance Problem
Replies: 10
Views: 4612

Sorry guys, I haven't coughed up for the full-on membership so I couldn't see all the suggestions, but thanks very much for your input. For your information, the file system on the prod box was set up with Oracle in mind, with the "forcedirectio" option set. This is what was causing the dramatic slowd...
by rsaliah
Mon Feb 05, 2007 11:25 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Performance Problem
Replies: 10
Views: 4612

This is really starting to confuse me. I've suspected the differences in the file system setup as the cause, so I've simplified my test to try to narrow it down and provide some proof. I now have two simple jobs. 1. A Transform generating 9 columns of data, writing directly to a hash file. ...
by rsaliah
Mon Feb 05, 2007 8:46 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Performance Problem
Replies: 10
Views: 4612

Thanks guys. 'lsfs' is an AIX command; the Solaris equivalent, I believe, is 'cat /etc/vfstab'. Having looked at the output of this, there are some differences. For example, on the slow box we have logging enabled and the file system type is 'ufs'; on the other box the type is 'vxfs'. I've no idea what these mean...
by rsaliah
Mon Feb 05, 2007 5:43 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Performance Problem
Replies: 10
Views: 4612

Thanks for the swift reply. I've had it confirmed that for the duration of the tests there are no other processes running on either box. With regard to the file systems, I've been told that they are set up the same; apart from the difference in size there are no other differences, though I wouldn't kno...
by rsaliah
Mon Feb 05, 2007 5:19 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash File Performance Problem
Replies: 10
Views: 4612

Hash File Performance Problem

Guys, DS version = 7.5.1.A, OS = SunOS 5.8. I could use some pointers for things to check. We have a typical environment setup with a development/test server and a separate production box. The problem I have is that I've noticed some differences in performance between the two servers. I'm not a hard...
by rsaliah
Thu Jul 20, 2006 8:26 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Abnormal termination received after stage
Replies: 10
Views: 4542

chulett wrote:So... is it still one job?
Yep :D
by rsaliah
Thu Jul 20, 2006 2:32 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Abnormal termination received after stage
Replies: 10
Views: 4542

Just in case anyone's interested... I had no joy running the SQL script as a before-stage call either; same problem. I did run it in TCL and it worked, so that confirmed there was no problem with the script. So what I then did was call the SQL script as a filter command in the output of a seq file...
by rsaliah
Mon Jul 17, 2006 7:55 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Abnormal termination received after stage
Replies: 10
Views: 4542

Craig, you're right, it should really be 2 jobs, but I guess I was just being lazy and thinking I could get away with it. The reason being, we have a DS process that maintains a summary table of load stats, which contains load start and end times, row counts, etc. But this process was put together at a ...