Search found 664 matches

by Teej
Mon Jun 06, 2016 2:54 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to insert into date
Replies: 10
Views: 5535

Correct. Dots, dashes, slashes, whitespace -- all get interpreted as "Something is there. Ignore it." Of course, there are certain reserved characters you can't use (e.g. %). There may be certain restrictions that can be imposed, but I will have to review the codebase again to make ab...
by Teej
Mon Jun 06, 2016 12:54 pm
Forum: General
Topic: Installation of IBM Datastage 11.3 failed
Replies: 11
Views: 8014

So you entered the following command:

cd `cat /.dshome`

and it does not find dsenv or ds.setenv?

And this is a UNIX-based platform, not Windows, correct?
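For reference, a minimal sketch of what that check amounts to. The helper name `check_dsenv` and the fallback to ds.setenv are illustrative, not part of any DataStage tooling; only the /.dshome convention comes from the post above:

```shell
#!/bin/sh
# /.dshome is written at install time and contains the DSEngine directory.
# This illustrative helper reports whether a given directory holds dsenv
# (or the older ds.setenv) without changing the current directory.
check_dsenv() {
  dshome=$1
  if [ -f "$dshome/dsenv" ] || [ -f "$dshome/ds.setenv" ]; then
    echo "found"
  else
    echo "missing"
  fi
}

# Typical usage on a real install:
#   check_dsenv "$(cat /.dshome)"
```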
by Teej
Mon Jun 06, 2016 12:51 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to insert into date
Replies: 10
Views: 5535

Use %yy rather than %yyyy, and you can't use the dashes in the format mask since they don't exist in your source string. Actually, that is not true: in the Parallel Engine, we do not pay attention to the actual character in the separator itself. We just assume there's a separator there of some format. S...
by Teej
Mon Jun 06, 2016 12:48 pm
Forum: IBM QualityStage
Topic: Regarding error message in AVI
Replies: 3
Views: 6815

Please verify your environment settings, ensuring that $PATH and $LD_LIBRARY_PATH (or $LIBPATH) point to where the lqtcr*.so library file is.
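As a quick sanity check, something like the helper below can confirm a directory is actually present as a component of a colon-separated search path. The function name and the example path are hypothetical; substitute the real location of the lqtcr*.so files on your system:

```shell
#!/bin/sh
# Illustrative helper: does a colon-separated path variable contain
# the given directory as an exact component?
path_contains() {
  dir=$1
  pathvar=$2
  case ":$pathvar:" in
    *":$dir:"*) echo "yes" ;;
    *)          echo "no"  ;;
  esac
}

# Example (hypothetical location of the AVI libraries):
#   path_contains /opt/IBM/AVI/lib "$LD_LIBRARY_PATH"
```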
by Teej
Mon Jun 06, 2016 12:46 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ORA-12541: TNS:no listener
Replies: 4
Views: 5874

It is indeed critical to define your Oracle path after your DataStage paths, because there are some files that Oracle uses that have the same name as DataStage's critical files.
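A sketch of what that ordering looks like in a dsenv-style shell fragment. Both install paths below are assumptions for illustration; only the "DataStage paths before Oracle paths" rule comes from the post above:

```shell
#!/bin/sh
# DataStage directories go first, Oracle after, so that on a filename
# collision DataStage's own binaries and libraries are found first.
DSHOME=/opt/IBM/InformationServer/Server/DSEngine   # assumed install path
ORACLE_HOME=/opt/oracle/product/12.1.0/dbhome_1     # assumed install path

PATH=$DSHOME/bin:$PATH:$ORACLE_HOME/bin
LD_LIBRARY_PATH=$DSHOME/lib:${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$ORACLE_HOME/lib
export PATH LD_LIBRARY_PATH
```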
by Teej
Mon Jun 06, 2016 12:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error reading input field
Replies: 2
Views: 2899

There are some situations where you need to ensure that the output column definitions precisely match the order of the SQL statement. You did define the output columns, right? RCP cannot work without a source -- and you need to define that source.
by Teej
Mon Jun 06, 2016 12:42 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Logic Needed
Replies: 7
Views: 4183

Or just use a lookup against a Row Generator with a list of reference data. The nice thing about the Row Generator is that you can use parameters in some cases to achieve what you want. You could also use a sequential file, if you can manage that (some customers do not like the idea of a random text file in the producti...
by Teej
Mon Jun 06, 2016 12:39 pm
Forum: General
Topic: failed to open project after running successfully in a loop
Replies: 2
Views: 3303

This sounds like a resource issue with the project folder. There are some tuning options starting at the UniVerse level and going down to the system level (including the filesystem). AIX is very fickle in terms of tuning options -- basically, they give an administrator enough rope to hang themselves ...
by Teej
Mon Jun 06, 2016 12:35 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: A generic ETL job requirement
Replies: 6
Views: 5219

Just use the message handler if you do not want to see the message. This is particularly frustrating since the Netezza Connector has some dependencies on the design-time osh code having the details for those columns (take a peek at the osh code sometimes -- huge mess of XML data in there for Connectors...
by Teej
Mon Jun 06, 2016 12:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage 8.7 compatibility with Hive 0.14 (Hortonworks)
Replies: 7
Views: 5603

Actually no. BDFS was in Beta form for 8.7, and any customers using 8.7 with BDFS must upgrade to 9.1 in order to get full IBM support.
by Teej
Thu Jun 02, 2016 11:21 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: A generic ETL job requirement
Replies: 6
Views: 5219

The Sequential File stage can do basic validation (there are certain settings you may need to enable to be stricter on data types, such as APT_IMPORT_ENFORCE_BOUNDED_LENGTH and APT_IMPORT_REJECT_INVALID_CHARS). Details: http://www.ibm.com/support/knowledgecenter/en/SSZJPZ_11.5.0/com.ibm.swg.im.iis.ds...
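For example, a sketch of enabling those two variables in a dsenv-style shell fragment. In practice they are usually set as project or job environment parameters in the Administrator client rather than in a shell; the comments paraphrase their intent:

```shell
#!/bin/sh
# Be stricter about imported string lengths (enforce the declared bound):
export APT_IMPORT_ENFORCE_BOUNDED_LENGTH=1
# Reject records containing characters that are invalid for the field type:
export APT_IMPORT_REJECT_INVALID_CHARS=1
```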
by Teej
Thu Jun 02, 2016 11:13 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: RCP error
Replies: 3
Views: 2687

That sounds like a defect to me. I cannot think of any reason why the Oracle Connector cannot interpret No as string[]. Maybe it's a limitation of the Oracle Connector in creating a table?

Does it work when you are inserting into an existing table?

-T.J.
by Teej
Thu Jun 02, 2016 11:10 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Datastage 8.7 compatibility with Hive 0.14 (Hortonworks)
Replies: 7
Views: 5603

Why 8.7? Why not upgrade to 11.5.0.1, which has a lot of additions for the Hadoop environment, including the ability to run PX jobs using Hadoop's YARN? There is also a number of additional supports for Hive in 11.5: http://www.ibm.com/support/knowledgecenter/en/SSZJPZ_11.5.0/com.ibm.swg.im.iis.data...
by Teej
Thu Jun 02, 2016 11:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Connector stage for stored procedures execution
Replies: 15
Views: 11218

Please have your administrator come here and see our responses. We have many decades of experience using DataStage (and I am one of the actual developers of the DataStage Parallel Engine). If we are saying the Stored Procedure stage is actively supported, by darn, maybe we are right. True, it would be n...
by Teej
Thu Jun 02, 2016 10:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Runtime Column Propagation memory usage
Replies: 4
Views: 3024

Haha. Okay, fine -- here's my answer: Regarding the RCP question -- RCP only allows columns to be migrated without being explicitly defined. It does not automatically increase memory usage (the increase in memory varies depending on the number of columns being migrated). It is actually the default...