Search found 15603 matches

by ArndW
Fri Sep 23, 2005 8:39 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Data Rounding Problems
Replies: 5
Views: 1205

Patonp,

You won't need to recompile the jobs; the rounding is performed at runtime.
by ArndW
Fri Sep 23, 2005 5:41 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Difference between datastage 7.5.1 and 7.5.1A versions
Replies: 3
Views: 1251

The README on the CD should go a long way toward answering those questions.
by ArndW
Fri Sep 23, 2005 5:40 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: What is Input buffer overrun?
Replies: 3
Views: 2437

The input buffer overrun just means that while parsing this column's value the engine read past the end of the string and got an overrun. Is the value "0" or "0."? There are decimal-specific attributes that allow all-zero values or certain string combinations, plus if I recall there are also APT_ environ...
by ArndW
Fri Sep 23, 2005 5:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: about lookup stage error
Replies: 6
Views: 1878

I think that you cannot have a date field that contains only spaces - what sort of value is that? It's like having a numeric field that you want to put all 'X' into. The date field should contain a valid date or a null value. I think you would also need to add a TRIM(inputfield) to your formula to...
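The advice in this excerpt (trim the field, then treat a blank or invalid value as null rather than a date) can be sketched outside DataStage. This is a hedged Python illustration, not DataStage BASIC; the "%Y-%m-%d" date format and the function name are assumptions for illustration only.

```python
# Hedged Python sketch of the post's advice: trim the field (the TRIM(inputfield)
# step), then return None (null) unless the value parses as a valid date.
from datetime import datetime, date

def normalize_date(value: str):
    """Return a date for valid input, or None for blanks and bad values."""
    value = value.strip()          # analogue of TRIM(inputfield)
    if not value:
        return None                # an all-spaces field becomes null
    try:
        # "%Y-%m-%d" is an assumed format for this illustration
        return datetime.strptime(value, "%Y-%m-%d").date()
    except ValueError:
        return None                # an invalid date also becomes null

print(normalize_date("   "))          # None
print(normalize_date("2005-09-23"))   # 2005-09-23
```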
by ArndW
Thu Sep 22, 2005 8:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: about lookup stage error
Replies: 6
Views: 1878

You can avoid this by only having valid date values in the dataset. Having all *'s means a data formatting error has occurred. You can use a default value in the column description, or check for a valid date - perhaps you want to have a null value in that column instead of a space.
by ArndW
Thu Sep 22, 2005 8:23 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Duplicates/Sorting question
Replies: 17
Views: 5884

Emilio, are you looking to remove duplicates on just the first column? That is what your sample seems to be doing. In that case, if your data is sorted, you can do it with a stage variable. The prerequisite for doing this in a normal stage is that the data is sorted by the columns you want to use ...
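The stage-variable technique the excerpt describes (on sorted input, compare each row's key with the previous row's key and keep only the first of each run) can be sketched generically. This is a hedged Python analogue, not DataStage code; the column layout is hypothetical.

```python
# Hedged Python analogue of the stage-variable approach: for input already
# sorted on the key column, keep only the first row of each run of equal keys.

def dedupe_first(rows, key_index=0):
    """Yield the first row for each distinct key (input must be sorted on it)."""
    previous_key = object()  # sentinel that never equals a real key
    for row in rows:
        if row[key_index] != previous_key:   # key changed: this is a new group
            yield row
            previous_key = row[key_index]    # remember the key, like a stage variable

sorted_rows = [("A", 1), ("A", 2), ("B", 3), ("B", 4), ("C", 5)]
print(list(dedupe_first(sorted_rows)))  # [('A', 1), ('B', 3), ('C', 5)]
```

As in the post, the whole approach depends on the input being pre-sorted by the deduplication columns; unsorted data would leave duplicates behind.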
by ArndW
Thu Sep 22, 2005 3:13 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Datastage and AutoCAD
Replies: 4
Views: 1175

Amos, about 10 years ago I remember trying to parse the format (for an AutoCAD-to-CADDs conversion) and I gave up as it was too complex. But if you export the files in VRML format, I think that is quite close to the syntax of XML - it is certainly worth a try, since VRML is a text-based format that...
by ArndW
Thu Sep 22, 2005 1:02 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job import/export
Replies: 8
Views: 1637

No, Ascential does not document any other method for moving objects across projects.
by ArndW
Thu Sep 22, 2005 12:53 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Projects with many jobs have slow graphical front-end resp.
Replies: 7
Views: 1893

Thanks Ray - I did enable file caching for public sharing at this site, but assumed that it applied only to job runs and not to system files. If the functionality is part of the engine then I think we might have a winner after all.
by ArndW
Thu Sep 22, 2005 12:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Binary Zero check
Replies: 2
Views: 825

Another method would be to use a transformer and check whether INDEX(In.Column,CHAR(000),1) is zero or nonzero. The ICONV(String,'MCP') will replace nonprintable characters with '.'. Since you don't know which column contains these values, you could try reading the whole record as just one column, strip...
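The two BASIC idioms in the excerpt can be mirrored generically: INDEX(In.Column,CHAR(000),1) amounts to searching the field for a NUL byte, and the 'MCP' conversion amounts to replacing nonprintable characters with '.'. This is a hedged Python analogue, not the DataStage functions themselves; names are illustrative only.

```python
# Hedged Python analogues of the BASIC checks described above.

def contains_binary_zero(value: bytes) -> bool:
    """True if the field contains a NUL (binary zero) byte,
    like INDEX(In.Column, CHAR(000), 1) being nonzero."""
    return b"\x00" in value

def mask_nonprintable(value: bytes) -> str:
    """Replace nonprintable bytes with '.', like the 'MCP' conversion."""
    return "".join(chr(b) if 32 <= b < 127 else "." for b in value)

record = b"abc\x00def\x07"
print(contains_binary_zero(record))  # True
print(mask_nonprintable(record))     # abc.def.
```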
by ArndW
Thu Sep 22, 2005 12:40 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to process/debug 32 million records in parallel job
Replies: 1
Views: 761

It would seem that the segmentation violation happens as your job's data volume increases, although normally it shouldn't. Does it happen at the same input record and player each time? If not, then there might be system conditions affecting the run as well. At about which record does this happen (i.e. 1...
by ArndW
Wed Sep 21, 2005 8:27 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Projects with many jobs have slow graphical front-end resp.
Replies: 7
Views: 1893

I plan on firing up a daily job to check for RT_STATUS files without indices. But I am not sure that this is really going to work all that well overall. I think I need to research what other files are used and, as you surmised, not SELECTed with indices. Too bad that we don't have memory files, or t...
by ArndW
Wed Sep 21, 2005 8:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Datastage and AutoCAD
Replies: 4
Views: 1175

AutoCAD files are sequential files with very specific formats; you would need a specific driver to parse the file contents, but nothing special to just read in the binaries. This is the same thing as asking about reading MSMoney or Excel files - they can be read but not necessarily understood.
by ArndW
Wed Sep 21, 2005 8:15 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Named Pipes
Replies: 4
Views: 1167

I am not sure about version 6, but from 7 onwards you can specify to DataStage that a sequential file is a pipe, and you will have the option of specifying read and write timeouts explicitly. Before that was in place, the only signal a reader would get was the writer closing the pipe; and some...
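The reader-side timeout the excerpt describes can be illustrated generically on POSIX: open the named pipe non-blocking and use select() to bound how long the read waits, instead of blocking until the writer closes the pipe. This is a hedged sketch in Python, not how DataStage implements its pipe support; the path and buffer size are examples.

```python
# Hedged POSIX sketch of a read timeout on a named pipe (FIFO).
import os
import select

def read_pipe_with_timeout(path: str, timeout_seconds: float) -> bytes:
    """Read whatever is available on the pipe, or b'' if nothing arrives in time."""
    # O_NONBLOCK lets the open succeed even before a writer has attached.
    fd = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
    try:
        # select() waits until the fd is readable, or the timeout expires.
        ready, _, _ = select.select([fd], [], [], timeout_seconds)
        if not ready:
            return b""          # timed out: no data from the writer yet
        return os.read(fd, 4096)
    finally:
        os.close(fd)
```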
by ArndW
Tue Sep 20, 2005 1:52 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Projects with many jobs have slow graphical front-end resp.
Replies: 7
Views: 1893

I did a couple of tests today, amongst which I put AKs (alternate key indices) on field F1 of all 2000 RT_STATUS files, and that did make a difference, but not enough. I'm doing timing tests and file I/O (converted the dynamic files to static hashed so I could FILE.STATUS them). The system seems to be doing very many file open...