Search found 15603 matches

by ArndW
Mon Aug 20, 2007 12:14 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Strange Datastage job behavior!
Replies: 12
Views: 3721

splayer - check again; if you don't find a &PH& directory then perhaps you (or someone else) deleted it. Do you have a D_&PH& entry in the project?

You need to execute "cd \&COMO\&" at the command line, as UNIX treats the ampersand as a special character.
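The quoting matters because an unescaped & tells the shell to run the command in the background. A minimal sketch (&COMO& is just the directory name from this thread; single quotes work as well as backslashes):

```shell
# An unquoted ampersand backgrounds the command, so quote the name.
mkdir '&COMO&'            # create a directory whose name contains ampersands
cd '&COMO&' && pwd        # single quotes stop the shell interpreting '&'
cd .. && rmdir '&COMO&'   # clean up
# Equivalent with backslash escapes: cd \&COMO\&
```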
by ArndW
Sun Aug 19, 2007 11:39 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Max no of jobs that can run in parallel ?
Replies: 1
Views: 681

The maximum number of concurrent jobs that can run depends upon so many different variable factors that it is impossible to give a number. It can range from very few to several hundred or more, depending upon your configuration. There is no built-in mechanism to effectively form a queue for jobs,...
by ArndW
Sun Aug 19, 2007 11:16 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Passing max number of warnings to an individual job in a seq
Replies: 2
Views: 791

Unfortunately the maximum number of warnings a job will allow is not part of the Job Activity stage. You can set that value when you start the job via the command-line dsjob program, though.
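The dsjob route looks something like the line below (project and job names are placeholders, and you should check the option list for your release, but the -warn option on -run sets the warning limit):

```shell
# Sketch only: MyProject and MyJob are hypothetical names.
# -warn 50 aborts the run once 50 warnings have been logged.
dsjob -run -warn 50 -jobstatus MyProject MyJob
```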
by ArndW
Thu Aug 16, 2007 5:37 pm
Forum: General
Topic: Delete Job Difference
Replies: 7
Views: 1980

Using the TCL commands directly to delete entries in the DataStage hashed files that contain job metadata is a prelude to finding out if your project backups have worked. You will corrupt jobs and/or the whole project by deleting things from the DS_JOBS (or DS_JOBOBJECTS). You should go and edit the...
by ArndW
Thu Aug 16, 2007 3:51 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Data process in the HL7 v2.x and EDI format (Yes or No).
Replies: 2
Views: 1392

I was curious about v2 of HL7, so I searched Google for "HL7 DataStage" and got a lot of hits, several of which were on IBM pages.
by ArndW
Thu Aug 16, 2007 3:47 pm
Forum: General
Topic: Delete Job Difference
Replies: 7
Views: 1980

The "\\NextNumber" entry in the DS_JOBS table holds the next job number. Hashed file records can have any format, and it happens that field 1 is used to store the next number in that record, but it is also the column used for JOBNO (in the actual job records), which is why this query returned this r...
by ArndW
Thu Aug 16, 2007 3:43 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: How to handle French Accent characters with CFF component?
Replies: 7
Views: 9237

Another question: how many binary columns do you have in your file? If just a few, you can let FTP take care of the EBCDIC-to-ASCII conversion, then reconvert those few COMP columns back to EBCDIC and decode them using the SDK routines into their actual numeric values.
by ArndW
Thu Aug 16, 2007 3:33 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hash Files
Replies: 3
Views: 1103

There is no tuning information stored anywhere. The current hashed file type, modulo and other attributes are stored in the physical file itself in binary form. The DICTionary only describes the format of the data in the file, not the file's format. You can get a hashed file's information display us...
by ArndW
Thu Aug 16, 2007 3:30 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Partitioning Method in Sort Stage
Replies: 7
Views: 1667

It will hash the 5 values to 2 nodes; in your example one node will get 2/5 of the data and the other 3/5. You can't get a better distribution unless you use round-robin, but then you would need to repartition again downstream for the sort, so that approach is no good.
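The skew is inherent to hashing a low-cardinality key. A toy shell sketch (cksum's CRC standing in for DataStage's partitioner, which it is not) shows 5 distinct keys landing on 2 nodes, where the best possible split is 3/2:

```shell
# Toy illustration only: cksum stands in for the real hash partitioner.
# With 5 distinct key values and 2 nodes, the best possible split is 3/2.
for key in A B C D E; do
  node=$(printf '%s' "$key" | cksum | awk '{print $1 % 2}')
  echo "key $key -> node $node"
done
```

Whatever the hash function, 5 keys modulo 2 nodes cannot split more evenly than 3 and 2, and an unlucky hash can do worse.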
by ArndW
Thu Aug 16, 2007 3:25 pm
Forum: General
Topic: SFTP
Replies: 1
Views: 944

The same as for normal FTP: you can only push the file from UNIX to Windows if you have an sFTP server running on Windows; otherwise you will have to pull the file from the Windows side.
by ArndW
Thu Aug 16, 2007 3:24 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Remove duplicate in server jobs
Replies: 4
Views: 1241

I agree that deduplicating via hashed files is not the most efficient approach. Sort the incoming data on the fields you need for detecting duplicates, then use two stage variables in a Transformer stage (one to hold the result of comparing this record with the last, the other to store the last record).
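The same sort-then-compare-with-previous pattern, sketched outside DataStage with sort and awk (the input data here is made up):

```shell
# Sorting brings duplicates together; awk keeps a line only when it
# differs from the previous one - the same role the two stage
# variables play in the Transformer.
printf 'b\na\nb\nc\na\n' | sort | awk 'NR == 1 || $0 != prev { print } { prev = $0 }'
```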
by ArndW
Thu Aug 16, 2007 3:17 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Sybase Stage error
Replies: 12
Views: 2518

"...and As Arnd suggested added some records to the current table and the issue is remaining same..." The result you get after adding records is important in diagnosing your error - 'same' doesn't help at all. Do you have just 10,000 records now (implying a limit is set somewhere), 10,001 (implying a...
by ArndW
Thu Aug 16, 2007 3:14 pm
Forum: General
Topic: Delete Job Difference
Replies: 7
Views: 1980

The TCL DELETE command is not documented and, as you noted, does not really delete a job - it deletes only the entry in the DS_JOBS hashed file. I don't know where you got this from, but perhaps it came from some post here on DSXchange and is almost certainly taken out of context. This is one reason t...
by ArndW
Wed Aug 15, 2007 11:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to batch import job using DS Manager
Replies: 1
Views: 586

You can have multiple jobs in one .dsx export file. One instance of the Manager can only import one file at a time, but you can run multiple instances of the Manager, each importing a different .dsx file.