Search found 53125 matches

by ray.wurlod
Wed Sep 03, 2008 5:12 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Usage of UVBackup and UVRestore
Replies: 37
Views: 14475

For some reason it is reading the backslash characters as escape characters. Use forward slashes. Windows doesn't care.

Code:

VERIFY.SQL SCHEMA D:/DATA_DS_Repository/DATA_DEVE 
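The same backslash-as-escape pitfall exists in most languages; a minimal Python sketch (the paths below are hypothetical stand-ins, not the VERIFY.SQL syntax itself):

```python
# "\n" and "\t" inside an ordinary string literal are escape
# sequences, so this path is silently mangled.
bad = "C:\new\temp"
assert "\n" in bad and "\t" in bad   # a newline and a tab crept in

# Forward slashes sidestep escaping entirely; Windows accepts them.
good = "C:/new/temp"
assert "\\" not in good

# A raw string is the other common fix when backslashes must stay.
raw = r"C:\new\temp"
assert "\\" in raw and "\n" not in raw
```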
by ray.wurlod
Wed Sep 03, 2008 3:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: HOw TO REMOVE Read ONLY ACCESS
Replies: 6
Views: 1394

There may be an icon on your desktop.

Otherwise open Internet Explorer (version 6 or higher) and put the name and port number (9080) of Information Server into the address bar.

Code:

http://hostname:9080
by ray.wurlod
Wed Sep 03, 2008 3:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Mainframe file data not viewing correctly
Replies: 6
Views: 1419

It is not a problem.

You have 11 digits to the left of the "." and zero digits to the right of the ".". Therefore, the internal binary representation is of a Decimal(11,0) number.

That's how it works. Live with it. It is a valid representation of the decimal number.
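The precision/scale reading of Decimal(11,0) can be checked with Python's decimal module; this illustrates the numeric type only, not DataStage's internal packed representation:

```python
from decimal import Decimal

# Decimal(11,0): up to 11 significant digits, none to the
# right of the decimal point.
value = Decimal("12345678901")           # 11 digits, scale 0

sign, digits, exponent = value.as_tuple()
assert len(digits) == 11                 # precision: 11 digits
assert exponent == 0                     # scale: 0 digits after the "."
assert value == Decimal("12345678901.")  # the "." adds nothing
```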
by ray.wurlod
Wed Sep 03, 2008 3:40 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Usage of UVBackup and UVRestore
Replies: 37
Views: 14475

Forget UV.ACCOUNT for now. VERIFY.SQL has told you something you were able to establish by other means - that there is no record for the project in UV_SCHEMA (and therefore none in other system tables). Try specifying the pathname of the project:
VERIFY.SQL SCHEMA C:\path\projectdir
This is still a ...
by ray.wurlod
Wed Sep 03, 2008 3:38 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Data.30 in my input Data
Replies: 19
Views: 5064

No. The extraneous files in the hashed file directory will continue to prevent its being used as a hashed file. And those files presumably contain records that were intended to be written to the hashed file. Hashed files are not internal to jobs; they are external objects. Of course you could comp...
by ray.wurlod
Wed Sep 03, 2008 2:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: HOw TO REMOVE Read ONLY ACCESS
Replies: 6
Views: 1394

It's completely different in version 8, which is why I told you to use the Web Console for Information Server.
by ray.wurlod
Wed Sep 03, 2008 2:54 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Help needed with hashed files
Replies: 3
Views: 1208

Re: Help needed with hashed files

"Before I start explaining the issue with the hashed files, I would like somebody to confirm that there is no way to extract a value from a text file and store it into a job parameter within the job itself?"

Confirmed. Use just a Sequential File stage. Use stage variables in the Transformer stage to ...
by ray.wurlod
Wed Sep 03, 2008 2:50 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Data.30 in my input Data
Replies: 19
Views: 5064

1. Create a .Type30 file. This needs to be an empty file.
echo > .Type30
2. Create a "directory file" in DataStage.
CREATE.FILE TempDirX 19
This creates a directory that is a subdirectory in your project directory.
3. Using an operating system MOVE command, move all the illegal files from the hashed ...
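The same repair can be sketched in Python; the directory names here (HashedFileDir, TempDirX) are placeholders for your own, and this mimics rather than replaces the CREATE.FILE and MOVE commands:

```python
import os
import shutil

hashed_dir = "HashedFileDir"   # placeholder for the real hashed file directory
temp_dir = "TempDirX"          # placeholder for the holding directory
os.makedirs(hashed_dir, exist_ok=True)
os.makedirs(temp_dir, exist_ok=True)

# Step 1: the empty hidden .Type30 file marks the directory
# as a type-30 (dynamic) hashed file.
open(os.path.join(hashed_dir, ".Type30"), "w").close()

# Step 3: move everything except the three legitimate files out.
keep = {"DATA.30", "OVER.30", ".Type30"}
for name in os.listdir(hashed_dir):
    if name not in keep:
        shutil.move(os.path.join(hashed_dir, name), temp_dir)

assert set(os.listdir(hashed_dir)) <= keep
```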
by ray.wurlod
Wed Sep 03, 2008 1:08 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Data.30 in my input Data
Replies: 19
Views: 5064

SOMEONE has put another file in the hashed file directory (possibly by specifying the hashed file directory in the pathname in a Sequential File stage), or removed the hidden file .Type30 from the hashed file directory. A hashed file directory must contain precisely the three files DATA.30, OVER.30 ...
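That "precisely three files" rule is easy to turn into a quick health check; a hypothetical Python sketch, assuming the third required file is the hidden .Type30 marker:

```python
import os

REQUIRED = {"DATA.30", "OVER.30", ".Type30"}

def is_healthy_hashed_dir(path):
    """True only if the directory holds exactly the three expected files."""
    return set(os.listdir(path)) == REQUIRED

# Build a throwaway example directory to demonstrate.
demo = "DemoHashedFile"
os.makedirs(demo, exist_ok=True)
for name in REQUIRED:
    open(os.path.join(demo, name), "w").close()
assert is_healthy_hashed_dir(demo)

# One stray file is enough to break it.
open(os.path.join(demo, "stray.txt"), "w").close()
assert not is_healthy_hashed_dir(demo)
```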
by ray.wurlod
Tue Sep 02, 2008 11:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reject in XML input stage
Replies: 16
Views: 4290

Premium membership is not expensive, at less than 30c (Rs 12) per day. Premium membership is one of the ways that the hosting and bandwidth costs of DSXchange are defrayed. If Craig, or any of the five premium posters, were to accede to your request, then this would set a precedent that would under...
by ray.wurlod
Tue Sep 02, 2008 11:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to find the files being used in which all jobs?
Replies: 6
Views: 1865

I prefer to use Usage Analysis on the table definition imported from the file. But, then, I am rigorous in preserving the links between the table definitions and the jobs that use them. Are you?
by ray.wurlod
Tue Sep 02, 2008 11:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: wrong number of parameters
Replies: 7
Views: 4198

At one level it's correct because it works. However, you have not ascertained why what you tried earlier did not work. Knowing this would perhaps allow you to create more efficient jobs in future.
by ray.wurlod
Tue Sep 02, 2008 11:31 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Server Funnel
Replies: 9
Views: 2594

This can cause problems if one file has more rows than the other. The Link Collector waits for the never-to-arrive next row, and eventually fails with a timeout error. Better is to use a filter command in your Sequential File stage that creates a stream of all lines from the two files. I prefer TYPE as t...
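A minimal Python sketch of the idea, using in-memory files as stand-ins: concatenating the two inputs into a single stream (analogous to a TYPE filter command) means no reader ever blocks waiting on the shorter file:

```python
import io
import itertools

# Stand-ins for the two sequential files; one is longer than the other.
file_a = io.StringIO("a1\na2\na3\n")
file_b = io.StringIO("b1\n")

# One stream of all lines from both files, in order.
combined = list(itertools.chain(file_a, file_b))
assert combined == ["a1\n", "a2\n", "a3\n", "b1\n"]
```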
by ray.wurlod
Tue Sep 02, 2008 10:24 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Time in job compilation
Replies: 1
Views: 646

No idea. Where are the bottlenecks? What (more precisely than "medium complexity") is in the job design? Lots of Build or Transformer stages? These take longer to compile because of the need to create C++ source code and compile and link that in a way that is callable from the main step (job) flow. ...
by ray.wurlod
Tue Sep 02, 2008 10:22 pm
Forum: General
Topic: port used by datastage
Replies: 4
Views: 8311

31538 is used by DataStage server to listen for DataStage clients. You will definitely need to get that one opened.

10000 and 11000 (by default) are used in parallel jobs for inter-process communication.

13400 and 13401 are used by the Java job monitor application.
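A hedged Python sketch of how you might verify that a given port (31538, say) has actually been opened through the firewall; the host name in the example is a placeholder:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the DataStage client listener on a (placeholder) host.
# port_open("dsserver.example.com", 31538)
```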