Search found 4605 matches

by kduke
Sat Oct 16, 2004 8:45 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DS_JOBS corrupted
Replies: 8
Views: 7874

I think you hit 100%. At that point you corrupted your DS_JOBOBJECTS hash file. Sometimes it can fix itself, but you lost some records. After you run UVFIXFILE, always reindex like above. Some job designs may be lost. Hopefully you have good backups. I would try to compile your jobs. This should t...
by kduke
Fri Oct 15, 2004 8:18 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: how to split data from one field in to two fields
Replies: 3
Views: 1335

The field() function can split it on any single-character separator.

field(Col1, '|', 2)

will give you the second field using '|' as the separator.
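For readers more comfortable outside BASIC, the behavior of field() can be sketched in Python; this helper is an illustration of the semantics, not part of DataStage:

```python
def field(value, delimiter, occurrence):
    """Sketch of the BASIC field() function: return the Nth
    delimited substring (1-based), or '' when out of range."""
    parts = value.split(delimiter)
    if 1 <= occurrence <= len(parts):
        return parts[occurrence - 1]
    return ""

# field("a|b|c", "|", 2) returns "b", matching field(Col1, '|', 2)
```

Note the 1-based occurrence argument, which matches the BASIC convention rather than Python's 0-based indexing.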
by kduke
Fri Oct 15, 2004 5:43 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DW Modeling question/concept
Replies: 9
Views: 2081

Rollups over time are not possible the way I explained. You need to pick specific time frames like monthly, quarterly and annually. The speed gained by summary tables usually outweighs the cost of rollups on the fly. It is a trade-off. It depends on the number of transactions.
by kduke
Fri Oct 15, 2004 2:13 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DW Modeling question/concept
Replies: 9
Views: 2081

Ogmios, I think they wanted to calculate the ranking ahead of time and store it in some kind of summary table so the report runs faster. DataStage can do this. You need to think about the date ranges, like quarterly ranking or annual or both. In healthcare this is very common to rank doctors by how...
by kduke
Fri Oct 15, 2004 2:03 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sequential File name has MMDDYY
Replies: 6
Views: 1519

I would write to a standard file name and execute a DOS or UNIX command to rename the file as an after-job routine. It is very simple to do; several examples of this were posted this week.
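The rename step can be sketched in Python (the fixed file name and MMDDYY placement are assumptions for illustration; in practice this would be a DOS or UNIX command in the after-job routine):

```python
import os
from datetime import date

def rename_with_date(fixed_name):
    """Rename a fixed job output file to one stamped with
    today's date as MMDDYY, e.g. output.txt -> output_101504.txt."""
    stamp = date.today().strftime("%m%d%y")
    base, ext = os.path.splitext(fixed_name)
    new_name = f"{base}_{stamp}{ext}"
    os.rename(fixed_name, new_name)
    return new_name
```

Keeping the job itself pointed at one fixed name means the job design never changes; only the cheap rename step knows about the date.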
by kduke
Fri Oct 15, 2004 11:19 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: COMMON Variables
Replies: 2
Views: 818

A COMMON block is valid across one connection to UniVerse. Where DataStage breaks a connection, or creates a new connection within one job, is up to Ascential and has changed several times between versions. I would use a hash file or a sequential file to read and write these values. You could do this only in...
by kduke
Thu Oct 14, 2004 12:28 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Job Control
Replies: 5
Views: 1148

I think the issue may be that you are attaching to a job in two routines and detaching from the same job in only one of them, or not detaching somewhere. Either way you can lock up.
by kduke
Thu Oct 14, 2004 8:50 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Multi Server Query
Replies: 2
Views: 725

We have done this in SQL Server and Oracle. I assume this is not a problem in Sybase. If you create a dblink then you can use it in a user defined query.
by kduke
Wed Oct 13, 2004 3:23 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: copy project from one box to another
Replies: 4
Views: 1269

You probably need to remove the directory before copying its contents, and you cannot be in Designer when you copy these files. You will probably need to reconnect to the project, but it should work if the DataStage engine and this one project are in the same directory on both machines.
by kduke
Wed Oct 13, 2004 8:49 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Aborting Job when a Semaphor is set
Replies: 9
Views: 2337

Peter Hester did something similar to this in his batch automation. It is pretty easy in BASIC using ODBC to get the results you wanted. You have to start all your jobs through this process; otherwise you need to check manually whether any job is running as well. You cannot force DataStage to not ru...
by kduke
Wed Oct 13, 2004 8:43 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sequential files in the report format
Replies: 2
Views: 566

You would have to treat the whole row as one column, then use constraints to ignore the heading rows. In a second transformer, parse the row into columns using substring() or field().
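The same two-step idea, constrain away headings first, then parse the surviving rows, can be sketched in Python; the heading test and the '|' delimiter are assumptions for illustration:

```python
def parse_report(lines, delimiter="|"):
    """Treat each report row as a single column, drop heading
    and blank rows, then split the remaining rows into fields."""
    records = []
    for row in lines:
        # Constraint stage: skip blank rows and assumed heading rows
        # (here, rows starting with '-').
        if not row.strip() or row.startswith("-"):
            continue
        # Second transformer: split the whole-row column into fields.
        records.append(row.rstrip("\n").split(delimiter))
    return records
```

In DataStage the first pass would be a constraint in one transformer and the split a derivation in the next; the sketch just collapses both into one function.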
by kduke
Wed Oct 13, 2004 8:38 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Oracle control File(.ctl) for SQLLDR to Column Specification
Replies: 4
Views: 2040

Import all the columns in the target table, then delete the columns you do not need. Getting them in the correct order is a problem. There are too many undefined columns in the DS_METADATA file to write records into it directly. I have sorted these columns. I have created DDL from these fields. Ray has eve...
by kduke
Tue Oct 12, 2004 2:51 pm
Forum:
Topic: Saving Table Definitions in DataStage for MetaStage
Replies: 6
Views: 5464

Saving metadata in a job creates a record in DS_METADATA, nothing more. I am not sure, but I think MetaStage reads this from the job directly anyway. If you import a table definition, it is also stored in DS_METADATA. This is the hash file where DataStage stores this kind of information. Each job ...
by kduke
Tue Oct 12, 2004 2:42 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: USING COMMAND FROM DS ADMINISTRATOR
Replies: 1
Views: 464

Not really. This should be a parameter anyway. Is it a parameter? I would have to look it up, but I think this is stored in field 9. Most people do an export, then edit the export file and replace the path with the new path or parameter. I would export one job and then fix it in Designer an...
by kduke
Tue Oct 12, 2004 2:33 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Oracle control File(.ctl) for SQLLDR to Column Specification
Replies: 4
Views: 2040

DataStage should create most control files for most bulk loaders. It used to be a two-step process to generate a ctl file and then use it; it should now do all this at run time. It may not work well if you try to generate these on your own and then use them. We had some issues trapping errors when we used t...