Search found 4992 matches

by kcbland
Mon Apr 12, 2004 12:22 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: SQL exec before/after routine to truncate SQL Server table
Replies: 4
Views: 1104

From a private email: So what stage do I use for loading the Oracle table to a file in the first job? I understand I use BCP to go from the file to a SQL Server table in second job. Just use the OCI stage and send your output to a sequential stage. In fact, you should see that the data spools much f...
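The two-job design above can be sketched as follows. This is an illustrative Python helper that assembles the bcp invocation for the second job's load step; the table, file, and server names are placeholders, not from the original post, and only well-known bcp flags are used.

```python
def build_bcp_load(table, datafile, server, user, password, delimiter="|"):
    """Build a bcp command to bulk-load a delimited text file into SQL Server.

    -c : character (text) mode, matching a sequential-stage extract
    -t : field terminator used when the file was written
    """
    return [
        "bcp", table, "in", datafile,
        "-c",
        "-t", delimiter,
        "-S", server,
        "-U", user,
        "-P", password,
    ]
```

You would pass the resulting list to your scheduler or a subprocess call once the first job's OCI-to-sequential extract has finished writing the file.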
by kcbland
Mon Apr 12, 2004 12:06 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Clear a logfile from a command line?
Replies: 8
Views: 3659

If you currently have a Director open trying to view the log, you can't delete the job. You should kill your Director and get out of the job log. Now, to manually clear the log file, you have several choices and can search the forum for more detail. You can use DS Administrator to execute a DS comma...
by kcbland
Mon Apr 12, 2004 11:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Clear a logfile from a command line?
Replies: 8
Views: 3659

There is a last-ditch method of manually whacking the log file. It involves figuring out the job's internal number, then executing either a DS CLEAR.FILE statement, a DELETE FROM SQL statement, or a DOS erase command to hard-kill the log contents. You can search the forum, this has been recently cove...
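The three alternatives above can be sketched as command strings. This assumes the standard RT_LOG&lt;n&gt; naming convention for server job logs, where n is the job's internal number; the DOS variant's switches are an assumption about how you would wipe the file's contents and should be checked before use.

```python
def log_clear_commands(job_number):
    """Return the three alternative log-clearing commands for a job,
    given its internal number (looked up in DS_JOBS).
    """
    log = f"RT_LOG{job_number}"
    return {
        "tcl": f"CLEAR.FILE {log}",       # UniVerse TCL statement
        "sql": f"DELETE FROM {log};",     # SQL against the log table
        "dos": f"del /S /Q {log}",        # assumption: wiping the file's directory
    }
```

Whichever route you take, make sure no Director session has the log open first.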
by kcbland
Mon Apr 12, 2004 11:07 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Help needed to kill a "rogue" server job
Replies: 5
Views: 1581

Try recompiling the job; if you rebooted, there can't be any runaway threads. You have a "software" lock: basically, an invalid status value in a table. Recompile and that should fix it.
by kcbland
Mon Apr 12, 2004 11:06 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Clear a logfile from a command line?
Replies: 8
Views: 3659

Open your Director, but do not go to the log view. From the main job listing screen, simply clear the log from there. You avoid having to view the log, which is impossible when you have a runaway job generating zillions of error messages. I'm assuming you ran the job with unlimited warning messages.
by kcbland
Mon Apr 12, 2004 10:14 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Schedule of Job Sequencer from ControlM
Replies: 4
Views: 1266

Batch jobs will NEVER be unsupported. I don't know where you got that idea, but the vast number of APIs that have been created and documented requires the ability to use them. Writing your own job control is an important feature.
by kcbland
Mon Apr 12, 2004 10:00 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Slow loading
Replies: 3
Views: 890

What is your commit setting?
by kcbland
Mon Apr 12, 2004 9:05 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Schedule of Job Sequencer from ControlM
Replies: 4
Views: 1266

There's no difference in what type of job is started from the command line program dsjob. The argument that a scheduler has to run all tasks in the jobstream is invalid. Jobs are part of a jobstream, and the scheduler is best utilized when all it is responsible for doing is initiating the jobstream....
by kcbland
Mon Apr 12, 2004 7:34 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: SQL exec before/after routine to truncate SQL Server table
Replies: 4
Views: 1104

Your job is probably slow because you run at the speed of the slowest link in your process. If your job design is not real-time/near real-time then you should break your job into two pieces. The first job should extract the data and write it to a file, the second job should read that file and load t...
by kcbland
Mon Apr 12, 2004 7:30 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Problems setting large record size in hash files
Replies: 6
Views: 2817

This post may help clarify some things about hash files: The same hash file can have different performance results day-to-day because of fluctuations in the volume put into the hash. This is because of the way overflow is handled by the dynamic nature of the hash. If the hash file is presized, that...
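To make the presizing idea concrete, here is a rough minimum-modulus estimate for a dynamic hash file. The 2048-byte group size (GROUP.SIZE 1) and 80% split load are common defaults, not values from the original post, so treat this as a back-of-the-envelope sketch rather than an official formula.

```python
def minimum_modulus(rows, avg_record_bytes, group_bytes=2048, split_load=0.80):
    """Estimate MINIMUM.MODULUS so the expected data volume fits
    without triggering group splits (and the overflow they cause)."""
    usable = group_bytes * split_load            # bytes usable per group before a split
    groups = (rows * avg_record_bytes) / usable  # groups needed for the data volume
    return max(1, int(groups) + 1)               # round up; modulus must be >= 1
```

Presizing to something near this figure is what avoids the day-to-day variation described above, since the file never has to split groups mid-run.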
by kcbland
Mon Apr 12, 2004 7:17 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Run Datastage Server job from Command Line
Replies: 2
Views: 908

Read your manual or search this forum for the command line job interface, known by the executable name "dsjob". Someone just asked about using a 3rd party scheduler; it's all handled the same way through this executable.
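A minimal sketch of that dsjob invocation, assembled as a Python argument list. The -run and -jobstatus flags are the documented way to start a job and block until it finishes; project and job names here are placeholders.

```python
def build_dsjob_run(project, job, wait_for_status=True):
    """Build the dsjob command a scheduler would execute to run a job."""
    cmd = ["dsjob", "-run"]
    if wait_for_status:
        cmd.append("-jobstatus")  # wait for completion and return the job's status
    cmd += [project, job]
    return cmd
```

A scheduler such as ControlM would simply execute the resulting command line and act on its exit status.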
by kcbland
Mon Apr 12, 2004 5:40 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to implement the SCD type 3 in Datastage 7.0 ...
Replies: 3
Views: 1761

1. Read the source row. 2. Transform the source row. 3. Reference the row in the target table (either prestage to hash or do OCI lookup) 4. Update all non-SCD type 3 columns if they are different. 5. Compare SCD type 3 columns against their current value. If different, then move version 3 value to ve...
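Steps 4-5 above can be sketched as a small comparison routine. This assumes rows are dictionaries and that each tracked column has a paired previous-value column; the column names are illustrative, not from the original post.

```python
def apply_scd3(target_row, source_row, scd3_cols):
    """Update target_row in place from source_row.

    scd3_cols maps a current-value column to its previous-value column,
    e.g. {"region": "prev_region"}. Non-tracked columns are overwritten
    when different; tracked columns shift current -> previous first.
    """
    for col, value in source_row.items():
        if target_row.get(col) == value:
            continue                                      # no change, nothing to do
        if col in scd3_cols:
            target_row[scd3_cols[col]] = target_row.get(col)  # preserve prior value
        target_row[col] = value
    return target_row
```

In a real job this comparison would sit in a Transformer, with the target row coming from the hash-file or OCI lookup in step 3.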
by kcbland
Sun Apr 11, 2004 9:12 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: global parameters
Replies: 2
Views: 726

From a private message: Hi, if the job properties we declare from the Edit menu are local to a particular job, how do we declare global parameters (accessible to all jobs)? You use job control to obtain the appropriate values, and then whenever jobs are run you set the value accordingly. This c...
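One way the job-control approach plays out is passing the "global" values to each job at run time via dsjob's -param flag. The parameter names and values below are placeholders; in practice your job control would read them from a common source (a file, a table) before launching each job.

```python
def build_dsjob_with_params(project, job, params):
    """Build a dsjob command that sets parameter values at run time."""
    cmd = ["dsjob", "-run"]
    for name, value in sorted(params.items()):
        cmd += ["-param", f"{name}={value}"]  # one -param flag per parameter
    cmd += [project, job]
    return cmd
```

Because every job is started through the same wrapper, every job sees the same values, which is the closest thing to a global parameter the tool offers.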
by kcbland
Sun Apr 11, 2004 7:27 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: global parameters
Replies: 2
Views: 726

Job parameters are accessible from the Job Properties dialog box, invoked either from the Edit menu or from the yellow button on your toolbox that says Job Properties.