Search found 7201 matches
- Fri May 17, 2002 1:05 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Global Variable
- Replies: 14
- Views: 6628
DataStage manuals refer to the "COMMON" statement. In it, you can declare global variables that can be used during the execution of your DS job. Good luck. -----Original Message----- From: m_musumeci@inwind.it [mailto:m_musumeci@inwind.it] Sent: Friday, May 17, 2002 9:01 AM To: datastage-users@ol...
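The COMMON approach this post mentions can be sketched as a pair of DS BASIC routines; the routine and common-block names here are illustrative, not taken from the manuals:

```
* SetGlobalVar(Arg1) -- call from an early stage's derivation to store a value
      COMMON /JobGlobals/ GlobalValue
      GlobalValue = Arg1
      Ans = Arg1

* GetGlobalVar(Dummy) -- call from a later stage to read the value back
      COMMON /JobGlobals/ GlobalValue
      Ans = GlobalValue
```

Note that a named COMMON block lives in the job's process memory, so it is shared between the stages of one running job but not across separate jobs.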
- Fri May 17, 2002 1:01 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Global Variable
- Replies: 14
- Views: 6628
Global Variable
Hello, I would like to create and set a variable in one stage and then use the value of this variable in another stage, i.e. create a job "global" variable. Have you got some hints? And may I create a global variable in one job and use its value in another job? Thank you for your interest. Ma...
- Fri May 17, 2002 10:20 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: 5.2R1 Complex Flat File
- Replies: 1
- Views: 1246
Doreen, yes, we've run into the same situation at a German customer site (NT based). We've reported this to our support organisation and are awaiting a solution. Please report it to your Ascential support team! Best regards Klaus -----Original Message----- From: Doreen N Muller [mailto:Doreen.Muller@ipape...
- Fri May 17, 2002 3:18 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Conditional use of Hash Files?
- Replies: 4
- Views: 988
The ability to do conditional lookups in DataStage will be available in V6 of DataStage (currently in beta) by using a partition stage to partition the process/data flow based on the lookup criteria and later rejoining the process/data flow with a collector stage. Mike. -----Original Message----- Fro...
- Fri May 17, 2002 2:50 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Conditional use of Hash Files?
- Replies: 4
- Views: 988
Patricia, what you really need to do is seed the DataStage job with the max surrogate value in the table. The job can then use that seed value and increment that value each time it needs to assign another surrogate. The problem you will encounter is that if you want multiple instances of that job ru...
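The seeding idea above can be sketched as a DS BASIC routine that initialises itself once from a seed value (e.g. the result of a SELECT MAX(key), passed in as a job parameter) and then increments in memory; the names are illustrative:

```
* NextSurrogate(SeedValue) -- single-instance jobs only, as the post warns
      COMMON /SurrogateCounter/ Initialized, NextKey
      If NOT(Initialized) Then
         Initialized = @TRUE
         NextKey = SeedValue   ;* highest key already in the target table
      End
      NextKey += 1
      Ans = NextKey
```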
- Thu May 16, 2002 11:01 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: ODBC Update Question
- Replies: 3
- Views: 824
No, it does work as documented. However, it's slow (it has to fail the one action before trying the other), and SQL Server tries to be too helpful in generating error conditions/messages that DataStage detects, causing the abort behaviour you described. It works fine into other databases. "Bidondo, John" cc...
- Thu May 16, 2002 10:44 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: ODBC Update Question
- Replies: 3
- Views: 824
Thanks, Raymond. I assume the reference lookup is to identify records already present in the table, in order to distinguish between an Insert and an Update? Should I assume the Update Action of "Update Existing or Insert New" simply doesn't work as documented? -----Original Message----- From: Raymond Wu...
- Thu May 16, 2002 10:23 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Do an insert and update to the same record
- Replies: 0
- Views: 387
What you are proposing is certainly possible. However, a better approach would be to have two links into the target table, one to do inserts (for the first row returned from the reference lookup), the other to do the subsequent updates. The secret to getting it right is to be 100% certain of the logi...
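The first-row-insert / subsequent-updates split described above can be sketched with transformer stage variables and link constraints (names are illustrative, and the input is assumed sorted by key):

```
* Stage variables, evaluated in this order for each input row:
*   svIsFirst : in.KEY <> svLastKey
*   svLastKey : in.KEY            ;* initial value ""
*
* Insert link constraint:  svIsFirst
* Update link constraint:  Not(svIsFirst)
```

Because svIsFirst is evaluated before svLastKey is refreshed, it is true exactly once per key group.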
- Thu May 16, 2002 10:21 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: ODBC Update Question
- Replies: 3
- Views: 824
The best (performance) solution is a model with two links to the database, one to perform inserts, the other to perform updates. For this to work you also need to perform a reference lookup on the same table, or on a hashed file copy of it. If the latter, the hashed file must be maintained by the job ...
- Thu May 16, 2002 10:08 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Merging Input Files
- Replies: 4
- Views: 610
Why not pre-process (perhaps via a before-job or before-stage call to DSExecute) using a command such as: cat file1 file2 > file3 Then DataStage can process file3. -----Original Message----- From: Regu.Saliah@astrazeneca.com [mailto:Regu.Saliah@astrazeneca.com] Sent: Thursday, May 16, 2002 10:07 AM To...
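A before-job subroutine doing the concatenation via DSExecute might look like the following sketch (the file paths are placeholders for your own):

```
* Before-job sketch: build file3 before any stage opens it
      Cmd = "cat /data/file1 /data/file2 > /data/file3"
      Call DSExecute("UNIX", Cmd, Output, SystemReturnCode)
      If SystemReturnCode <> 0 Then
         Call DSLogFatal("Concatenation failed: " : Output, "BeforeJob")
      End
```

DSExecute returns the command's captured output and exit status, so the job can abort cleanly if the shell command fails rather than processing a partial file3.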
- Thu May 16, 2002 5:32 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Merging Input Files
- Replies: 4
- Views: 610
You can output the results of the first source to a text file and then APPEND the other source files one at a time to the text file. I do this with 3 source files and build a 1.3 million row text file in 6 minutes. We have 3 other jobs loading similar amounts of rows as well. Then we BCP these into ou...
- Thu May 16, 2002 5:25 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: ODBC Update Question
- Replies: 3
- Views: 824
ODBC Update Question
Hi, I have a job that selects data from an Oracle table via an ODBC stage, passes the data through a Transform stage, then tries to insert new rows or update existing rows in an identical table in MS SQL Server via another ODBC stage. In the Update Action for the SQL Server ODBC stage I've tried Inser...
- Thu May 16, 2002 4:12 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Merging Input Files
- Replies: 4
- Views: 610
If you only append one file after another, why do you need a hash file? Just read the source file and write to the target file with the update action "Append to existing file". We have experience joining files using a combined key, where we have to rearrange the multi-fields to the top first; when it joins, it run a...
- Thu May 16, 2002 3:36 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Merging Input Files
- Replies: 4
- Views: 610
Regu asked: > Guys, > > I've got a couple of source files that I want to concatenate together > (i.e. append one to the bottom of the other) and process them as one. > I've tried > doing this by loading them into a Hashed File but the key fields are not > next to each other or at the top, so to over c...
- Thu May 16, 2002 3:20 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Conditional use of Hash Files?
- Replies: 4
- Views: 988
We went with internal DataStage counters to generate surrogate keys, rather than use a sequence, as it performs better and makes life a lot simpler (i.e. it solves the conditional problem you are having). If you can figure it out, I would imagine there is a way to call the database to get a sequence ...
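If your installation ships the SDK key-management routines, the counter approach described above reduces to a one-line derivation on the key column (the sequence name is whatever you choose):

```
* Derivation of the surrogate key column:
KeyMgtGetNextValue("MyDimensionKey")
```

KeyMgtGetNextValue persists its counters in a hashed file within the project, which is why it avoids the per-row database round-trip that a database sequence would cost.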