Hi all
Is it possible to connect to SQL Server with DataStage?
We are running DataStage 5.2r1 on Solaris 8.
Thanks
Denzil
Search found 7201 matches
- Wed Jul 17, 2002 11:25 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Connecting to SQL Server
- Replies: 2
- Views: 415
- Wed Jul 17, 2002 11:09 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: A quick and easy question! Reply
- Replies: 3
- Views: 701
But that would stop the whole row going in? Not just the unwanted null field value? Vicki (quoting Raymond Wurlod, 16-Jul-2002, to datastage-users, Subject: Re: A quick and easy question!) All you need is a constraint on the output from the Transformer s...
- Wed Jul 17, 2002 10:07 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
Hi, I think you can follow different ways. 1. Hash file (probably static and 64BIT, as Ken said). You can improve write performance by "playing" with some write-cache environment variables (in the dsenv file) such as DS_MMAPSIZE, DS_MMAPPATH and DS_MAXWRITEHEAP if you are on a UNIX box. Also there is a...
- Wed Jul 17, 2002 7:55 am
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
Hi Karthik, is the Oracle database source running on the same DataStage box? And the target? Do you have a sense of whether the bottleneck is reading or writing the data? It is really strange that simply reading data and loading it into another Oracle DB takes 36 hours. Or are you applying heavy transformation ...
- Tue Jul 16, 2002 10:55 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: ADP, PeopleSoft or SAP stage
- Replies: 1
- Views: 446
Yes, such stage types exist. Due to licensing issues, they are chargeable. A search on the Ascential web site will prove fruitful: go to the site and use the Search capability. Raymond Wurlod Trainer, Asia-Pacific Region IBM Informix Education Department Level 22, 60 City Road, Southbank Vic 3006 Australia Telep...
- Tue Jul 16, 2002 10:49 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: ADP, PeopleSoft or SAP stage
- Replies: 1
- Views: 446
ADP, PeopleSoft or SAP stage
Hi All!
Does someone know if DataStage has stages for accessing HR/Payroll systems such as PeopleSoft, ADP or SAP? If so, is there any documentation floating out there...?
Just curiosity...
Thanks
Christian
- Tue Jul 16, 2002 10:34 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
Given the remainder of the thread, I discern that all you want is a staging area. In this case I would recommend a sequential file. There's no need to carry the overheads of a hashed file for what you're trying to do. Since Ken asked, the syntax for creating a hashed file is the standard syntax with t...
- Tue Jul 16, 2002 10:28 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: A quick and easy question!
- Replies: 6
- Views: 874
All you need is a constraint on the output from the Transformer stage that is feeding the table. The constraint expression has the form Not(IsNull(column)). Raymond Wurlod Trainer, Asia-Pacific Region IBM Informix Education Department Level 22, 60 City Road, Southbank Vic 3006 Australia Telephone: +...
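Raymond's constraint (pass a row only when the column is non-null, dropping the whole row rather than just the field) can be sketched outside DataStage; this is an illustrative Python analogue, not actual DataStage code, and the row and column names are made up:

```python
# Minimal sketch of a Transformer-style output constraint: only rows whose
# "column" value is non-null pass through to the output link. A row with a
# null value is dropped entirely, which is exactly Vicki's concern above.
rows = [
    {"id": 1, "column": "A"},
    {"id": 2, "column": None},   # fails the constraint; the whole row is dropped
    {"id": 3, "column": "C"},
]

def not_is_null(row, col):
    """Analogue of the constraint expression Not(IsNull(column))."""
    return row[col] is not None

passed = [r for r in rows if not_is_null(r, "column")]
```

In DataStage terms, `passed` is what reaches the table on the output link; there is no way for the constraint to pass a row while blanking only the offending field.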
- Tue Jul 16, 2002 10:23 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
Karthik, to better the performance, one needs to make enhancements at the bottleneck... If you are just reading from a source and then writing to a target, performance will be slow if either the source is throwing out records slowly or the target is inserting records slowly. A hash file may not be the answ...
- Tue Jul 16, 2002 10:21 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
No, there are no aggregations. It is a simple read from an Oracle table and insert into the target Oracle table. But the job runs more than 36 hours for 15 million records. That's the reason why we are taking the hash approach. Please give some tips. Thanks Karthik -----Original Message----- From: Galemmo,Ni...
- Tue Jul 16, 2002 10:15 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
Why are you writing to a hash file? Is it to eliminate duplicates? Do aggregations? What is the performance reason for doing this? Why not write to a sequential file stage... or better yet, the Oracle bulk load stage. -----Original Message----- From: Kandamuri, Karthik [mailto:Karthik.Kandamuri@awi....
- Tue Jul 16, 2002 10:09 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
Ken: Thanks very much for the suggestion. But my scenario is as follows: 1. I am not creating the hash file for lookup! 2. For performance I am converting an Oracle table into a hash file. 3. Read the hash file as source and insert all records into the target Oracle table. 4. The job has a problem when I am cr...
- Tue Jul 16, 2002 9:51 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Hash File Creation
- Replies: 21
- Views: 4025
I could almost guarantee that you have exceeded the 2.2 gigabyte size limitation on a 32-bit hash file. DataStage uses 32-bit hash files by default when creating them. Chances are, if you looked at the hash file in UNIX, you would see that the file is 2.2 gigs. 15,500,000 rows * 150 by...
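Ken's arithmetic is easy to check. The row count and per-row size are from the thread; the hard ceiling used below is the 2^31-byte offset limit of a 32-bit file (about 2.1 GB, loosely the "2.2 gigs" he mentions), which is an assumption about the exact limit rather than a figure from the post:

```python
# Rough size estimate for the hash file Karthik is building.
rows = 15_500_000
bytes_per_row = 150              # average row size assumed in the thread

estimated_bytes = rows * bytes_per_row   # total data volume
limit_bytes = 2**31                      # 32-bit file offset ceiling (assumed)

estimated_gb = estimated_bytes / 1024**3
over_limit = estimated_bytes > limit_bytes
```

With these figures the file needs roughly 2.3 billion bytes, which is past the 32-bit ceiling, consistent with Ken's diagnosis and the recommendation to create the file 64BIT.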
- Tue Jul 16, 2002 6:08 pm
- Forum: Archive of DataStage Users@Oliver.com
- Topic: Unpredictable Errors in Datastage version 4.0
- Replies: 1
- Views: 370
Hi Dinesh, I haven't seen this issue for quite a while, but it has happened in the past. Something has been corrupted in the internal workings of the job. Per tech support there is no way to determine where the problem lies. If you're lucky, copying the job works. If you're not lucky you have to re-bui...