Search found 62 matches
- Tue Sep 04, 2012 3:13 pm
- Forum: General
- Topic: How to tell in which job a column gets loaded
- Replies: 3
- Views: 1725
- Tue Sep 04, 2012 3:08 pm
- Forum: General
- Topic: How to tell in which job a column gets loaded
- Replies: 3
- Views: 1725
You have the 'Where Used' tool in the Manager that can help with that, provided you have managed your metadata properly. However, I've also gone the route you mentioned, but I took advantage of some tools over at Chuck Smith's website to make that easier. After parsing a DSX file, I loaded the output...
- Fri May 25, 2012 10:17 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Lookup of the hashed file fails
- Replies: 1
- Views: 1438
Hi Craig, I appreciate your assistance. I checked the data in the key columns again and noticed that even though the key values in the driver table and the lookup (hashed) file looked equal, the values in the driver table consisted of only 6 characters, while the same values in the ha...
- Thu Apr 26, 2012 8:49 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: New row in source is not loaded into target table
- Replies: 3
- Views: 3224
Yes, always better, because NOTFOUND doesn't really work for non-hashed-file lookups. :wink: Hard to tell, though - is your lookup to a hashed file or directly to a database table? Craig, Thank you as always. To answer your question, I was originally using NOTFOUND on an ODBC stage. After adding ...
- Thu Apr 26, 2012 8:45 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: New row in source is not loaded into target table
- Replies: 3
- Views: 3224
NOTFOUND works fine when performing a lookup on hashed files, but for a table lookup it is always better to use a null check. sAM Hi Sam, Thank you for your input. You are correct. NOTFOUND would not work with database tables. To resolve this issue, my co-worker and I added an intermediate Hashed ...
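The distinction the thread draws can be sketched as two Transformer constraint expressions (DataStage BASIC; the link name lnk_ref and the key column CUST_ID are hypothetical, not from the original posts):

```basic
* Hashed File lookup: the NOTFOUND link variable is reliable,
* so keep only rows where the reference row was found.
NOT(lnk_ref.NOTFOUND)

* ODBC/OCI table lookup: NOTFOUND is not dependable, so test a
* mandatory key column from the reference link for NULL instead.
NOT(IsNull(lnk_ref.CUST_ID))
```

Checking a column that can never legitimately be NULL in the reference table is what makes the second form a safe substitute for NOTFOUND.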
- Thu Feb 16, 2012 12:34 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Source Data by Key
- Replies: 4
- Views: 1949
Hi Craig,
Using your suggestion, I was able to add the following 'where' clause to the source OCI stage and modify the date parameter [RUN_DAY] by defining it as a string and setting it to '11/15/2011'.
Code: Select all
WHERE TO_CHAR(WEB_RAWDATA.LOG_DT_TM, 'MM/DD/YYYY') = #RUN_DAY#
Thank you very much,
Seyed
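One caveat worth noting: wrapping the column in TO_CHAR prevents Oracle from using an index on LOG_DT_TM. An equivalent range predicate (a sketch, assuming the RUN_DAY parameter keeps the same MM/DD/YYYY format) filters the same day while leaving the column bare:

```sql
-- Sargable alternative: compare the raw DATE column against a one-day
-- range so an index on LOG_DT_TM can still be used. #RUN_DAY# is the
-- same MM/DD/YYYY job parameter as above.
WHERE WEB_RAWDATA.LOG_DT_TM >= TO_DATE('#RUN_DAY#', 'MM/DD/YYYY')
  AND WEB_RAWDATA.LOG_DT_TM <  TO_DATE('#RUN_DAY#', 'MM/DD/YYYY') + 1
```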
- Wed Feb 15, 2012 3:29 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Source Data by Key
- Replies: 4
- Views: 1949
- Wed Feb 15, 2012 8:52 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Source Data by Key
- Replies: 4
- Views: 1949
Reading Source Data by Key
Hi All, I have a Server job that reads data from one Oracle table and loads another Oracle table. To do that I am using two Oracle OCI stages. I want to extract 11/15/2011 data from the source table. I added a constraint to filter the data. The problem is that when I run this job, the 11/14/2011 data is ...
- Wed Jan 11, 2012 7:13 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Adding table partition from DS server job
- Replies: 4
- Views: 2222
- Mon Jan 09, 2012 3:40 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Adding table partition from DS server job
- Replies: 4
- Views: 2222
Adding table partition from DS server job
Hi all, We have a number of Server jobs that load data into Oracle tables. I plan on partitioning these tables by month so that partitions older than 3 months can be archived and then dropped. Does anyone know of a way to add partitions to Oracle tables from DataStage Server jobs? At the moment...
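The DDL involved is plain ALTER TABLE statements, which a Server job could issue through a before-job routine or a database stage running user-defined SQL. A sketch with hypothetical table and partition names (sales_fact, p_2012_01, p_2011_10 are illustrations, not from the original post):

```sql
-- Add next month's range partition before the load runs.
ALTER TABLE sales_fact
  ADD PARTITION p_2012_01
  VALUES LESS THAN (TO_DATE('2012-02-01', 'YYYY-MM-DD'));

-- After archiving, drop a partition older than three months.
ALTER TABLE sales_fact DROP PARTITION p_2011_10;
```

Generating the partition name and boundary date from a job parameter would let the same job run month after month without manual edits.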
- Wed Nov 02, 2011 8:31 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Mainframe flat file to populate Oracle table
- Replies: 7
- Views: 2924
- Mon Oct 24, 2011 1:29 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Mainframe flat file to populate Oracle table
- Replies: 7
- Views: 2924
- Mon Oct 24, 2011 1:15 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Mainframe flat file to populate Oracle table
- Replies: 7
- Views: 2924
Hi Craig, Thank you for your input. I tried using the Complex Flat File stage, but I don't see it as an option. Therefore, I tried an FTP stage. Now, I am getting the following error message: Load_Wrkr_Loc_test..FTP_Plug_in_0: Connection to the remote host's FTP server failed, socket call 'connect' fai...
- Mon Oct 24, 2011 8:11 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Reading Mainframe flat file to populate Oracle table
- Replies: 7
- Views: 2924
Reading Mainframe flat file to populate Oracle table
Hi all,
I have a requirement to build a server job to pull columns from a flat file on the Mainframe and then use them to populate an Oracle table. What type of stage should I use for my Mainframe flat file?
Thank you,
Seyed
- Thu Aug 04, 2011 9:00 am
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Extracting Long Month Name from Date
- Replies: 24
- Views: 14021