I think this is a good case for a partial server job solution. Read the input as a single column and call up a BASIC routine like this:

EQUATE Comma TO ','
EQUATE LF TO CHAR(10)
Ans = InString ;** assign the input value to the output string
CommaCount = DCOUNT(Ans,Comma)
IF MOD(C...
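For readers more comfortable outside BASIC, the field-counting step above can be sketched in Python (purely illustrative; the function name is mine, and DCOUNT's empty-string behaviour is mimicked explicitly):

```python
def dcount(value: str, delim: str = ",") -> int:
    """Mimic BASIC's DCOUNT(): the number of delimited fields in a string.

    DCOUNT returns 0 for an empty string, otherwise the number of
    delimiters plus one.
    """
    if value == "":
        return 0
    return value.count(delim) + 1
```

So dcount("a,b,c") is 3, just as DCOUNT('a,b,c', ',') is in BASIC.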
There is no direct function. If you only need a one-way method to encrypt a value, you can use the CRC32 function pretty effectively; but if you need to decrypt the encoded value back to the original, then you need to use another method. Mathematically there are a number of well-known, documented and ea...
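As a sketch of the one-way approach (keep in mind CRC32 is a checksum rather than real encryption, so treat it as obfuscation, not security - distinct inputs can collide), Python's standard zlib module exposes the same function:

```python
import zlib

def one_way_code(value: str) -> int:
    """Return a 32-bit CRC of the value; the original string cannot
    be recovered from the result."""
    return zlib.crc32(value.encode("utf-8")) & 0xFFFFFFFF
```

The same input always yields the same code, which is what makes it usable as a one-way lookup or comparison value.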
Could you be referring to a "Black Hole"? Either way, the term isn't used in DataStage and really doesn't have much meaning unless you explain what you are looking for in a bit more detail.
The scheduling mechanism in the Director is just a front end for the system's scheduler - AT on Windows and CRON on UNIX.
You can either schedule the job to run each Sunday in the Director (four mouse clicks in total) or do it directly in CRON.
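If you take the CRON route, the crontab entry would look something like this (the dsjob path, project and job names here are placeholders - adjust them for your install):

```
# Run the job every Sunday at 02:00 (day-of-week 0 = Sunday)
0 2 * * 0 /opt/Ascential/DataStage/DSEngine/bin/dsjob -run MyProject MyJob
```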
A2love - I hate to have to post this at the end of the thread, but you stated you had an EE job, and ICONV() won't work in one unless you call up a BASIC Transformer stage.
I wish I could explain exactly what happened during your installation, but I can't. I have had that problem before and had to do a complete Windoze clean-up (removing the install, the registry entries and the disk objects), and subsequent installs went without a hitch.
Do you get additional information in the log entry for "from previous run" when you reset the job? You did state in the first entry that you can't get any jobs to run - is that right? Perhaps you could try a server job that reads a sequential file and outputs to a sequential file (or even the same j...
Since the source data is enclosed in quotes, you will need to set the quote character to "double" in "Edit Row" so that the quotes can be parsed and removed on input, and specify the data type as Decimal.
There is no easy way to separate out the Parallel and Server job workloads. Your only steering mechanism is the parallel configuration file (pointed to by APT_CONFIG_FILE) and how many nodes you choose to use in the PX jobs. Note that PX jobs inherently get more system resources than server jobs.
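For context, a minimal one-node configuration file looks roughly like this - the node name, hostname and paths are placeholders:

```
{
    node "node1"
    {
        fastname "etl_host"
        pools ""
        resource disk "/data/pxdata" {pools ""}
        resource scratchdisk "/data/pxscratch" {pools ""}
    }
}
```

Adding more node blocks increases the degree of parallelism, and with it the share of the machine a PX job will take, so trimming nodes is about the only lever you have here.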
So the Hashed file lookup isn't your bottleneck. It sounds like your machine is being overloaded when this job is running. Have you or your system administrator checked to see what is causing the machine to slow down? If you are running both DB2 and DataStage on the same server I can guess why you are g...
I didn't see your 3rd post until I replied. I'm still a bit confused. If you removed the hashed file lookup from your job and ran it, you would see the speed with which you can read from DB2. How many rows a second is that? If you then add the lookup, what rows/second does that change to? Are you pre-l...
DSbox61, I don't see a design difference between your original post and the correction; it seems like you aren't using the hashed file for reference, just loading it. Why are you re-loading the hashed file completely each run? It doesn't look like performance of hashed files is the issue, but that t...