Not to ask a stupid question, but are you sure you're telnetted to the right server? If you are, then look to see whether DataStage is installed: "ps -ef | grep dsrpc" should show the daemon running. If you don't see it, you're probably on the wrong server. If you do see it, then the .dshome file shows where the ...
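A minimal shell sketch of that check; the message wording is mine, not anything DataStage prints:

```shell
#!/bin/sh
# Look for the DataStage RPC daemon in the process list. The [d] trick
# keeps grep from matching its own command line.
check_dsrpc() {
    if ps -ef | grep '[d]srpc' >/dev/null 2>&1; then
        echo "dsrpc daemon found - DataStage appears to be installed here"
    else
        echo "no dsrpc daemon - you may be on the wrong server"
    fi
}
check_dsrpc
```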
Ray, do you know if it is possible to hand-write an OSH script as if a GUI didn't exist? Existing Orchestrate customers should have some backwards compatibility as of release 7.5, I hope.
Write a DS Basic Routine to do what you need and call it as an after-job routine. Since Routines are difficult to develop (no Test button), I often use a Batch job to get the logic written and tested, then paste it into a Routine and adjust. Here's some code off the top of my head: InputArg is the t...
You can always use an after-job/transformer routine to build the custom filename and then call the DSExecute API to rename the file once it's built. You could consider the macro that gives you a datetime and add that as part of the filename (not exactly what you asked), kind of like a parameter. You could wri...
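As a sketch of the rename step only, here's what an after-job ExecSH command might do; the filename target.dat and the timestamp format are assumptions for illustration:

```shell
#!/bin/sh
# Pretend the job has just written target.dat; rename it so the name
# carries the run's datetime, similar to what the datetime macro gives you.
OUT=target.dat
touch "$OUT"                          # stand-in for the job's output file
STAMP=$(date +%Y%m%d_%H%M%S)
mv "$OUT" "${OUT%.dat}_${STAMP}.dat"
ls "${OUT%.dat}"_*.dat
```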
Welcome aboard. Please post in the correct forum. By doing so, you will be prompted to enter the information necessary for us to answer your questions. What version and release of DataStage? Are you attempting to use the dsjob command-line program to start jobs? Can you please post...
Jobs that don't have auto-purge enabled have to be set manually, or you can mass-apply the purge settings with a utility program, such as the one I supply on my website.
The "theory" is that all you need to do is "adjust" the DRS stage to Oracle via the drop-down on the stage. The "practice" is that you have to re-write any custom SQL in Oracle syntax. You should also seriously consider re-importing all metadata using the Oracle plugin so that the correct data types...
Welcome aboard. You could pivot, or you could sort and use stage variables to collapse the multiple rows into their appropriate columns, then pass them through an Aggregator to output the last version of each row.
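Outside DataStage, the same sort-and-collapse logic can be sketched in awk; the key,attribute,value layout and the sample data are invented for illustration:

```shell
#!/bin/sh
# Sorted input: one row per (key, attribute). Collapse each key's rows
# into a single output row, emitting it when the key changes - the same
# pattern as stage variables feeding the last row per group downstream.
cat > rows.txt <<'EOF'
101,name,Alice
101,city,Oslo
102,name,Bob
102,city,Lima
EOF
awk -F, '
  $1 != key { if (key != "") print key "," row; key = $1; row = $3; next }
  { row = row "," $3 }
  END { if (key != "") print key "," row }
' rows.txt > collapsed.txt
cat collapsed.txt
```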
You cannot insert a NULL value into a key column of a Hashed File. This error points to the 880th record; check whether that row from the source contains a NULL in the key column. Just a slight correction: it's not the 880th record. The message "Write failed for record id ' 880'" mea...
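A quick way to pre-screen the source for empty key values, sketched with awk; the file name, comma delimiter, and sample rows are assumptions:

```shell
#!/bin/sh
# Flag any source row whose first (key) field is empty - such a row
# cannot be written to a Hashed File.
cat > src.txt <<'EOF'
880,good row
,row with a missing key
EOF
awk -F, '$1 == "" { print NR ": empty key" }' src.txt > badkeys.txt
cat badkeys.txt
```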
"Also, I would like to know how I can capture the last row of the fixed-width file (the trailer line) in a separate file." Since you're reading a multi-record-type file, your Sequential File stage should read the entire row as a single column. Have a Transformer stage split the output into three links. Sen...
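The three-way split can be mocked up outside DataStage to see the shape of it; the H/D/T record layout here is invented:

```shell
#!/bin/sh
# First line -> header file, last line (the trailer) -> its own file,
# everything in between -> detail file.
cat > fixed.txt <<'EOF'
H20240101
D001ABC
D002DEF
T000002
EOF
head -n 1 fixed.txt                 > header.txt
tail -n 1 fixed.txt                 > trailer.txt
sed -n '2,$p' fixed.txt | sed '$d'  > detail.txt
cat trailer.txt
```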