You don't need to change anything in the operating system.
You DO need to get the job that populates the hashed file working properly. It's THERE that the mapping must be set properly. What is the source for this job (file or table)?
Has your UNIX administrator enabled large file support, or is file size on your UNIX system still limited to 2GB? Also, please check the ulimit settings for the user that actually executes the job.
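To check the second point quickly, you can run something like the following as the user that executes the job (a sketch; the exact `ulimit -a` wording varies by UNIX flavor):

```shell
# Per-process maximum file size for the current user, in 512-byte blocks.
# "unlimited", or a value above 4194304 blocks (= 2 GB), means the 2GB
# limit is not in force for this user.
ulimit -f

# Show all limits (file size, open files, data seg size, ...) for completeness.
ulimit -a
```

Note that large file support is a separate question from ulimit: the filesystem itself must also have been created with large files enabled, which only your administrator can confirm.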
Following is the code I am using:

If IsNull(DSLink3.FIRSTNAME) Then ',' Else Trim(DSLink3.FIRSTNAME) : ',' : If IsNull(DSLink3.LASTNAME) Then ',' Else Trim(DSLink3.LASTNAME)

I am not getting any warning or error; the job completes successfully. I am handling nulls as well. Only two columns are me...
If you can't find the file and can't view the data, what precisely do you mean by "finished successfully"? What are the row counts reported by the client tools?
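One way to get those row counts without the client tools is the `dsjob` command on the DataStage server; a sketch, where `myproj` and `myjob` are placeholder names for your project and job:

```shell
# dsjob -report prints a run report for the last invocation of the job,
# including per-link row counts. Guard the call so the snippet degrades
# gracefully on a machine without the DataStage CLI installed.
if command -v dsjob >/dev/null 2>&1; then
    dsjob -report myproj myjob BASIC
else
    echo "dsjob not found - run this on the DataStage server host"
fi
```

If the target link reports zero rows, the job "finishing successfully" only means it ran to completion, not that it moved any data.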
Welcome aboard. It's certainly possible in version 8.0.1, where there is a separate stage type for Federation Server (though I suspect, from its properties, that it uses an ODBC driver, or at least ODBC protocols, under the covers). I am not currently in a position to verify or otherwise whether thi...
Whenever you are reading from or writing to a file, you need to use the two environment variables called APT_GRID_SEQFILE_HOST and APT_GRID_SEQFILE_HOST2, which run on the conductor node; so use APT_GRID_SEQFILE_HOST2 in the file name, before the file path name. That's only true if executing in a gri...