Datastage upgrade from 7.5 to 8.5
Hi,
Currently we are using DataStage version 7.5.1. All the current DS jobs are server jobs. There is a proposal to upgrade DS to 8.5. I have a few doubts regarding the upgrade:
1. Will all the jobs and routines in the current version be compatible with the new version?
2. Before the upgrade I will take a DS export of all the job designs, executables, routines and table definitions. After the upgrade I will import this dsx file into the new DS version and test all the jobs once. Is this the only process that needs to be done for transferring the DS components across different versions?
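For the export/import step, DataStage also ships client-side command-line tools (dscmdexport / dscmdimport) that let you script the transfer instead of driving the Manager GUI by hand. A minimal sketch, assuming those tools are on the client PATH; the `/H= /U= /P=` option syntax and all host, user and project names below are illustrative assumptions, so check the client documentation for your release before running anything:

```python
# Sketch: build the command lines for DataStage's client CLI tools.
# dscmdexport / dscmdimport exist as client utilities, but the exact
# option syntax (/H=, /U=, /P=) and every name used here are
# assumptions for illustration only.

def build_export_cmd(host, user, password, project, dsx_path):
    """Command to export a whole project to a .dsx file."""
    return ["dscmdexport",
            f"/H={host}", f"/U={user}", f"/P={password}",
            project, dsx_path]

def build_import_cmd(host, user, password, project, dsx_path):
    """Command to import a .dsx file into a project on the new server."""
    return ["dscmdimport",
            f"/H={host}", f"/U={user}", f"/P={password}",
            project, dsx_path]

if __name__ == "__main__":
    cmd = build_export_cmd("ds75host", "dsadm", "secret",
                           "PROJ75", r"C:\backup\PROJ75.dsx")
    print(" ".join(cmd))
```

The idea is to run the export against the 7.5 server before the upgrade, keep the .dsx safely versioned, then import it into the 8.5 project and recompile there.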
Re: Datastage upgrade from 7.5 to 8.5
1. All the jobs and routines are compatible with the new version except some of the Oracle OCI stages. (The Oracle OCI 8 stage is not supported in the new version; it needs to be replaced with the OCI 9 stage.)
2. If they are server jobs, make sure you have populated the required hashed files and checked the permissions for routines.
1. Pretty sure that happened before 7.5.1 so probably a non-issue. Short answer, yes - compatible.
2. As noted, exports and backups are always good to have. Know where all your 'extra' bits are - scripts, reference files, etc - and what 'optional' stages you have installed so they are there on the new server.
-craig
"You can never have too many knives" -- Logan Nine Fingers
DS 8.5 in Production
Hi Ramkumar,
Were you able to perform this upgrade in the production environment? What was the OS? 64 bit or 32 bit?
Can you please provide me the infrastructure baseline?
How long has 8.5 been in production and have you experienced production issues?
Animesh
Hi,
We have installed DataStage 8.5 in the development environment and imported some test jobs from 7.5 to 8.5. But all these jobs are in 'not compiled' status, whereas they were in 'finished' status in 7.5. Is this due to the DS components moving across different versions? If that is the situation, I may have to recompile all these jobs in the new version, which is not practical. Please advise.
Hi,
I am using the ORAOCI9 stage to access the Oracle database in DS 7.5. In one of the jobs, I have the extraction query:
SELECT TO_CHAR(sysdate, 'YYYY-MM-DD HH24:MI:SS') FROM dual. This is the auto-generated query in the Oracle stage when the column name is defined as sysdate, the data type as Timestamp and the table name as dual.
When I use the same job in DS 8.5, it shows the error 'date format not recognized'. Is this due to a setup issue during the DS 8.5 server installation, or does DS 8.5 not support this conversion?
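One thing worth checking is the format mismatch itself: TO_CHAR returns a string, so if the column is defined as Timestamp, the stage has to parse that string back into a timestamp, and that only succeeds when the string matches the format the stage expects. A small Python sketch of the failure mode; the Oracle mask 'YYYY-MM-DD HH24:MI:SS' corresponds to strptime's '%Y-%m-%d %H:%M:%S', and the alternate DD-MON-YY mask is just an assumed example of a non-matching format:

```python
from datetime import datetime

# The Oracle mask 'YYYY-MM-DD HH24:MI:SS' produces strings like this:
oracle_style = "2011-01-03 12:09:00"

# Parsing succeeds when the reader's expected format matches the string...
fmt = "%Y-%m-%d %H:%M:%S"
ts = datetime.strptime(oracle_style, fmt)
print(ts.year)  # 2011

# ...but fails (the equivalent of 'date format not recognized') when the
# reader expects a different mask, e.g. a DD-MON-YY style:
try:
    datetime.strptime(oracle_style, "%d-%b-%y")
except ValueError as exc:
    print("date format not recognized:", exc)
```

If the 8.5 stage is rejecting the value, comparing the column's expected timestamp format (or defining the column as a plain string) is a cheap first diagnostic.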
sanoojk wrote: But I don't have the job designs in UAT and Production environments. Only the job executables are there. So how can I compile the jobs in those environments without any code design information?
Simply put, you can't. ppgoml's comments are correct on this. DEV source code may be your best solution if QAT or Prod source is unavailable at all. Worst case, source code can be semi-reconstructed from the executables (osh), but only with considerable additional time, effort, skill and cost. It's not for the faint-of-heart or inexperienced DS developers.
Typically, any migration between major versions REQUIRES a recompilation of all jobs due to changes in the engine and environment and to do so you MUST have the source code. The IS 8.5 documentation contains instructions for migrating from previous versions of DataStage and IS. The instructions regarding migration from 7.5 explicitly state that jobs must be recompiled in IS 8.5. See the IS 8.5 Information Center at http://publib.boulder.ibm.com/infocente ... /index.jsp. Specifically: Installing->Migrating to InfoSphere Information Server, Version 8.5->Migrations from versions 7.5 and earlier
Although in your current production and UAT environments you only import the job executables (a fairly common practice), best-practice methodology should always maintain a copy of the source that generated those executable jobs, whether or not you actually import the source into the environment. That may have been outside your control in your situation, though.
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.