- Article Type: General
- Product: Aleph
- Product Version: 18.01
To have sufficient space to export Oracle data using the Data Pump / v19-20 upgrade utility, we established a shared mount directory between our current v18 production server and our new v20 box.
The Upgrade Express utility seems to be hard-coded to export all u-tree and Oracle data to the upgrade kit's root directory, which the instructions say to install in the /exlibris/aleph directory. We do *not* have sufficient space to export our full Oracle database in the general file system.
Is there a configuration option to tell Upgrade Express where to dump export files? We also want to bypass the "FTP to target server" stage of the process, since we have a SAN disk mounted between both servers.
As a workaround, we are in the process of configuring our mount to use the path /exlibris/aleph/upgrade_express_1901_2001 on both servers, copying the Upgrade Express tar contents to this mount, and running it *from* the SAN mount. Our only concern is the two instances of Aleph sharing the same instance of Upgrade Express and its related log and status files. (We have a "two-task" set-up, with Oracle on a separate server.)
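As a rough sketch of the workaround described above (the NFS server name, export path, and mount options here are assumptions for illustration, not taken from the actual setup), the shared disk might be configured like this on both servers:

```shell
# Hypothetical example only -- substitute your real SAN/NFS export.
# Run as root on BOTH the v18 and v20 servers; per the two-task
# note below, the database server will likely need it as well.

mkdir -p /exlibris/aleph/upgrade_express_1901_2001

# /etc/fstab entry so the mount survives a reboot
# (server and export path are placeholders):
#   sanfiler:/vol/aleph_upgrade  /exlibris/aleph/upgrade_express_1901_2001  nfs  rw,hard  0 0

mount /exlibris/aleph/upgrade_express_1901_2001

# Unpack the Upgrade Express kit directly onto the shared disk
# and run it from there, so the FTP transfer step is unnecessary:
cd /exlibris/aleph/upgrade_express_1901_2001
tar -xf /path/to/upgrade_express_kit.tar
```

Since the export files are written to and read from the same shared directory, both the dump and the import see identical paths, which is what makes skipping the FTP stage safe.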
Is there another way to use a shared disk between two servers to mitigate space constraints and bypass the FTP step of the upgrade process?
It seems to me that having the two instances of Aleph share the same instance of Upgrade Express is *not* a problem -- Upgrade Express is sensitive to the instance it is operating in and will produce a different upgrade_util menu and write different files depending on which one it is in. You just need to be certain to connect to the proper instance. (You can easily confirm which instance you are in, since each shows a different Unix prompt.)
James Mitchell, on the U.S. staff, says this: "That's totally fine. There's no reason not to share that copy of Upgrade Express between the exporting and importing instances."
James adds: "One more thought: that filesystem will probably have to be shared with the database server too - I'm betting that the Data Pump scripts are written with only a single-server configuration in mind."
- Article last edited: 10/8/2013