  • Ex Libris Knowledge Center

    Using Oracle Data Pump on live production server

    • Article Type: General
    • Product: Aleph
    • Product Version: 18.01

    We may be being overcautious here, but we wanted to confirm with you that we can indeed initiate the Oracle Data Pump via Upgrade Express on our live production server with no ill effects to Aleph.

    In other words, can Aleph and Oracle both be up on production when we launch this script?

    As described in the Oracle Data Pump documentation:
    "Data Pump technology fully uses all available resources to maximize throughput and minimize elapsed job time.... The Data Pump Export and Import utilities enable you to dynamically increase and decrease resource consumption for each job. This is done using the PARALLEL parameter to specify a degree of parallelism for the job."

    We recommend that Aleph (the www_server, pc_server, etc.) be down while you are extracting the data. If you are importing the data on a different server, the import load is not an issue for the production machine. If you are importing on the same server, you may want to minimize the parallelism.
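    As an illustration, parallelism can be limited through a Data Pump parameter file. This is a hedged sketch only: the directory object, schema name, and file names below are hypothetical placeholders, not Aleph-supplied values; adjust them to your own environment.

    ```
    # aleph_export.par -- hypothetical Data Pump Export parameter file.
    # PARALLEL=1 keeps resource consumption low on a busy server.
    DIRECTORY=ALEPH_DUMP_DIR
    SCHEMAS=aleph22
    DUMPFILE=aleph_%U.dmp
    LOGFILE=aleph_export.log
    PARALLEL=1
    ```

    The export would then be launched with, for example, `expdp system PARFILE=aleph_export.par`. The PARALLEL value can also be changed interactively while a job is running, via the Data Pump interactive-command mode.
    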

    My experience has been that Data Pump is a very heavy user of system resources. It won't prevent other work from running, but it will slow it down. On the other hand, it runs very quickly: you should be able to export your entire database in one night. One site reported: "For a ~400GB DB the dpdump export takes for us 1 hour and 40 min."

    Additional Information


    • Article last edited: 10/8/2013