Ex Libris Knowledge Center

    How should I BulkImport a large set of records?

    • Product Version: 7.1.0
    • Relevant for Installation Type: Total Care; Dedicated-Direct; Direct; Local

    Question

    What is the best way to import a large set of records? The set on hand contains 80,000 bibs.

    Answer

    Consider opening a case with Support before embarking on a large bulk import project. Support can offer additional advice about Oracle, tablespace, and other topics that help ensure your project is a success.

    For a large set of records, it's best to split the file into smaller batches, skip keyword indexing during import, and run a keyword regen once all imports are complete.

    1. Import a maximum of 10,000 records per BulkImport process, and run only one process at a time.
    2. Break up the file by using the -b and -e parameters: Pbulkimport -f[filepath and name] -i[rule code] -b1 -e10000
    3. Skip keyword indexing by using the -XNOKEY parameter: Pbulkimport -f[filepath and name] -i[rule code] -b1 -e10000 -XNOKEY
    4. Schedule a regen:
      1. Open an incident with Voyager Support to request regen; OR
      2. Run regen manually if a staff member is 300 Certified; OR
      3. Run regen via UTIL menu if on Voyager 8.1.0 or higher.

    For Voyager 9.0.0 and higher, skipping keyword indexing is the default, so you do not need to specify -XNOKEY.
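
    The batching steps above can be sketched as a small shell loop. This is only an illustration: the file name import.mrc and the rule code MARC21 are hypothetical placeholders, and the loop echoes the Pbulkimport commands (with the -f, -i, -b, -e, and -XNOKEY parameters described above) rather than executing them. Remove the echo to run the batches for real, one at a time.

    ```shell
    # Sketch: generate one Pbulkimport command per 10,000-record batch
    # for an 80,000-record file. File path and import rule code are
    # placeholders -- substitute your own values.
    TOTAL=80000    # total records in the file
    BATCH=10000    # maximum records per BulkImport process
    b=1
    while [ "$b" -le "$TOTAL" ]; do
      e=$((b + BATCH - 1))                 # last record of this batch
      echo "Pbulkimport -fimport.mrc -iMARC21 -b$b -e$e -XNOKEY"
      b=$((e + 1))                         # first record of the next batch
    done
    ```

    Running the loop prints eight commands, covering records 1-10000 through 70001-80000. On Voyager 9.0.0 and higher you can drop -XNOKEY, since skipping keyword indexing is already the default.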

    Additional Information

    See the Voyager UTIL Menu document and the Voyager Technical User's Guide.

    See How many records can be imported at a time using bulkimport?


    • Article last edited: 18-Sep-2019