- Product Version: 7.1.0
- Relevant for Installation Type: Total Care; Dedicated-Direct; Direct; Local
What is the best way to import a large set of records? For example, the set on hand contains 80,000 bibliographic records.
Consider opening a Case with Support before embarking on a large import project. We can offer additional advice about Oracle, tablespace sizing, and other topics that can help ensure your project is a success.
For a large set of records, it is best to break the file into smaller batches, skip keyword indexing during import, and run a keyword regen after the imports are complete.
- Import a maximum of 10,000 records in one BulkImport process, and run only one process at a time.
- Break up the file by using the -b (begin record) and -e (end record) parameters:
Pbulkimport -f[filepath and name] -i[rule code] -b1 -e10000
- Skip keyword indexing by using the -XNOKEY parameter:
Pbulkimport -f[filepath and name] -i[rule code] -b1 -e10000 -XNOKEY
- Schedule a regen:
- Open an incident with Voyager Support to request regen; OR
- Run regen manually if a staff member is 300 Certified; OR
- Run regen via UTIL menu if on Voyager 8.1.0 or higher.
For Voyager 9.0.0 and higher, the default is to skip keyword indexing; you do not need to specify -XNOKEY.
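The batching described above can be sketched as a small shell loop that steps through the file 10,000 records at a time. This is a hedged illustration only: the file path and import rule code below are placeholders, and the loop prints each Pbulkimport command rather than executing it, since each batch should be run and verified one at a time.

```shell
#!/bin/sh
# Sketch: split an 80,000-record file into 10,000-record Pbulkimport batches.
# Placeholders (replace with your own values):
#   FILE - path to the MARC file to import
#   RULE - bulk import rule code
TOTAL=80000
BATCH=10000
FILE=/m1/voyager/xxxdb/local/import.mrc   # placeholder path
RULE=BULK                                 # placeholder rule code

count=0
b=1
while [ "$b" -le "$TOTAL" ]; do
  e=$((b + BATCH - 1))
  # Print the command for this batch; -XNOKEY skips keyword indexing.
  echo "Pbulkimport -f$FILE -i$RULE -b$b -e$e -XNOKEY"
  count=$((count + 1))
  b=$((e + 1))
done
```

Run as written, this prints eight commands, from -b1 -e10000 through -b70001 -e80000; on Voyager 9.0.0 and higher the -XNOKEY parameter can be dropped, since skipping keyword indexing is the default.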
- Article last edited: 18-Sep-2019