BULK: coredump with Replace duplicate handling and large record files
- Product: Voyager
- Product Version: 9.1.1, 9.2.0
- Relevant for Installation Type: Dedicated-Direct, Direct, Local, Total Care
Importing records using Replace duplicate handling, where either
- the import contains 15,000 or more records; or
- multiple simultaneous imports of 10,000 records each are run,
results in a coredump with a message such as: /tmp/Pdobulkimport.20151230.1007.19380: line 2: 19392: Abort(coredump)
Issue VYG-6902 is currently in Development.
- System Administration > Cataloging > Bibliographic Duplicate Detection > find or create a duplicate detection profile set to Replace records.
- System Administration > Cataloging > Bulk Import Rules > create a rule using the duplicate detection profile from step 1 and a Bib/Auth only profile.
- Import the attached file using the rule from step 2.
- The import will coredump before completing.
- Alternatively, import the attached file broken into multiple simultaneous imports of 10,000 records each (using the -b and -e parameters, and -M to allow multiple simultaneous runs).
- Some imports will complete successfully; others will coredump.
Restart imports from the record where the coredump occurred.
Note: Support recommends importing 10,000 or fewer records per process. Adhering to this recommendation should also help work around this issue. See: How many records can be imported at a time using bulkimport?
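As a rough sketch, the batching approach described above (splitting a large file into ranges of at most 10,000 records with -b and -e) could be driven by a shell loop like the following. The file name, import code, and total record count are placeholder values, and only the -b, -e, and -M parameters come from this article; confirm the remaining bulkimport options against your server's documentation before use.

```shell
# Sketch: split one large record file into sequential ranges of at
# most 10,000 records, printing one bulkimport command per range.
# TOTAL, BATCH, bigfile.mrc, and REPLACE are placeholder values.
TOTAL=45000   # records in the file (example value)
BATCH=10000   # Support-recommended maximum per process
b=1
while [ "$b" -le "$TOTAL" ]; do
  e=$(( b + BATCH - 1 ))
  if [ "$e" -gt "$TOTAL" ]; then e="$TOTAL"; fi
  # -b/-e bound the record range; -M permits multiple simultaneous
  # runs (note that simultaneous runs are what trigger this bug).
  echo "bulkimport -f bigfile.mrc -i REPLACE -b $b -e $e -M"
  b=$(( e + 1 ))
done
```

Running the printed commands one at a time (rather than concurrently) keeps each process at or under the 10,000-record recommendation while avoiding the simultaneous-import failure mode described above.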
- Article last edited: 22-Jan-2019