
    BULK: coredump with Replace duplicate handling and large record files

    • Product: Voyager
    • Product Version: 9.1.1, 9.2.0
  • Relevant for Installation Type: Dedicated-Direct, Direct, Local, Total Care

    Symptoms

    An import using Replace duplicate handling where either

    • the import contains 15,000 or more records; or
    • multiple simultaneous imports of 10,000 records each are running

    results in a coredump with a message such as: /tmp/Pdobulkimport.20151230.1007.19380: line 2: 19392: Abort(coredump)

    Defect Status

    Issue VYG-6902 is currently in Development.

    Replication Steps

    1. System Administration > Cataloging > Bibliographic Duplicate Detection > find or create a duplicate detection profile set to Replace records.
    2. System Administration > Cataloging > Bulk Import Rules > create a rule that uses the duplicate detection profile from step 1 and a Bib/Auth only profile.
    3. Import the attached file with the rule from step 2.
    4. The import will coredump before completing.
    5. Import the attached file again, breaking it into multiple simultaneous imports of 10,000 records each (using the -b and -e parameters, and -M to run multiple imports; see the sketch after this list).
    6. Some imports will complete successfully, and others will coredump.
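    The following is a minimal sketch of step 5, assuming a 30,000-record file split into three simultaneous 10,000-record ranges. The -b, -e, and -M parameters are the ones named above; the -f (input file) and -i (import rule code) option names are assumptions for illustration, so substitute your site's actual bulkimport invocation:

        # Run three 10,000-record ranges of the same file in parallel.
        # -b/-e set the begin/end record; -M allows multiple simultaneous runs.
        bulkimport -f big.mrc -i RULE01 -b 1     -e 10000 -M &
        bulkimport -f big.mrc -i RULE01 -b 10001 -e 20000 -M &
        bulkimport -f big.mrc -i RULE01 -b 20001 -e 30000 -M &
        wait    # block until all three background imports finish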

    Workaround

    Restart the import from the record where the coredump occurred.
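
    As a hedged sketch, assuming the run aborted at record 12001 and that -b with no -e resumes from that record through the end of the file (the -f and -i option names are again assumptions, as in the example above):

        # Resume the aborted import at the record where it coredumped.
        bulkimport -f big.mrc -i RULE01 -b 12001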


    Additional Information

    Note: Support recommends importing 10,000 or fewer records per process. Adhering to this recommendation should also help in working around this issue. See How many records can be imported at a time using bulkimport?


    • Article last edited: 22-Jan-2019