Certain bib records -- with a tag near 2000 bytes -- crash Cataloging client
- Article Type: General
- Product: Aleph
- Product Version: 18.01
Description:
We've discovered a bibliographic record which consistently crashes the Cataloging client.
We can view it in the web OPAC and in GUI search, and none of the services that normally detect too-long fields report any problem with the record. But attempts to open the record in the Cataloging module cause the Cataloging client to terminate.
The record has a 505 contents note close to 2000 characters long. It appears the record has not been touched since our original conversion to ALEPH.
As a test we:
exported the record in MARC format using p_print_03;
used p_file_01 and p_file_02 to convert to ALEPH Sequential, then
used p_manage_18 to replace the record in the database.
The reloaded record still caused the Cataloging client to crash.
p_file_02 normally reports and splits over-long 505 fields. In this case, it does not appear to be calculating correctly how long is too long.
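One way such a check can miscount -- offered only as an illustration, not as the confirmed cause -- is by measuring the field in characters while the underlying limit is on encoded bytes: a note slightly under 2000 characters can exceed 2000 bytes once diacritics are encoded. The small Python sketch below shows the difference; the note text, the 2000 limit, and the UTF-8 encoding are assumptions made for the example, not Aleph code.

# Illustration only (not Aleph code): the string, the limit, and the
# UTF-8 encoding are assumptions made for this example.
note = "Pie\u0300ce " * 285           # a contents note of 1995 characters
print(len(note))                       # 1995 -- passes a 2000-character check
print(len(note.encode("utf-8")))       # 2280 -- exceeds a 2000-byte limit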
Resolution:
This issue has been passed to Development. We hope to have a solution soon. We will keep you posted.
There are two workarounds:
1. Similar to the test above:
use p_print_03 to export the record in ALEPH Sequential format;
use "vi" to edit the record, breaking the long field into two shorter ones;
re-import the record with p_manage_18.
2. Edit the record in OCLC, breaking the long field into two shorter ones; export the record to ALEPH so it overlays the problem record.
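If editing the long field by hand in "vi" is awkward, the same split can be scripted. The sketch below is a rough Python illustration, not part of Aleph: it assumes the usual ALEPH Sequential line layout (9-digit system number, a space, 3-character tag plus 2 indicators, " L ", then the field text) and uses a made-up 2000-byte limit. Verify both against your installation, and review the output -- including subfield coding on any continuation field -- before loading it with p_manage_18.

#!/usr/bin/env python3
"""Split over-long 505 fields in an ALEPH Sequential file.

Usage: python split_505.py input.seq output.seq

Rough sketch only. Assumes the usual ALEPH Sequential layout:
columns 1-9 system number, column 10 blank, columns 11-15 tag plus
indicators, then " L " and the field text. The 2000-byte limit and
the choice of the 505 tag are illustrative; adjust to your site.
"""

import sys

MAX_BYTES = 2000     # assumed per-field limit, in encoded bytes
TARGET_TAG = "505"   # field to split when it is too long


def split_text(text, limit):
    """Break text at spaces into pieces of at most `limit` bytes in UTF-8."""
    pieces, current = [], ""
    for word in text.split(" "):
        candidate = f"{current} {word}" if current else word
        if current and len(candidate.encode("utf-8")) > limit:
            pieces.append(current)
            current = word
        else:
            current = candidate
    if current:
        pieces.append(current)
    return pieces


def process(in_path, out_path):
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for raw in src:
            line = raw.rstrip("\n")
            sysno, tag_ind, text = line[0:9], line[10:15], line[18:]
            if tag_ind.startswith(TARGET_TAG) and \
                    len(text.encode("utf-8")) > MAX_BYTES:
                # Write the long field as two (or more) shorter 505 fields.
                # Continuation pieces may still need their own subfield
                # codes (e.g. $$a); check the output before loading.
                for piece in split_text(text, MAX_BYTES // 2):
                    dst.write(f"{sysno} {tag_ind} L {piece}\n")
            else:
                dst.write(line + "\n")


if __name__ == "__main__":
    process(sys.argv[1], sys.argv[2])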
- Article last edited: 10/8/2013