- Article Type: Q&A
- Product: Aleph
- Product Version: 20
Is there any limit on the number of records that can be loaded by p_manage_18 in one batch? That is, is there a need to break the input file into multiple batches?
There is no fixed limit on the number of records that can be loaded via manage-18, but the resulting ue_01 (indexing daemon) activity can slow the system down. As described in Article 000044270 ( ue_01 slow (after load of "ebrary" records) -- link below), this can be a particular problem with e-resource bibliographic records. (This, of course, does not apply to authority records.)
If you are specifying "Full" indexing in the manage-18 submission, I suggest loading the records in groups over several nights or a weekend -- definitely if they are e-resource records.
But if (as in the case of a complete authority library reload) you are specifying "None II" for indexing and are planning to run the batch indexing jobs afterward, then manage-18 can be run for all of the records at once.
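If you do decide to break the input file into multiple batches, note that a naive line-count split can cut a record in half, since a record in Aleph sequential format spans several lines that share the same leading 9-digit document number. A minimal sketch of a boundary-aware split, assuming Aleph sequential input and hypothetical file names (`input.seq`, `batch_NN.seq`):

```shell
# Split an Aleph sequential file into chunks of at most $max records,
# starting a new chunk only at a record boundary. A record boundary is
# detected when the 9-digit doc number in columns 1-9 changes.
awk -v max=50000 '
  substr($0, 1, 9) != prev {            # a new record begins here
    prev = substr($0, 1, 9)
    if (++count > max) {                # chunk is full: roll over
      count = 1
      chunk++
    }
  }
  { print > sprintf("batch_%02d.seq", chunk) }   # write line to current chunk
' input.seq
```

Each resulting `batch_NN.seq` file can then be submitted to manage-18 on a separate night, per the advice above.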
Article link: ue_01 slow (after load of "ebrary" records)
Category: Background processing (500)
- Article last edited: 2/5/2015