Ex Libris Knowledge Center

    Fixing large numbers of item records

    • Article Type: General
    • Product: Aleph
    • Product Version: 19.01

    Description:
    When our serial item data was converted to Aleph in 2002, our z30_chronological_i data was erroneously placed in z30_enumeration_b (when no other enum_b information was present). We are looking into creating summary holdings for export to OCLC and think this might be a good time to clean up this data. Around 800,000 item records are involved.

    Would some kind of SQL update be a ‘safe’ way to change the data, especially when it involves this many records? Any suggestions for an update, or for another batch method to move this data, would be greatly appreciated.

    Resolution:
    p_manage_62 can be run to perform certain item updates, but it does not handle this field.

    You could use SQL to copy z30_enumeration_b to z30_chronological_i (or whichever field is appropriate) in the cases where this applies, and then run the following SQL to clear the source field:

    update z30 set z30_enumeration_b = null where ...;
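As a sketch only: assuming the rows to fix are those where z30_enumeration_b is populated and z30_chronological_i is empty (your actual where-clause may differ, so verify the conditions against your data first), the two steps might look like this:

```sql
-- Hypothetical example; test against a copy of the data first.
-- Step 1: copy the misplaced chronology into z30_chronological_i.
update z30
   set z30_chronological_i = z30_enumeration_b
 where z30_enumeration_b is not null
   and z30_chronological_i is null;

-- Step 2: clear the source field for the rows just copied.
update z30
   set z30_enumeration_b = null
 where z30_enumeration_b = z30_chronological_i;
```

Run a select with the same where-clause first to confirm the row count matches your expectations before issuing the updates.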

    You should back up the z30 table with p_file_03 before doing any such update.

    If you can do it in one fell swoop, that's fine. But with this many records you may find that the SQL fails with a rollback error. If so, you will need to do it in smaller batches, committing after each one.
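If a single statement exhausts the rollback segment, one common Oracle pattern is to update a limited number of rows at a time and commit between batches, repeating until no rows remain. A hypothetical sketch (the where-clause and batch size must be adjusted to your data and environment):

```sql
-- Repeat this pair of statements until "0 rows updated" is reported.
update z30
   set z30_chronological_i = z30_enumeration_b,
       z30_enumeration_b   = null
 where z30_enumeration_b is not null
   and z30_chronological_i is null
   and rownum <= 50000;   -- batch size; tune to your rollback capacity
commit;
```

Because the where-clause excludes rows that have already been fixed, each pass picks up the next batch automatically.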


    • Article last edited: 10/8/2013