    Harvesting a record fails with an UncategorizedSQLException error on the P_DEDUP_VECTOR table

    The dedup vectors are limited to 4000 characters. Use the GetHeadTail transformation to trim the vector.
    • Product: Primo
    • Product Version: Primo November 2015, Primo February 2016
    • Relevant for Installation Type: Dedicated-Direct, Direct, Local

     

    Description

    1. A record is in the list of failed records for a Pipe
    2. The record has either
      1. a title over 4000 characters long,
      2. many ISBN10 or ISBN13 values whose combined length exceeds 4000 characters, or
      3. other data used for DeDup vectors that exceeds 4000 characters
    3. The error message is:
    org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; 
    uncategorized SQLException for SQL [INSERT INTO P_DEDUP_VECTOR (id, SRCID, t, c1, c2, c3, c4, c5, c6, c7, c8, c9, c10, f1, f2, f3, f4, f5, f6, f7, f8, f9, f10, f11, f12, f13, f14, f15, f16, f17, f18, f19, f20, c1_hash, c2_hash, c3_hash, PUBLISHING_WORK_ID, CUSTOMERID, INSTITUTIONID, LIBRARYID, c5_hash) VALUES (HIBERNATE_SEQUENCE.NEXTVAL, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)]; SQL state [72000]; error code [1461]; ORA-01461: can bind a LONG value only for insert into a LONG column ; 
    nested exception is java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column ,
    java.lang.RuntimeException: org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; 
    uncategorized SQLException for SQL [INSERT INTO P_DEDUP_VECTOR (id, SRCID, t, c1, c2, c3, c4, c5, c6, c7, c8, c9, c10, f1, f2, f3, f4, f5, f6, f7, f8, f9, f10, f11, f12, f13, f14, f15, f16, f17, f18, f19, f20, c1_hash, c2_hash, c3_hash, PUBLISHING_WORK_ID, CUSTOMERID, INSTITUTIONID, LIBRARYID, c5_hash) VALUES (HIBERNATE_SEQUENCE.NEXTVAL, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)]; SQL state [72000]; error code [1461]; ORA-01461: can bind a LONG value only for insert into a LONG column ; 
    nested exception is java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService$5.doInTransaction(AbstractInsertUpdateService.java:592)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService$1.doInTransaction(AbstractInsertUpdateService.java:74)
     at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:130)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService.doInNewTransaction(AbstractInsertUpdateService.java:71)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService.doOneRecordAndCommit(AbstractInsertUpdateService.java:587)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService.insertOrUpdateOneByOne(AbstractInsertUpdateService.java:534)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService.insertOrUpdateOneByOneMode(AbstractInsertUpdateService.java:471)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService.insertOrUpdateBatchWithFailOver(AbstractInsertUpdateService.java:333)
     at com.exlibris.primo.manager.service.AbstractInsertUpdateService.insertOrUpdateBatchInNewTransaction(AbstractInsertUpdateService.java:196)
     at com.exlibris.primo.manager.DedupVectorManager.insertOrUpdateBulkInNewTransaction(DedupVectorManager.java:64)
     at com.exlibris.primo.publish.platform.nep.persistence.PersistenceTask.insertDedupVectors(PersistenceTask.java:581)
     at com.exlibris.primo.publish.platform.nep.persistence.PersistenceTask.insertAndUpdateRecords(PersistenceTask.java:272)
     at com.exlibris.primo.publish.platform.nep.persistence.PersistenceTask.execute(PersistenceTask.java:114)
     at com.exlibris.primo.publish.process.interceptor.NewSpringTransactionExecutionService$1.doInTransaction(NewSpringTransactionExecutionService.java:20)
     at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:130)
     at com.exlibris.primo.publish.process.interceptor.NewSpringTransactionExecutionService.execute(NewSpringTransactionExecutionService.java:17)
     at com.exlibris.primo.publish.process.interceptor.exception.AbstractExceptionHandlerInterceptor.execute(AbstractExceptionHandlerInterceptor.java:15)
     at com.exlibris.process.impl.AbstractWrappingTaskInterceptor.execute(AbstractWrappingTaskInterceptor.java:11)
     at com.exlibris.process.impl.AbstractWrappingTaskInterceptor.execute(AbstractWrappingTaskInterceptor.java:11)
     at com.exlibris.process.impl.Abs
    

    Resolution

    This error occurs because the DeDup vector fields (e.g. dedup/f7, dedup/c2, and dedup/f3) are stored in database columns limited to 4000 characters. The solution is to trim the vector data to at most 4000 characters before it is written to the P_DEDUP_VECTOR table.
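    As a rough illustration of how a record reaches this limit (the counts below are assumed for the example, not taken from a real record): a record carrying a few hundred ISBN-13 values, each 13 characters plus a one-character separator, already overflows a 4000-character column.

      // Back-of-the-envelope sketch (not Primo code): how many ISBN-13 values it takes
      // to overflow the 4000-character limit of a DeDup vector column.
      public class VectorLengthEstimate {
          public static void main(String[] args) {
              int isbnLength = 13;  // digits in an ISBN-13
              int separator  = 1;   // one delimiter character between values
              int count      = 300; // assumed number of ISBN variants on the record
              int total = count * (isbnLength + separator);
              System.out.println(total); // 4200 -> longer than the 4000-character column
          }
      }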

    To trim a long field (e.g. title):

    This modification keeps the first 3000 characters and the last 1000 characters of the long field (a standalone sketch of this trim appears after the steps below).

    1. Go to Back Office > Ongoing Configuration Wizards > Pipe Configuration Wizard
    2. Click Edit next to the Normalization Rule Set that needs to be updated
    3. Select Dedup in the PNX Section drop down
    4. Click Edit on the row of the DeDup vector (e.g. dedup:f7)
    5. Click the Advanced button
    6. Click the + (plus) symbol located next to the last Transformation row
    7. Set the last Transformation as:
      Transformation: GetHeadTail
      Parameter: 3000@@1000
    8. Click Save then Go Back to save the changes
    9. Click Go Back to return to the list of Normalization Rule Sets
    10. Click Deploy next to the Normalization Rule Set that was just updated
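    The GetHeadTail transformation itself runs inside the Primo normalization engine, but the effect of the 3000@@1000 parameter can be pictured with a small standalone sketch. The class and method names below are hypothetical; only the 3000/1000 head-tail split comes from the rule above.

      // Illustrative sketch of the trim applied by GetHeadTail 3000@@1000:
      // keep the first 3000 and the last 1000 characters, so the result is at most 4000.
      public class GetHeadTailSketch {

          static String headTail(String value, int head, int tail) {
              if (value == null || value.length() <= head + tail) {
                  return value; // short values pass through unchanged
              }
              return value.substring(0, head)                    // first 3000 characters
                   + value.substring(value.length() - tail);     // last 1000 characters
          }

          public static void main(String[] args) {
              String longTitle = "x".repeat(5000);               // a title over 4000 characters
              System.out.println(headTail(longTitle, 3000, 1000).length()); // prints 4000
          }
      }
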
    To trim a combination of many small fields (e.g. ISBN13):

    This modification combines the data into another field, then copies the first 3000 and last 1000 characters of that field into the DeDup vector (see the sketch after these steps).

    1. Go to Back Office > Ongoing Configuration Wizards > Pipe Configuration Wizard
    2. Click Edit next to the Normalization Rule Set that needs to be updated
    3. Select an unused local field (e.g. display:lds50, addata:lad25)
    4. Copy the existing DeDup Normalization Rules to the unused local field
    5. Edit the rule of the DeDup vector (e.g. dedup:f3)
    6. Disable all existing rules
    7. Under Create new source mapping click Create
    8. Set the rule as follows:
      Source:
        Type: PNX
        Value: <the unused local field, e.g. display/lds50>

       

      Conditions: None

      Transformation: GetHeadTail
      Parameter: 3000@@1000

       

    9. Click Save then Go Back to save the changes
    10. Click Go Back to return to the list of Normalization Rule Sets
    11. Click Deploy next to the Normalization Rule Set that was just updated
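    For the many-small-fields case the idea is the same, applied after the values have been gathered into the intermediate local field. In the hypothetical sketch below, the list of ISBNs and the ";" separator stand in for whatever the copied rules write to the unused local field (e.g. display/lds50); Primo's actual internal handling may differ.

      import java.util.Collections;
      import java.util.List;

      // Illustrative sketch of the second approach: combine the many ISBN values into
      // one string (the local field), then apply the same 3000/1000 head-tail trim
      // before the value is used for the dedup:f3 vector.
      public class CombineAndTrimSketch {

          static String headTail(String value, int head, int tail) {
              if (value.length() <= head + tail) {
                  return value;
              }
              return value.substring(0, head) + value.substring(value.length() - tail);
          }

          public static void main(String[] args) {
              // Stand-in for a record carrying hundreds of ISBN-13 values.
              List<String> isbns = Collections.nCopies(400, "9780000000000");
              String combined = String.join(";", isbns); // value of the unused local field
              String vector = headTail(combined, 3000, 1000);
              System.out.println(vector.length());       // at most 4000 characters
          }
      }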

     


    • Article last edited: 07-Feb-2017