  • Ex Libris Knowledge Center

    "Too many entries" in CCL search results

    • Article Type: General
    • Product: Aleph
    • Product Version: 16.02

    Description:
    When I do this CCL search:
    wyr=1970->2009
    I get 51,086 hits.

    But when I do this search:
    wlc=q?
    I get error message "too many entries".

    If I do the search:
    (wyr=1970->2009) and wlc=q?
    I still get "too many entries".

    I have two questions about this. Firstly, I don't understand why the last search hits a limit, because it should match fewer records than the first search (which did not hit a limit). Why is it giving the "too many entries" message?

    Also, the first search gives me over 51,000 hits and the OPAC displays them. Given the limits we have set (set_word_limit = 1000 and set_result_set_limit = 20000), why does this search not seem to hit either of these limits?

    Resolution:
    You will get the "too many entries" message whenever your search includes a term that hits the word limit (set_word_limit). It does not matter that the final result set would be small; the presence of such a term makes the search fail in this way. In your case, the term "wlc=q?" triggers the message in the combined search, just as it does when used by itself. The limit controls how many separate index words may be examined to match a truncated term: for wlc=q?, the words "qatar", "quadrant", "quell", etc. each count separately toward the limit.
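    The behavior above can be sketched conceptually. This is a hypothetical model, not Aleph's actual code: the server expands a truncated term against the word index first, and aborts as soon as the number of matching index words exceeds set_word_limit, before any intersection with other result sets takes place.

```python
# Conceptual sketch of why a truncated CCL term fails on the word limit
# even when the combined result set would be small. All names here are
# illustrative, not Aleph internals.

SET_WORD_LIMIT = 1000  # the effective default when the configured value overflows

# A toy word index: every indexed word starting with "q" counts separately.
word_index = [f"q-word-{i}" for i in range(5000)]  # stands in for qatar, quadrant, quell, ...

def expand_truncated(prefix, limit=SET_WORD_LIMIT):
    """Expand a truncated term (e.g. 'q?') against the word index.

    Fails as soon as more than `limit` distinct index words match --
    before any record retrieval or set intersection happens.
    """
    matches = []
    for word in word_index:
        if word.startswith(prefix):
            matches.append(word)
            if len(matches) > limit:
                raise RuntimeError("too many entries")
    return matches

# "wlc=q?" alone fails:
try:
    expand_truncated("q-")
except RuntimeError as e:
    print(e)  # too many entries

# "(wyr=1970->2009) and wlc=q?" fails the same way: the truncated term
# is expanded first, so the limit is hit before the AND is ever applied.
```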

    You can address this problem, although it may still occur on some searches. The variable that controls this message is set_word_limit, which you can set in pc_server_defaults (for the GUI clients) and in www_server.conf (for the web OPAC), both in the $alephe_root directory. You currently have this variable set to 10000 in both files. Unfortunately, the variable is limited to four digits, so its maximum value is 9999. By setting it to 10000 you have exceeded the maximum, so the system silently falls back to the default, which is 1000.

    If you reduce the value to 9999, or to some other value below 10000, you may find that searches like "wlc=q?" now return results. Such searches will, of course, use more system resources than they do now, and relatively meaningless searches like "wrd=a?" will consume even more resources before failing anyway. Try resetting the value and observing the results. If the system is noticeably slowed by the change, reduce the value to, say, 5000 or 3000 and see whether that works better for you. In any case, you must restart the www_server or pc_server after changing the value before it takes effect.
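    The fix above can be sketched as a shell session. This demo runs against a throwaway sample file rather than your live configuration, and the "setenv set_word_limit" line format is an assumption about these config files; check the exact syntax in your own pc_server_defaults and www_server.conf before editing them.

```shell
# Illustrative only: a stand-in for $alephe_root/www_server.conf.
conf=$(mktemp)
printf 'setenv set_word_limit 10000\n' > "$conf"

# Find the current value (on a real system, run this against both
# www_server.conf and pc_server_defaults):
grep -n "set_word_limit" "$conf"

# 10000 overflows the four-digit field and silently falls back to 1000;
# cap it at the real maximum, 9999:
sed -i 's/set_word_limit 10000/set_word_limit 9999/' "$conf"
grep "set_word_limit" "$conf"   # setenv set_word_limit 9999
```

    Remember that, as noted above, the change only takes effect after the www_server or pc_server is restarted.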

    The reason your other search does not hit the set_word_limit maximum is that set_word_limit comes into play only for truncated terms. Since "wyr=1970->2009" contains no truncated term, set_word_limit does not apply; only the overall result limit, set_result_set_limit, does. That limit caps the number of records that will actually display at 20000: although the search finds more than 51,000 records, only the first 20,000 of them are displayed.
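    The distinction between the two limits can be summarized in a few lines (a conceptual sketch, not Aleph code): a non-truncated search skips the word-limit check entirely, and only the display cap applies to its result set.

```python
# Sketch of the other limit: wyr=1970->2009 has no truncated term, so
# set_word_limit never applies; only set_result_set_limit caps display.
SET_RESULT_SET_LIMIT = 20000

hits = list(range(51086))              # the 51,086 matching records
displayed = hits[:SET_RESULT_SET_LIMIT]

print(len(hits))       # 51086 -- the reported hit count
print(len(displayed))  # 20000 -- the records actually shown in the OPAC
```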


    • Article last edited: 10/8/2013