- Product: Voyager
- Product Version: All
- Relevant for Installation Type: Multi-Tenant Direct, Dedicated-Direct, Local, Total Care
When importing records, what is the maximum number of records that should be imported in a single file?
For optimum import performance, import 10,000 records (or fewer) at a time. If your record file is larger than 10,000 records, break it into smaller sets of records (using the -b and -e parameters) and import them one after the other.
Sites may elect to import more than 10,000 records at a time, but this is not a recommended best practice and should be done judiciously (e.g., increase the batch size incrementally rather than jumping from 10,000 records to 100,000).
If issues occur with larger batches, Support's first troubleshooting step will be to ask the site to return to importing 10,000 records or fewer and verify whether the issue still occurs at the recommended batch size.
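The -b/-e batching described above can be sketched as a small helper that computes begin/end record numbers for each batch. This is an illustrative sketch, not actual Voyager tooling: the helper name is invented, the 10,000-record batch size follows the recommendation in this article, and the command line printed at the end is a placeholder, not exact bulkimport syntax.

```python
def batch_ranges(total_records, batch_size=10000):
    """Yield (begin, end) record numbers, 1-based and inclusive,
    covering total_records in batches of at most batch_size."""
    begin = 1
    while begin <= total_records:
        end = min(begin + batch_size - 1, total_records)
        yield (begin, end)
        begin = end + 1

# Example: a 25,000-record file yields three batches.
for b, e in batch_ranges(25000):
    # Placeholder invocation; consult the Voyager documentation
    # for the full bulkimport argument list.
    print(f"bulkimport ... -b{b} -e{e}")
```

Each (begin, end) pair would then be supplied to one bulkimport run, so a 25,000-record file becomes three imports: records 1-10000, 10001-20000, and 20001-25000.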
This rule of thumb of limiting imports to groups of 10,000 records (or fewer) also applies to Global Data Change, which uses Bulk Import.
Consider opening a Case with Support before embarking on a large import project. We can offer additional advice about Oracle, tablespace, and other topics that can help ensure your project is a success.
This advice is based on an understanding that in certain bulk imports (depending on the arguments used in the import rule and duplicate detection profile), bulkimport may leak memory, which can lead to program failure. If the operation you are performing does not leak memory, or does not leak too much memory, it will succeed. There is no easy way to monitor whether your program is leaking memory.
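While the program cannot easily report its own leaks, a coarse external check is possible on a Unix server: sample the resident memory of the running process and watch whether it grows steadily during an import. This is a generic sketch using the standard ps utility via Python's stdlib, not a Voyager feature; the process id of the bulkimport run is assumed to be known.

```python
import subprocess

def rss_kb(pid):
    """Return the resident set size (in KB) of a process, via
    standard 'ps -o rss='. Raises if the process does not exist."""
    out = subprocess.check_output(["ps", "-o", "rss=", "-p", str(pid)])
    return int(out.strip())

# Example: sample our own process. During a real import you would
# sample the bulkimport pid periodically and watch for steady growth.
import os
print(rss_kb(os.getpid()))
```

A steadily climbing RSS across batches is a hint to reduce the batch size back toward the recommended 10,000 records before the process fails outright.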
One suggested course of action is to use the training database for benchmark testing. This should provide a fair estimate of how similar imports will perform in production, what the boundaries are for your particular environment, and provide an opportunity to adjust any import plans before your production environment is impacted.
- Article last edited: 22-Jan-2019