The update code triggers one insert per ngram. This leads to as many insert requests as there are ngrams to be indexed, which becomes terribly slow when doing batch updates (especially if one forgets to implement the `update_if` option).
I suggest the following (a sketch of both ideas follows the list):

- Use a bulk write to add all the ngrams in a single request.
- (might be an idea for a new feature) Provide a default implementation of `update_if` based on dirty tracking: if the indexed fields are Mongoid fields, the `_changed?` methods can be used to know whether an update is needed.
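For illustration, here is a minimal sketch of both ideas against a plain Mongoid model. Everything here is an assumption for the example, not the gem's actual internals: the `artwork_search_index` collection, the trigram helper, and the callback wiring.

```ruby
class Artwork
  include Mongoid::Document
  field :title, type: String

  # Hypothetical trigram helper; the real gem computes ngrams internally.
  def title_ngrams
    (0..title.to_s.length - 3).map { |i| title[i, 3] }
  end

  # Rebuild this document's index entries in two round trips total,
  # instead of one insert per ngram. `artwork_search_index` is a
  # hypothetical index collection used for this sketch.
  def rebuild_search_index!
    index = Mongoid.default_client[:artwork_search_index]
    index.delete_many(document_id: id)
    docs = title_ngrams.map { |ng| { document_id: id, ngram: ng } }
    index.insert_many(docs) unless docs.empty?
  end

  # Default update_if via dirty tracking: capture whether the indexed
  # field changed before the save, reindex afterwards only if it did.
  before_save { @needs_reindex = title_changed? }
  after_save  { rebuild_search_index! if @needs_reindex }
end
```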
Startouf changed the title from "Bulk indexing - Saving a model with a long indexable name triggers as many queries as the name length" to "Bulk indexing - Saving a model with a long indexable name triggers as many queries as the number of ngrams" on Sep 3, 2020
100%. The searching code performance isn't great either. It generates one query for every ngram in the search term instead of a single query with multiple OR conditions, and then one more query for every search result to fetch the original object. Highly inefficient.
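A hedged sketch of what a two-round-trip search could look like, reusing the same hypothetical `artwork_search_index` collection as above: one `$in` query over all search ngrams, then one `$in` query to fetch every matched document at once.

```ruby
# Hypothetical two-round-trip search: assumes the artwork_search_index
# collection sketched above, with { document_id:, ngram: } entries.
def search(model, term)
  ngrams = (0..term.length - 3).map { |i| term[i, 3] }
  index  = Mongoid.default_client[:artwork_search_index]

  # Round trip 1: match every search ngram at once with a single $in filter.
  hits = index.find(ngram: { '$in' => ngrams }).to_a

  # Rank candidates by how many of the search ngrams they matched.
  ranked_ids = hits.group_by { |h| h['document_id'] }
                   .sort_by { |_id, rows| -rows.size }
                   .map(&:first)

  # Round trip 2: fetch all matched documents in one query, keep the ranking.
  by_id = model.in(_id: ranked_ids).index_by(&:id)
  ranked_ids.map { |id| by_id[id] }.compact
end
```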