I am developing an e-commerce project in Angular where I need to implement search functionality. I have heard about Algolia and Elasticsearch, which can be used as third-party services for indexing the database.
In the small-scale applications I have built so far, what I usually do is maintain an array of string combinations (characters and prefixes) extracted from the actual string. To give an example: suppose I am storing data for a product with fields such as product name, product category, product department, etc.
From these fields I build combinations of strings. If product name = "Some Product", product category = "Some Category" and product department = "Some department", then my result array looks like [S, o, m, e, So, Som, Some, Some P, Some Pro, Some Prod, Some Produ, Some Produc, Some Product, Product, P, r, o, d, u, c, t, and so on for the other fields too]. To fetch results on the client side, I accept a string from the user, check whether that string exists in the array of a specific document (or set of documents), and return the matches.
"select name from PRODUCTS where productname LIKE '%" + keyword + "%'";
I am trying to implement the above functionality by creating search tags for my queries.
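To make that concrete, here is a simplified sketch of the tag-generation step. The Product interface and the field names are placeholders of my own choosing, and this variant only generates prefixes of the full string and of each word rather than every combination listed above, but the idea is the same:

```typescript
// Simplified sketch of the tag generation described above.
// The Product interface and the field names are placeholders.
interface Product {
  name: string;
  category: string;
  department: string;
}

// Build prefix tags for one field value, e.g. "Some Product" ->
// ["s", "so", "som", "some", "some p", ..., "some product", "p", "pr", ..., "product"]
function prefixTags(value: string): string[] {
  const tags = new Set<string>();
  const lower = value.toLowerCase();
  // Prefixes of the whole string, so "some pro" matches "Some Product".
  for (let i = 1; i <= lower.length; i++) {
    tags.add(lower.slice(0, i));
  }
  // Prefixes of each individual word, so "product" matches on its own.
  for (const word of lower.split(/\s+/)) {
    for (let i = 1; i <= word.length; i++) {
      tags.add(word.slice(0, i));
    }
  }
  return [...tags];
}

// One combined tag array per document, stored alongside the product.
function buildSearchTags(p: Product): string[] {
  return [
    ...prefixTags(p.name),
    ...prefixTags(p.category),
    ...prefixTags(p.department),
  ];
}
```

A lower-cased user query can then be matched directly against this array, e.g. with Array.prototype.includes on the client side.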
I saw some threads on Stack Overflow about the same problem, and startAt and endAt queries do not solve the issue the way the SQL query does.
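To show why, here is a minimal sketch assuming Cloud Firestore (which is where startAt and endAt come from); the products collection, the productName field and the searchTags array field are names I made up for illustration. The prefix query only matches from the start of the ordered field, while the tag-based query matches any keyword I have precomputed a tag for:

```typescript
import { initializeApp } from 'firebase/app';
import {
  getFirestore, collection, query, orderBy,
  startAt, endAt, where, getDocs,
} from 'firebase/firestore';

const app = initializeApp({ /* Firebase config goes here */ });
const db = getFirestore(app);

// Prefix-only query: matches names that START with the keyword,
// which is why it cannot reproduce LIKE '%keyword%'.
async function searchByPrefix(keyword: string) {
  const q = query(
    collection(db, 'products'),
    orderBy('productName'),
    startAt(keyword),
    endAt(keyword + '\uf8ff'),
  );
  return (await getDocs(q)).docs.map(d => d.data());
}

// Tag-based query: matches any document whose precomputed searchTags
// array contains the keyword, regardless of its position in the name.
async function searchByTag(keyword: string) {
  const q = query(
    collection(db, 'products'),
    where('searchTags', 'array-contains', keyword.toLowerCase()),
  );
  return (await getDocs(q)).docs.map(d => d.data());
}
```

The tag-based query is effectively a whole-tag lookup, so it only works as well as the tags I precompute, which is exactly why I am asking about the bottlenecks below.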
So my question is: if I don't want to use the third-party services mentioned above and instead implement search with my own logic, what will be the bottlenecks of that logic in the future, with big data and high traffic of search requests?
Note that I am not thinking about fuzzy search right now. If my logic is implementable and the bottleneck is tolerable, then in later development phases I can maintain the search database based on customers and their behaviour: when they get no results, track what they open manually, and use machine learning to create new search tags to append to the array used by the search logic.