
I am building a program that needs a global search function. For example: a user enters a string into a search field and clicks a button. The button has to trigger a function that searches through all the tables and columns in my database (about 20 tables) and returns the row(s) containing the string.

The problem is: what is the best way to do this, performance-wise? At the moment I am thinking of one SELECT statement per table (`SELECT * FROM table WHERE column1 LIKE '%searchstring%' OR column2 LIKE '%searchstring%' ...`). That makes about 20 SELECT statements, and I don't know how fast this would be if I ran them in a for-loop.
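A minimal sketch of that per-table loop, assuming a hard-coded map of tables to searchable columns (the table and column names below are hypothetical). It uses PDO prepared statements so the search string is bound as a parameter rather than concatenated into the SQL:

```php
<?php
// Build one SELECT per table, OR-ing a LIKE predicate per searchable column.
// The search term itself is passed as a bound parameter (:term), not inlined.
function buildSearchQuery(string $table, array $columns): string
{
    // One "`col` LIKE :term" predicate per column, joined with OR.
    $predicates = array_map(function (string $col) {
        return "`$col` LIKE :term";
    }, $columns);

    return "SELECT * FROM `$table` WHERE " . implode(' OR ', $predicates);
}

// Hypothetical tables and their searchable columns:
$searchable = [
    'customers' => ['name', 'email'],
    'orders'    => ['reference', 'comment'],
];

// Usage sketch (connection details are placeholders):
// $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', $user, $pass);
// $results = [];
// foreach ($searchable as $table => $columns) {
//     $stmt = $pdo->prepare(buildSearchQuery($table, $columns));
//     $stmt->execute([':term' => '%' . $searchString . '%']);
//     $results[$table] = $stmt->fetchAll(PDO::FETCH_ASSOC);
// }
```

Note that `LIKE '%…%'` cannot use a normal index, so each of the 20 queries is a full table scan; that is the main performance cost, not the loop itself.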

Any suggestions and ideas are welcome!

PS: I am using PHP 7 and MySQL to achieve this.

Other answers I have found didn't give me enough information.

asked by ramon abacherli, edited by BenRoob
    I would consider changing the db design instead of querying through 20 tables (and even more in future). – Darshan Mehta Nov 13 '17 at 14:10
  • 1
    You can list all columns in the database: `SELECT COLUMN_NAME, TABLE_NAME FROM INFORMATION_SCHEMA.COLUMNS` But something is definitely wrong if you need to do this. – vaso123 Nov 13 '17 at 14:11
  • There is no good way to do what you want. Maybe you could use Sphinx Search for full-text search, but either way you need to narrow down the tables/columns you want to search in – Daniel E. Nov 13 '17 at 14:30
  • You can have the statements built dynamically by querying system tables, but you will still be executing at least 20 statements, one for each table. I would agree with @DarshanMehta that perhaps your schema is the issue here. OR perhaps you might try a separate table of search terms (although that may get difficult to manage). – JNevill Nov 13 '17 at 14:33
  • how about this? https://stackoverflow.com/a/47225201/916000 – Taha Paksu Nov 13 '17 at 14:48
  • Thanks for the tips everyone – ramon abacherli Nov 13 '17 at 14:56
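The dynamic approach suggested in the comments (building the statements from system tables) could be sketched like this: query `INFORMATION_SCHEMA.COLUMNS` once for the text-typed columns, group them by table, and then feed that map into the per-table LIKE loop. The database name `mydb` and the connection details are placeholders:

```php
<?php
// Sketch of the comments' suggestion: discover searchable (text) columns
// via INFORMATION_SCHEMA instead of hard-coding them.

/** Group INFORMATION_SCHEMA rows into ['table' => ['col1', 'col2', ...]]. */
function groupColumnsByTable(array $rows): array
{
    $byTable = [];
    foreach ($rows as $row) {
        $byTable[$row['TABLE_NAME']][] = $row['COLUMN_NAME'];
    }
    return $byTable;
}

// Usage sketch ('mydb' and the credentials are placeholders):
// $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', $user, $pass);
// $stmt = $pdo->prepare(
//     "SELECT TABLE_NAME, COLUMN_NAME
//        FROM INFORMATION_SCHEMA.COLUMNS
//       WHERE TABLE_SCHEMA = :db
//         AND DATA_TYPE IN ('char', 'varchar', 'text')"
// );
// $stmt->execute([':db' => 'mydb']);
// $columnsByTable = groupColumnsByTable($stmt->fetchAll(PDO::FETCH_ASSOC));
// One LIKE query per table can then be built from $columnsByTable.
```

This keeps the search working as tables are added, but it still executes one statement per table; restricting `DATA_TYPE` to text columns at least avoids pointless LIKE comparisons on numeric or date columns.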

0 Answers