I have no experience at all with SQL/Postgres indexing, so I'm not even sure how to formulate my question correctly. But here goes...
In one of our JpaRepositories we have a query that looks like this:
@Query(nativeQuery = true, value =
    "SELECT * FROM instrument i " +
    "LEFT JOIN instrument_profile ip ON ip.instrument_id = i.id " +
    "WHERE (ip.profile IS NULL OR upper(ip.profile) = :assetMgr) " +
    "AND i.transaction_type = :transactionType " +
    "AND i.customer_id = :customerId " +
    "AND (i.bloomberg_id ILIKE %:key% OR " +
    "     i.instrument_short ILIKE %:key% OR " +
    "     i.instrument_name ILIKE %:key% OR " +
    "     i.cusip ILIKE %:key% OR " +
    "     i.sedol ILIKE %:key%) " +
    "LIMIT :limit")
The query works fine, but we are now looking for ways to optimize performance overall, and table indexing is one of them. The thing is, I have no clue how to index a table for this kind of query: partly because it involves a join table, and partly because we are searching for a single "search key" value across multiple columns. This particular table is expected to contain a lot of records, so if we can't create an index that supports the above, we could be forced to change the entire query.
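For context, the only approach I've come across so far is trigram (pg_trgm) GIN indexes for the `ILIKE '%key%'` parts, plus plain B-tree indexes for the equality filters. Something like the sketch below (index names are made up, and I have no idea whether one index per searched column is the right approach, or whether the join to instrument_profile changes anything):

```sql
-- pg_trgm is needed for trigram indexes (assuming it isn't installed already)
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- B-tree index for the equality filters in the WHERE clause
CREATE INDEX idx_instrument_customer_txtype
    ON instrument (customer_id, transaction_type);

-- Trigram GIN indexes so that ILIKE '%key%' can use an index scan;
-- presumably one per column that is searched?
CREATE INDEX idx_instrument_name_trgm
    ON instrument USING gin (instrument_name gin_trgm_ops);
CREATE INDEX idx_instrument_short_trgm
    ON instrument USING gin (instrument_short gin_trgm_ops);

-- Index on the join column of the profile table
CREATE INDEX idx_instrument_profile_instrument_id
    ON instrument_profile (instrument_id);
```

I haven't tried this yet, so I don't know if the planner would even use these indexes given the OR conditions across multiple columns.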
Any suggestions or pointers on how to approach this would be very much appreciated.
/Kat