
I have a Postgres table with around 70 million rows. I've never worked with this much data before.

My data are stock options, where the composite key looks like AAPL.US: AAPL (Apple) is the stock symbol and US is the region where it's traded.

When I run a query that filters on the composite key, such as

SELECT * FROM stock_prices WHERE composit LIKE '%.US'

it is extremely slow, 10-15 minutes at the very best, and it often simply crashes because it takes too long.

I'm trying to find a way to index the table, since it seems like this could be sped up.

There are only around 70 or so distinct "regions" across the 70 million rows.

I've tried indexing the composite key like this, but it didn't help much:

CREATE INDEX idx_composit ON stock_prices(composit);

Any query on the dataset is very slow, to say the least, so my goal is to organize the data in a better way for subsequent queries, so that a simple query no longer takes 15+ minutes or crashes.
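For context on why the index above doesn't help: a pattern with a leading wildcard like '%.US' cannot use a plain btree index, so Postgres falls back to scanning every row. You can confirm what the planner is doing with EXPLAIN (a sketch; the exact plan output varies by setup):

```sql
-- Sketch: EXPLAIN shows the planner's chosen strategy without running
-- the query. With a leading-wildcard LIKE, expect a sequential scan,
-- which reads all ~70 million rows regardless of idx_composit.
EXPLAIN SELECT * FROM stock_prices WHERE composit LIKE '%.US';
```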

mich.zt commented (May 13 '19 at 12:15): Try the following page: PostgreSQL LIKE query performance variations https://stackoverflow.com/questions/1566717/postgresql-like-query-performance-variations

1 Answer

You could try making the predicate SARGable by using an expression index on the reversed string:

CREATE INDEX idx ON stock_prices(REVERSE(composit));

SELECT * FROM stock_prices WHERE REVERSE(composit) LIKE REVERSE('%.US');

db<>fiddle demo
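One caveat worth adding (an assumption about your database's configuration, not something the answer states): in PostgreSQL, a btree index only supports LIKE prefix matching when the collation is C; under other collations the index needs the text_pattern_ops operator class. A sketch combining that with the reversed-expression approach, using a hypothetical index name idx_rev:

```sql
-- Sketch: expression index on the reversed key, declared with
-- text_pattern_ops so the planner can use it for LIKE prefix
-- searches even under a non-C collation.
CREATE INDEX idx_rev ON stock_prices (REVERSE(composit) text_pattern_ops);

-- REVERSE('%.US') evaluates to 'SU.%', a prefix pattern the
-- reversed index can serve instead of a full sequential scan.
SELECT * FROM stock_prices WHERE REVERSE(composit) LIKE 'SU.%';
```

Re-running EXPLAIN after creating such an index should show an index or bitmap scan rather than a sequential scan if the planner is able to use it.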

Lukasz Szozda