I have a Postgres table with around 70 million rows. I've never worked with this much data before.
My data are stock options, where the composite key looks like AAPL.US: AAPL (Apple) is the ticker symbol of the stock and US is the region where it's traded.
When I try to run a custom query that filters on the composite key, such as
SELECT * FROM stock_prices WHERE composit LIKE '%.US'
it is extremely slow: 10-15 minutes at the very best, and it often simply crashes because it runs for too long.
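For what it's worth, this is how I've been checking the query plan (I assume the runtime means it's doing a full sequential scan over all 70 million rows, but I'm not sure how to read the output):

EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM stock_prices WHERE composit LIKE '%.US';
-- note: ANALYZE actually executes the query, so this takes as long as the query itself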
I'm trying to find a way to index the table, since it looks like the query could be sped up considerably.
There are only around 70 or so distinct "regions" across the 70 million rows.
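Given that, one idea I've been considering is extracting the region into its own column and indexing that, instead of pattern-matching on the suffix. A rough sketch (region is a hypothetical new column; split_part is Postgres's built-in string function):

ALTER TABLE stock_prices ADD COLUMN region text;
UPDATE stock_prices SET region = split_part(composit, '.', 2);  -- 'AAPL.US' -> 'US'
CREATE INDEX idx_region ON stock_prices (region);
SELECT * FROM stock_prices WHERE region = 'US';

(I realize the one-shot UPDATE over 70 million rows would itself be slow, but it would only have to happen once.)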
Before going that route, I tried indexing the "composite key" directly, but that didn't help much:
CREATE INDEX idx_composit ON stock_prices(composit);
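From what I've read since, a plain B-tree index can't be used for a LIKE pattern with a leading wildcard like '%.US', which would explain why the planner ignores idx_composit. One workaround I've seen suggested is an expression index on the reversed string, so the pattern becomes a prefix match the index can serve (idx_composit_rev is just a name I made up; text_pattern_ops is the built-in operator class that lets LIKE prefix matching use the index under non-C locales):

CREATE INDEX idx_composit_rev ON stock_prices (reverse(composit) text_pattern_ops);
-- '%.US' reversed is 'SU.%', i.e. an anchored prefix match
SELECT * FROM stock_prices WHERE reverse(composit) LIKE 'SU.%';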
Any query on this dataset is painfully slow, so my goal is to 'organize' the data in a better way, so that subsequent simple queries don't take 15+ minutes or crash outright.
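For the 'organizing' part, I've also been reading about declarative list partitioning by region, since there are only ~70 of them. A minimal sketch of what I think that would look like (stock_prices_part is a hypothetical name, and my real table has more columns than shown):

CREATE TABLE stock_prices_part (
    composit text NOT NULL,
    region   text NOT NULL
    -- ... remaining price columns ...
) PARTITION BY LIST (region);

CREATE TABLE stock_prices_us PARTITION OF stock_prices_part
    FOR VALUES IN ('US');

Is something like this the right direction, or would a plain index on a region column be enough?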