I created a model with rails g model project name (plus several more columns).
Whenever I run EXPLAIN on the resulting SQL query, Postgres never uses the index. Why is that?
my_db=# \d projects
Table "public.projects"
Column | Type | Collation | Nullable | Default
-------------+-----------------------------+-----------+----------+--------------------------------------
id | integer | | not null | nextval('projects_id_seq'::regclass)
name | character varying | | |
active | boolean | | not null | true
team_id | integer | | |
created_at | timestamp without time zone | | not null |
updated_at | timestamp without time zone | | not null |
account_id | integer | | |
Indexes:
"projects_pkey" PRIMARY KEY, btree (id)
"index_projects_on_account_id" btree (account_id)
"index_projects_on_team_id" btree (team_id)
Foreign-key constraints:
"fk_rails_b4884d7210" FOREIGN KEY (account_id) REFERENCES accounts(id)
"fk_rails_ecc227a0c2" FOREIGN KEY (team_id) REFERENCES teams(id)
Referenced by:
TABLE "products" CONSTRAINT "fk_rails_7327dd6d39" FOREIGN KEY (project_id) REFERENCES projects(id)
And when I run the query, I get this:
my_db=# explain select name from projects where id = 28;
QUERY PLAN
---------------------------------------------------------
Seq Scan on projects (cost=0.00..2.60 rows=1 width=32)
Filter: (id = 28)
(2 rows)
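One sanity check I can think of is disabling sequential scans for the session (enable_seqscan is a standard Postgres planner setting; the commented plan is what I would expect to see, not verbatim output):

my_db=# SET enable_seqscan = off;   -- tell the planner to avoid seq scans where possible
my_db=# explain select name from projects where id = 28;
-- if this shows "Index Scan using projects_pkey on projects",
-- the index is usable and the planner simply preferred the seq scan
my_db=# SET enable_seqscan = on;    -- restore the default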
I ask ActiveRecord about the index and get false:
ActiveRecord::Base.connection.index_exists?(:projects, :id)
=> false
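Asking Postgres directly does list the primary key index (pg_indexes is a standard system view; the comments are my expectation of the output, not actual results):

my_db=# select indexname, indexdef from pg_indexes where tablename = 'projects';
-- expect three rows, including projects_pkey,
-- even though index_exists? above returned false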
projects_pkey is right there under Indexes, and yet Postgres is doing a Seq Scan on projects. I have tried changing the query to search for ids greater than 100000 or less than 100000. It doesn't matter; it always uses a Seq Scan.
Can someone explain why it isn't using the index? On a small dataset this is fine, but the question came about from a query that joins many tables: running EXPLAIN on that query shows a Seq Scan for each of those tables as well, which got me asking. (The query joins about 10 tables, with a few through tables. The dataset is not large, just wide.) A sketch of the experiment I have in mind is below.
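In case it helps, here is how I would test whether table size is the deciding factor (generate_series is standard Postgres; the 100000 row count is an arbitrary choice on my part, and the commented plan is a guess):

my_db=# insert into projects (name, active, created_at, updated_at)
          select 'p' || g, true, now(), now()
          from generate_series(1, 100000) as g;
my_db=# analyze projects;   -- refresh planner statistics
my_db=# explain select name from projects where id = 28;
-- my guess: with this many rows the plan switches to
-- "Index Scan using projects_pkey on projects"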