In my data there are around 1M rows, each with around 60-70 columns. However, only a few rows (20-30) have the columns beyond column 30 filled in, i.e., the table is sparse. The columns beyond 30 are also rarely queried.
Does "number of columns" impact performance?
Should I make two tables: one with the first 30 columns, and a second that keeps the original full structure? Or should I keep the original structure as-is?
Table schema:

    CREATE TABLE entity_table (
        entity_id int,
        tag_1  text,
        tag_2  text,
        . . .
        tag_30 text,   -- up to column 30 the table is dense
        tag_31 text,
        . . .
        tag_70 text    -- sparse columns
    );
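If I do split it, this is the two-table layout I have in mind (just a sketch; the `entity_dense`/`entity_sparse` names and the primary/foreign keys are my assumptions, not part of the current schema):

```sql
-- Hypothetical split: dense columns stay in the main table,
-- the rarely-filled tail columns move to a side table keyed by entity_id.
CREATE TABLE entity_dense (
    entity_id int PRIMARY KEY,
    tag_1  text,
    -- ... tag_2 through tag_29 ...
    tag_30 text
);

CREATE TABLE entity_sparse (
    entity_id int PRIMARY KEY REFERENCES entity_dense (entity_id),
    tag_31 text,
    -- ... tag_32 through tag_69 ...
    tag_70 text
);

-- The rare queries that need the tail columns would join back:
SELECT d.tag_1, s.tag_31
FROM entity_dense d
LEFT JOIN entity_sparse s USING (entity_id);
```

Only the 20-30 rows that actually have values beyond column 30 would get a row in `entity_sparse`.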
Also, does the type of these columns affect performance? And does Postgres index NULL values? If so, how can I prevent that?
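For context on the NULL question: my understanding is that Postgres B-tree indexes do store entries for NULLs, and a partial index is what I was considering to keep the mostly-NULL sparse columns out of the index (a sketch; the index name is made up):

```sql
-- Partial index: only rows where tag_31 is non-NULL get an index entry,
-- so the ~1M rows with NULL in tag_31 contribute nothing to the index.
CREATE INDEX entity_tag_31_idx
    ON entity_table (tag_31)
    WHERE tag_31 IS NOT NULL;
```

Is this the right approach, or is there a better way?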