I have a table in a postgres database that's being used in a testing environment where we need to add and drop several columns at a time. The problem is that postgres has a maximum of 1600 columns per table, and this count includes dropped columns. My table will never have 1600 live (un-dropped) columns at once, but over time the dropped columns accumulate until the total passes 1600.
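For reference, here's a quick way to see the dropped columns that still count toward the limit (assuming a table named `my_table`; the name is a placeholder):

```sql
-- List all column slots for my_table, including dropped ones.
SELECT attnum, attname, attisdropped
FROM pg_attribute
WHERE attrelid = 'my_table'::regclass
  AND attnum > 0
ORDER BY attnum;
-- Dropped columns show attisdropped = true and a placeholder name like
-- "........pg.dropped.5........"; their attnum slots are never reused,
-- which is why they keep counting toward the 1600-column limit.
```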
I've tried `VACUUM` and `VACUUM FULL`, and I've tried recasting an existing column as its own type (`ALTER TABLE table ALTER COLUMN anycol TYPE anytype`) to get postgres to rewrite all of the columns and reclaim the space from the dropped ones, but none of these reset postgres's internal column counter.
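Concretely, this is what I ran (`my_table` and `anycol` are placeholder names), plus the check showing the dropped entries are still there afterwards:

```sql
VACUUM my_table;
VACUUM FULL my_table;
-- Re-cast a column to its existing type to force a pass over the table
-- (assumes anycol is already text; use whatever its current type is):
ALTER TABLE my_table ALTER COLUMN anycol TYPE text;

-- Even after all of the above, the dropped column entries remain:
SELECT count(*)
FROM pg_attribute
WHERE attrelid = 'my_table'::regclass
  AND attisdropped;
```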
I know this could be solved by copying the entire table, but that approach has its own issues, which I've asked about in a separate question.
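For completeness, this is the copy-based reset I mean (placeholder names again); it works because the new table gets fresh `pg_attribute` rows, but it's not a drop-in fix:

```sql
BEGIN;
-- Copy the data into a new table; its columns get fresh attnum slots.
CREATE TABLE my_table_new AS TABLE my_table;
DROP TABLE my_table;
ALTER TABLE my_table_new RENAME TO my_table;
COMMIT;
-- Note: CREATE TABLE ... AS does not carry over indexes, constraints,
-- defaults, triggers, or grants, which is part of why this has its own issues.
```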
Do you know of a way to make postgres forget it had dropped columns?
I know postgres wasn't designed for a use case like this, but I'm not going to get into why we chose to implement it this way. If you know of an alternative tool, I'd be interested to hear about it, but I'd still like to find a solution to this.