The setup:
My Rails 3.2.12 app uses RSpec and FactoryGirl for testing most of its various model methods.
I use Postgres for production, development, and test.
Problem:
One model, ZipCodes, has about 40,000 rows including longitude/latitude data, and lots of methods for selecting records that meet certain criteria.
I need to test those methods against the real-world data. Because of the table's size, re-loading it on every test run is far too slow.
Question:
How can I load that static ZipCodes table once, via the terminal, console, or a rake task, and then leave it alone unless/until the data changes (every few months we might add a few zip codes)? I also need to keep DatabaseCleaner from erasing it after each test run, although that part was answered in another question: I can use :except => [tablename].
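For anyone landing here with the same question, the DatabaseCleaner exclusion mentioned above typically looks like this in spec_helper.rb. This is a minimal sketch: the table name zip_codes is an assumption (use whatever your model's table is actually called), and the truncation strategy is just one option.

```ruby
# spec/spec_helper.rb -- sketch, assuming the table is named "zip_codes"
RSpec.configure do |config|
  config.before(:suite) do
    # Truncate everything between runs EXCEPT the static zip_codes table
    DatabaseCleaner.strategy = :truncation, { :except => %w[zip_codes] }
  end

  config.before(:each) { DatabaseCleaner.start }
  config.after(:each)  { DatabaseCleaner.clean }
end
```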
I have the data in CSV and YAML formats now, but can move it to any other format if necessary.
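Since the data is already in CSV, one approach is a one-off rake task or console script that parses the file and bulk-inserts the rows into the test database. Here is a minimal, hedged sketch of the parsing half using Ruby's standard CSV library; the column names (zipcode, latitude, longitude) are assumptions, and the resulting hashes could be fed to ZipCode.create!, activerecord-import, or a raw multi-row INSERT.

```ruby
require "csv"

# Hypothetical helper: turn a zip-code CSV (assumed columns:
# zipcode, latitude, longitude) into an array of attribute hashes
# ready for a bulk insert.
def parse_zip_codes(csv_text)
  CSV.parse(csv_text, :headers => true).map do |row|
    {
      :zipcode   => row["zipcode"],
      :latitude  => row["latitude"].to_f,
      :longitude => row["longitude"].to_f,
    }
  end
end

sample = <<CSV
zipcode,latitude,longitude
90210,34.0901,-118.4065
10001,40.7506,-73.9971
CSV

rows = parse_zip_codes(sample)
puts rows.length           # => 2
puts rows.first[:zipcode]  # => "90210"
```

Run it once against the test database (e.g. RAILS_ENV=test bundle exec rake zip_codes:load) and, with the DatabaseCleaner exclusion in place, the rows persist across test runs. Note that rake db:test:prepare rebuilds the test schema, so the data would need re-loading after that.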
I also have the data loaded into my development database, if there's some way to copy it from the dev to test databases.
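Copying the table from the development database is also possible with Postgres's own tools. A sketch, assuming the databases are named myapp_development and myapp_test and the table is zip_codes (adjust names, hosts, and credentials to your setup):

```shell
# Dump only the rows of the zip_codes table from the dev database
# and pipe them straight into the test database.
pg_dump --data-only --table=zip_codes myapp_development | psql myapp_test
```

Because this is a data-only dump, the test database must already have the zip_codes table (e.g. after rake db:test:prepare), and since the dump includes primary keys, this relies on the point below that nothing associates off of them.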
(Note: we do not use the primary key for ANY associations; we do all lookups by other fields like zipcode or longitude. So it doesn't matter if the method of loading the data into test brings over primary keys from my development database.)