
I have a Pylons project and a SQLAlchemy model that implements schema qualified tables:

import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Hockey(Base):
    __tablename__ = "hockey"
    __table_args__ = {'schema': 'winter'}
    hockey_id = sa.Column(sa.types.Integer, sa.Sequence('score_id_seq', optional=True), primary_key=True)
    baseball_id = sa.Column(sa.types.Integer, sa.ForeignKey('summer.baseball.baseball_id'))

This code works great with PostgreSQL but fails with SQLite on the table and foreign key names (due to SQLite's lack of schema support):

sqlalchemy.exc.OperationalError: (OperationalError) unknown database "winter" 'PRAGMA "winter".table_info("hockey")' ()

I'd like to continue using SQLite for dev and testing.

Is there a way to have this fail gracefully on SQLite?

Chris Reid
  • What difficulties do you have with Postgres on your dev and test machines? – Milen A. Radev Apr 21 '10 at 19:31
  • I would make your setup simpler. Make use of Postgres from end to end. Don't test on SQLite and release on Postgres. – Kuberchaun Apr 21 '10 at 19:48
  • Sticking with SQLite makes for an easier workflow, especially for the QA folks, as they don't have to be aware of their testing db. Just going with Postgres everywhere is a solid plan "B". – Chris Reid Apr 21 '10 at 21:12
  • Have you tried using `engine.execute("attach database '{db}' as winter;".format(db=_SL_FILE))` to hack around this problem? BTW, I agree with StarShip3000's comment regarding PostgreSQL. – stephan Apr 27 '10 at 17:57
  • @stephan How would one do that with an in-memory SQLite DB? – Chris R Aug 04 '10 at 22:10
  • @Chris: to attach a *newly created* in-memory db to an existing (in-memory or file) db, you just run `engine.execute("attach database ':memory:' as db_name;")`. I don't know of a way to attach an *already existing* in-memory db (attaching an existing file-based db to an in-memory one is no problem). So you basically have to change the order of creation: attach the in-memory db first (which creates a new one), and then create tables for this new in-memory db and fill it with data as needed. – stephan Aug 05 '10 at 05:54
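
A minimal sketch of the attach-first ordering stephan describes, assuming the `Base` declarative model from the question, the schema names `winter` and `summer`, and a recent SQLAlchemy (1.4+); whether SQLite accepts the cross-schema foreign key is a separate concern, this only shows the order of operations:

import sqlalchemy as sa

# In-memory SQLite; the default SingletonThreadPool reuses one DBAPI
# connection per thread, so the attachments below stick around.
engine = sa.create_engine("sqlite://")

with engine.begin() as conn:
    # Attach a brand-new in-memory database under each schema name first...
    conn.execute(sa.text("ATTACH DATABASE ':memory:' AS winter"))
    conn.execute(sa.text("ATTACH DATABASE ':memory:' AS summer"))
    # ...then create the schema-qualified tables into the attached databases.
    Base.metadata.create_all(conn)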

4 Answers


I'd like to continue using SQLite for dev and testing.

Is there a way to have this fail gracefully on SQLite?

It's hard to know where to start with that kind of question. So . . .

Stop it. Just stop it.

There are some developers who don't have the luxury of developing on their target platform. Their life is a hard one--moving code (and sometimes compilers) from one environment to the other, debugging twice (sometimes having to debug remotely on the target platform), gradually coming to an awareness that the gnawing in their gut is actually the start of an ulcer.

Install PostgreSQL.

When you can use the same database environment for development, testing, and deployment, you should.

Not to mention the QA team. Why on earth are they testing stuff they're not going to ship? If you're deploying on PostgreSQL, assure the quality of your work on PostgreSQL.

Seriously.

Mike Sherrill 'Cat Recall'
  • My edit came across as a new answer. This answer has my full text. The other is missing my conspicuous concern for all the fine people in QA. – Mike Sherrill 'Cat Recall' Jan 23 '11 at 00:04
  • If a server that everyone can access for development and testing is available, PostgreSQL could be installed there instead of locally on all machines. This would ease the burden on testers since they wouldn't need to install it. – jpmc26 Dec 19 '12 at 00:52
  • A big part of this question is unit tests; creating a clean db for every test is not really feasible with Postgres. Maybe this is one of the reasons why people keep seeking an answer and cannot stop. – Gabor Jan 24 '23 at 11:27

I'm not sure if this works with foreign keys, but one could try SQLAlchemy's Multi-Tenancy Schema Translation for Table objects. It worked for me, but I used custom primaryjoin and secondaryjoin expressions in combination with composite primary keys.

The schema translation map can be passed directly to the engine creator:

...

if dialect == "sqlite":
    url = lambda: "sqlite:///:memory:"
    execution_options={"schema_translate_map": {"winter": None, "summer": None}}
else:
    url = lambda: f"postgresql://{user}:{pass}@{host}:{port}/{name}"
    execution_options=None

engine = create_engine(url(), execution_options=execution_options)

...

Here is the doc for create_engine. There is another question on SO which might be related in that regard.
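
For completeness, a rough usage sketch of the translated engine with the model from the question (`Base` and `Hockey` are the question's declarative base and class; the session code is only illustrative and assumes SQLAlchemy 1.4+):

from sqlalchemy.orm import Session

# With the schema_translate_map in place, the "winter."/"summer." prefixes
# are dropped on SQLite, so create_all and queries hit plain table names.
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Hockey(hockey_id=1))
    session.commit()
    print(session.query(Hockey).count())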

But one might get colliding table names if all schema names are mapped to None.

P. B.

I know this is a 10+ year old question, but I ran into the same problem recently: Postgres in production and sqlite in development.

The solution was to register an event listener for when the engine calls the "connect" method.

import sqlalchemy

@sqlalchemy.event.listens_for(engine, "connect")
def connect(dbapi_connection, connection_record):
    # Attach the extra database file under the schema name the model expects.
    dbapi_connection.execute('ATTACH "your_data_base_name.db" AS "schema_name"')

Running the ATTACH statement only once will not work, because it affects only a single connection. This is why we need the event listener: it runs the ATTACH statement on every connection the engine makes.

Riqq

I'm just a beginner myself, and I haven't used Pylons, but...

I notice that you are combining the table and the associated class together. How about if you separate them?

import sqlalchemy as sa
meta = sa.MetaData('sqlite:///tutorial.sqlite')
schema = None
hockey_table = sa.Table('hockey', meta,
                      sa.Column('score_id', sa.types.Integer, sa.Sequence('score_id_seq', optional=True), primary_key=True),
                      sa.Column('baseball_id', sa.types.Integer, sa.ForeignKey('summer.baseball.baseball_id')),
                      schema = schema,
                    )

meta.create_all()

Then you could create a separate

class Hockey(object):
    ...

and

from sqlalchemy.orm import mapper
mapper(Hockey, hockey_table)

Then just set `schema = None` above everywhere if you are using SQLite, and set it to the value(s) you want otherwise.
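
For example, one way to pick the schema from the dialect in use (just a sketch; the URL is a placeholder and 'winter' stands for whatever schema you use on PostgreSQL):

import sqlalchemy as sa

db_url = "sqlite:///tutorial.sqlite"  # or a postgresql:// URL in production
engine = sa.create_engine(db_url)

# Use a real schema name only on backends that actually support schemas.
schema = None if engine.dialect.name == "sqlite" else "winter"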

You don't have a working example, so my example above isn't a working one either. However, as other people have pointed out, trying to maintain portability across databases is in the end a losing game. I'd add a +1 to the people suggesting you just use PostgreSQL everywhere.

HTH, Regards.

Faheem Mitha
  • What about those people who have a corporate policy of using MSSQL (a Windows product) while app development happens on the Linux side? While it would be really cool to use PostgreSQL given the amount of data in the current database, migration is far in the future. – Thornhale Aug 18 '17 at 23:18