
When selecting rows from my SQLite database with Flask / SQLAlchemy using a query like this:

Restaurants.index.in_(filtered_res_ids))).fetchall())

I get the results that I want when testing within PyCharm. However, when I deploy it to AWS Elastic Beanstalk, the query fails.

Here's some of my error log:

[Mon Dec 23 18:23:28.956847 2019] [:error] [pid 3451] WHERE restaurants_image_price_english."index" IN (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ? ... )


[2019-12-23 18:23:28,958] ERROR in app: Exception on /restaurantDiscover [POST]
Traceback (most recent call last):
  File "/opt/python/run/venv/local/lib64/python3.6/site-packages/sqlalchemy/engine/base.py", line 1244, in _execute_context
    cursor, statement, parameters, context
  File "/opt/python/run/venv/local/lib64/python3.6/site-packages/sqlalchemy/engine/default.py", line 552, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: too many SQL variables

Should I use a different implementation?

SumakuTension

1 Answer


According to the docs, SQLite allows at most 999 variables in a single query:

SQLite allocates space to hold all host parameters between 1 and the largest host parameter number used. Hence, an SQL statement that contains a host parameter like ?1000000000 would require gigabytes of storage. This could easily overwhelm the resources of the host machine. To prevent excessive memory allocations, the maximum value of a host parameter number is SQLITE_MAX_VARIABLE_NUMBER, which defaults to 999.

This value can't be increased without recompiling SQLite.

filtered_res_ids probably contains more elements than that.

You will have to split it into chunks of at most 999 elements and run multiple queries. Hard-coded example:

# first 999 elements
Restaurants.index.in_(filtered_res_ids[:999])

# remaining elements
Restaurants.index.in_(filtered_res_ids[999:])

Of course, if filtered_res_ids contains more than 1998 elements, then you will need to dynamically split it into chunks of at most 999 elements, as explained in answers to this question; a rough sketch follows.
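For illustration, here is a minimal chunking sketch. It assumes a Flask-SQLAlchemy db.session and the Restaurants model from the question; the exact query wiring is an assumption, since the full statement isn't shown in the question.

# Split the id list into chunks of at most 999 elements and run
# one query per chunk, accumulating the rows from each round trip.
CHUNK_SIZE = 999  # SQLite's default SQLITE_MAX_VARIABLE_NUMBER

rows = []
for start in range(0, len(filtered_res_ids), CHUNK_SIZE):
    chunk = filtered_res_ids[start:start + CHUNK_SIZE]
    rows.extend(
        db.session.query(Restaurants)
        .filter(Restaurants.index.in_(chunk))
        .all()
    )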

EDIT

Since you mentioned that the same code works locally, it is very possible that the remote server you deployed your application to runs a version of SQLite recompiled with a lower value for SQLITE_MAX_VARIABLE_NUMBER (or one that lowers it at runtime, which the docs say is possible).

You might have to dig into the AWS docs to find the max value they allow, or alternatively experiment with your code until you find a chunk size that works.

DeepSpace
  • Good to know, thanks! Still peculiar how this error only happens when the code is deployed and not in debug mode on my computer – SumakuTension Jan 02 '20 at 09:09
  • @SumakuTension I added an explanation why that might happen – DeepSpace Jan 02 '20 at 09:10
  • One option that might work in SQLA 1.4 would be to use an expanding bindparam with literal_execute: https://stackoverflow.com/questions/57829682/new-override-sqlalchemy-operator-compiler-output – Ilja Everilä Jan 02 '20 at 12:51
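
Following up on that last comment, here is a minimal sketch of the expanding-bindparam approach, assuming SQLAlchemy 1.4 or newer; the statement and session wiring are illustrative assumptions, not code from the question:

from sqlalchemy import bindparam, select

# literal_execute=True renders the IN list inline into the SQL text at
# execution time, so the ids do not consume bound-parameter slots.
stmt = select(Restaurants).where(
    Restaurants.index.in_(
        bindparam("ids", expanding=True, literal_execute=True)
    )
)
rows = db.session.execute(stmt, {"ids": filtered_res_ids}).fetchall()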