I have a Django application running under lighttpd via FastCGI. The FCGI launch script looks like this:

python manage.py runfcgi socket=<path>/main.socket \
    method=prefork \
    pidfile=<path>/server.pid \
    minspare=5 maxspare=10 maxchildren=10 maxrequests=500

I use SQLite, so I have 10 processes all working with the same DB. Next, I have two views:

def view1(request):
    ...
    # get_or_create returns an (object, created) tuple
    obj, created = MyModel.objects.get_or_create(id=1)
    obj.param1 = <some value>
    obj.save()

def view2(request):
    ...
    obj, created = MyModel.objects.get_or_create(id=1)
    obj.param2 = <some value>
    obj.save()

And if these views are executed in two different processes, I sometimes end up with a MyModel instance in the DB with id=1 where either param1 or param2 has been updated, but not both; which one survives depends on the order in which the processes ran. (Of course, in real life the id changes, but sometimes two processes do execute these two views with the same id.)

The question is: what should I do to end up with an instance where both param1 and param2 are updated? I need some way of merging the changes made in different processes.

One option is to create an interprocess lock object, but then the views would be executed sequentially and could never run simultaneously, so I'm asking for help.
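
To make "merging" concrete: what I want is for each process's write to touch only its own column. Roughly something like the following sketch using QuerySet.update() would do (field names as above; some_value is just a placeholder):

    def view1(request):
        ...
        # Make sure the row exists, then issue a single UPDATE that only
        # touches param1, so a concurrent write to param2 is never overwritten.
        MyModel.objects.get_or_create(id=1)
        MyModel.objects.filter(id=1).update(param1=some_value)

    def view2(request):
        ...
        MyModel.objects.get_or_create(id=1)
        MyModel.objects.filter(id=1).update(param2=some_value)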

DUPE OF [Django: How can I protect against concurrent modification of data base entries](http://stackoverflow.com/questions/320096/django-how-can-i-protect-against-concurrent-modification-of-data-base-entries)

  • SQLite is typically considered unsuitable as a backend outside of testing contexts; migrating to PostgreSQL or MySQL is going to be the path of least resistance. – Hank Gay May 26 '10 at 18:21
  • possible duplicate of [Django: How can I protect against concurrent modification of data base entries](http://stackoverflow.com/questions/320096/django-how-can-i-protect-against-concurrent-modification-of-data-base-entries) – Hank Gay May 26 '10 at 18:23
  • Hm... maybe this discussion can be useful. I can add a timestamp field to my model and update it before saving. A successful update means that no other thread/process has changed my object, so I can call the save method. But if the update fails, it means another thread or process has changed it, and I should re-fetch it from the DB and try to merge the changes. Am I right? P.S.: It's very strange that Django doesn't have anything for solving such problems... Have all developers really invented their own solutions? It's very strange – iKiR May 27 '10 at 05:41
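
A minimal sketch of the compare-and-swap idea from that last comment, assuming a hypothetical integer version column on MyModel (not in the original model) and a made-up helper name; it relies on update() returning the number of matched rows:

    def save_param1(obj, value, max_retries=3):
        # Optimistic locking: the UPDATE only matches if the row still carries
        # the version we read, so a concurrent writer is never silently overwritten.
        for _ in range(max_retries):
            updated = MyModel.objects.filter(pk=obj.pk, version=obj.version).update(
                param1=value, version=obj.version + 1)
            if updated:
                return True
            # Another process changed the row first: re-read it and retry.
            # Only param1 is written here, so the other process's fields survive.
            obj = MyModel.objects.get(pk=obj.pk)
        return False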

1 Answer


SQLite is not a good choice if you need this kind of concurrent access to the database. I suggest switching to another RDBMS, such as MySQL or PostgreSQL, and also taking into account the fragility of get_or_create:

How do I deal with this race condition in django?

Regarding the above link, there is also a second solution to that problem: using the READ COMMITTED isolation level instead of REPEATABLE READ. But it's less well tested (at least in MySQL), so there might be more bugs/problems with it.
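
As background on that fragility, the usual workaround discussed in the linked question is to treat the IntegrityError raised by the losing INSERT as "the row already exists" and simply fetch it. A rough sketch (the helper name is made up, and depending on the transaction handling in use the failed INSERT may first require ending the current transaction, as the comment below describes):

    from django.db import IntegrityError

    def get_or_create_retry(model, **kwargs):
        # If two processes race on the same key, the loser's INSERT fails with
        # IntegrityError, but by then the row is guaranteed to exist.
        try:
            return model.objects.get_or_create(**kwargs)
        except IntegrityError:
            return model.objects.get(**kwargs), False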

  • Thank you for your reply. Hm... and how is this problem solved in MySQL or PostgreSQL? About get_or_create - I've read that question and made my own implementation of get_or_create (by subclassing Manager); in it I call the super implementation of the method, and if it raises IntegrityError I catch it, commit the transaction and call get_or_create again. Could you tell me how to use the READ COMMITTED isolation level in SQLite? – iKiR May 27 '10 at 03:40
  • I don't think SQLite implements isolation levels in the traditional sense: http://www.sqlite.org/sharedcache.html, but I could be wrong. Still, if you want to use SQLite in concurrent environments, you'd better prepare for unavoidable OperationalErrors. – Tomasz Zieliński May 27 '10 at 09:08
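
If SQLite stays in place despite the advice above, the practical consequence of that warning is wrapping writes in a retry, since a write fails while another process holds SQLite's write lock. A sketch under the assumption of an older Django that surfaces the raw sqlite3.OperationalError (newer releases wrap it as django.db.OperationalError); the helper name and the attempts/delay numbers are arbitrary:

    import time
    import sqlite3

    def retry_on_locked(operation, attempts=5, delay=0.1):
        # Retry a callable that fails with "database is locked" while another
        # process holds SQLite's write lock.
        for attempt in range(attempts):
            try:
                return operation()
            except sqlite3.OperationalError:
                if attempt == attempts - 1:
                    raise
                time.sleep(delay)

    # e.g. retry_on_locked(lambda: MyModel.objects.filter(id=1).update(param1=some_value))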