
I have a Django 1.4 project using MySQL as the backend. I have the tests set up to run in memory:

if 'test' in sys.argv:
  DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}

The issue is that I need MySQL-specific functionality (full-text indexes).

Is there a way to have django run MySQL in memory for testing?

My project relies on full-text indexes. During development I run syncdb and then execute a .sql file containing the SQL to create the full-text indexes.

I would like to use Django ORM full-text searching in the functions I test. I am trying to manually add the full-text index in each test's initialization like:

cursor.execute('alter table mytable add fulltext(one, two)')

This does not work when using SQLite (I believe because SQLite does not support MySQL's FULLTEXT indexing).

The above SQL DOES work when I remove the in-memory test settings. I like the speed of in-memory tests. Is there a way I can run MySQL in memory?
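For illustration, the manual ALTER above could be guarded so it only runs when the active backend is actually MySQL. `connection.vendor` is a real Django attribute; the helper below and its table/column names are a hypothetical sketch:

```python
# Hypothetical helper: issue the ALTER only on MySQL, where FULLTEXT exists.
# Django's connection object exposes the backend name as connection.vendor
# ('mysql', 'sqlite', 'postgresql', ...).

def add_fulltext_index(connection, table, columns):
    """Add a MySQL FULLTEXT index; return False on backends without support."""
    if connection.vendor != 'mysql':
        return False
    cursor = connection.cursor()
    cursor.execute('ALTER TABLE %s ADD FULLTEXT(%s)' % (table, ', '.join(columns)))
    return True
```

In a test's setUp you would call it as `add_fulltext_index(django.db.connection, 'mytable', ['one', 'two'])`; on SQLite it becomes a no-op instead of an error.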

How do people test apps that rely on database-specific features, like full-text indexing or GIS? Do they have to run the tests normally against the filesystem?

Thank you

dm03514
    does mysql support tablespaces? On linux you could set the dbs table space to be placed on a RAM disk. http://www.cyberciti.biz/faq/howto-create-linux-ram-disk-filesystem/ – monkut Mar 05 '13 at 22:10

3 Answers


MySQL has a MEMORY storage engine. You can activate it using the OPTIONS key:

if 'test' in sys.argv:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'HOST': 'localhost',
            'NAME': 'foo',
            'USER': 'bar',
            'PASSWORD': 'baz',
            'OPTIONS': {
                'init_command': 'SET storage_engine=MEMORY'
            }
        }
    }

BUT according to the documentation:

MEMORY tables cannot contain BLOB or TEXT columns.

Therefore I suppose it's pretty useless for your (and many other) use cases.

I found a few other tips for speeding up MySQL tests in this thread. Make sure to read Daniel Roseman's answer about using the INNODB engine.

Dirk Eschler

As monkut suggested in the question comments, you may be able to get some MySQL speedup by storing the test database on a RAM drive, e.g. a tmpfs filesystem if running on Linux. However, if you're concerned about speed during tests, this may not provide much improvement, since MySQL will typically cache its data in memory anyway.

The approach I've used is to write a test decorator which skips any test that is not supported by a given database backend, then use different settings.py files for testing against different backends.

Here, I've used django-nose functionality to help write a decorator that skips a test if any test database uses SQLite.

import functools

from nose import SkipTest
from django.conf import settings

def skip_if_sqlite(test_fn):
    """Nose test decorator to skip test if testing using sqlite databases.

    This may be useful when we know that a test will fail when using sqlite,
    possibly because the test uses functionality not supported on sqlite such
    as database views or full text indexes.

    This decorator can be used as follows:
    @skip_if_sqlite
    def my_test_that_shouldnt_run_if_sqlite_database(self):
        pass
    """

    @functools.wraps(test_fn)
    def wrapper(*args, **kwargs):
        # Skip the test if any configured database uses a sqlite engine.
        for alias, db_settings in settings.DATABASES.items():
            if 'sqlite' in db_settings['ENGINE']:
                raise SkipTest()
        return test_fn(*args, **kwargs)

    return wrapper

Then, set up settings.test.mysql and settings.test.sqlite with different DATABASES settings. If you execute your tests with settings.test.sqlite, the tests which rely on MySQL-specific functionality will be skipped, but if you run tests against settings.test.mysql, they will execute.

This approach allows you to target tests at a particular database backend, while still providing the flexibility to run the majority of your tests against the faster SQLite database during development.
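To illustrate the skip pattern end to end without nose or Django installed, here is a self-contained variant using only the standard library's `unittest.SkipTest`, with a plain dict standing in for `settings.DATABASES` (the dict contents are hypothetical):

```python
import functools
import unittest

# Stand-in for django.conf.settings.DATABASES (hypothetical values).
DATABASES = {'default': {'ENGINE': 'django.db.backends.sqlite3'}}

def skip_if_sqlite(test_fn):
    """Skip the wrapped test when any configured database uses SQLite."""
    @functools.wraps(test_fn)
    def wrapper(*args, **kwargs):
        for db_settings in DATABASES.values():
            if 'sqlite' in db_settings['ENGINE']:
                raise unittest.SkipTest('requires a non-SQLite backend')
        return test_fn(*args, **kwargs)
    return wrapper

class SearchTests(unittest.TestCase):
    @skip_if_sqlite
    def test_fulltext_match(self):
        # Would issue a MATCH ... AGAINST query on a real MySQL backend.
        self.fail('should have been skipped under SQLite settings')
```

Running this suite under the SQLite settings reports the test as skipped rather than failed; swap in a MySQL engine string and the test body executes.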

Austin Phillips

Don't test MySQL. MySQL has been tested by its developers. Your job is to code in such a way that you can substitute the MySQL connection with a mock Python object which returns whatever MySQL would return (or asserts that the query being sent to MySQL is what you want it to be).
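A minimal sketch of that approach using the standard library's `unittest.mock` (in the Python 2 era this was the standalone `mock` package); the function under test and its query are hypothetical:

```python
from unittest import mock

def search_titles(cursor, term):
    # Hypothetical function under test: issues a MySQL full-text query
    # through whatever cursor it is handed and returns the rows.
    cursor.execute(
        "SELECT title FROM mytable WHERE MATCH(one, two) AGAINST (%s)",
        [term],
    )
    return cursor.fetchall()

# Substitute the MySQL cursor with a mock that returns canned rows.
cursor = mock.Mock()
cursor.fetchall.return_value = [('first match',), ('second match',)]

rows = search_titles(cursor, 'django')

# Assert the query sent to "MySQL" is what we expect -- no database needed.
cursor.execute.assert_called_once_with(
    "SELECT title FROM mytable WHERE MATCH(one, two) AGAINST (%s)",
    ['django'],
)
assert rows == [('first match',), ('second match',)]
```

The trade-off, as the comments below note, is that the mock only encodes what you *believe* MySQL returns; integration tests against the real backend are still needed for behavior you haven't verified.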

Mchl
  • The only issue here is that there may be a number of ways to access the DB - and to mock them all would be time expensive, versus mocking too few would limit the implementation. Tests should ideally be about intent and not implementation. – Danny Staple Dec 03 '13 at 10:20
    This may be generally a good advice... when writing "hello, world"-style programs. Anything non-trivial that talks to a database depends on that database's behavior, so testing the whole thing is a must — otherwise you'll soon find out that the db in question isn't doing exactly the same thing as your mock. – ayanami Feb 21 '15 at 20:56
    @ayanami: same thing goes for testing on different storage engine than the one used in production. MySQL's MEMORY engine is in some cases behaving differently from both MyISAM and InnoDB. Once we're talking about integration tests (which it seems we are in this context) they should be run on the target configuration. This implies these tests will sometimes be slow. – Mchl Oct 02 '15 at 10:37