
I am relatively new to SQLAlchemy and have done basic database creation, inserts, updates, and deletes. I have found it quite simple to use so far. My question is:

I want to move records from one database to another backup database. What is the simplest way to do this in SQLAlchemy?

Ross Warwick
  • Maybe I need to explain more. I am running Python scripts that create databases and insert and update records using data given to them. What I want to do is move completed records (those filled with the required data) to another database, to keep the current database small. This will be done by the process, not by someone doing it manually. – Ross Warwick Aug 28 '11 at 15:25
  • **Q-1:** How many tables does your database consist of? **Q-2:** You have some *open* and some *completed* data, and you would like to `move` the *completed* data to another *archive* database. Is this right? – van Aug 29 '11 at 12:23
  • I have a database ("data.db") with one table in which records are created. The records are filled with results over time. I don't have much knowledge of SQL, but my thinking was that I don't want the database to get too big and unwieldy, so once the records are complete they would move to another database ("Another Database.db"). I thought a database with large amounts of records could become slow. If moving them to another table in the same database would be just as good, let me know. Any advice is welcome. – Ross Warwick Sep 01 '11 at 17:22
  • I wrote a function here to copy a single table: http://stackoverflow.com/a/21677971/1675586 (a similar sketch follows below). – iman Feb 10 '14 at 13:07
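
Following up on the comments: here is a minimal sketch of the "move completed rows" idea using the SQLAlchemy 1.4+ Core API against two SQLite files. The `records` table and its boolean `completed` column are hypothetical stand-ins for whatever schema the script actually creates, and the copy-then-delete step is not atomic across the two databases, so treat this as an illustration rather than a drop-in solution:

```python
# Sketch only: "records" and its boolean "completed" column are
# hypothetical names standing in for the script's real schema.
from sqlalchemy import MetaData, Table, create_engine, select

live = create_engine("sqlite:///data.db")
archive = create_engine("sqlite:///archive.db")

metadata = MetaData()
records = Table("records", metadata, autoload_with=live)  # reflect the existing table
metadata.create_all(archive)                              # mirror the schema in the archive

# Copy completed rows across, then delete them from the live database.
# Note: this is not atomic across the two databases.
with live.begin() as src, archive.begin() as dst:
    done = src.execute(
        select(records).where(records.c.completed == True)
    ).mappings().all()
    if done:
        dst.execute(records.insert(), [dict(row) for row in done])
        src.execute(records.delete().where(records.c.completed == True))
```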

2 Answers


You would just go directly to the database's own utilities and back it up there. Nothing to do with SQLAlchemy.
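
For SQLite specifically (the asker's "data.db"), the standard-library `sqlite3` module exposes SQLite's online backup API from Python 3.7 onward, so a script can do this without shelling out. A minimal sketch; the "backup.db" file name is just an example:

```python
import sqlite3

src = sqlite3.connect("data.db")
dst = sqlite3.connect("backup.db")  # example name for the backup file
with dst:
    src.backup(dst)  # SQLite's online backup API; safe while src is in use
src.close()
dst.close()
```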

Nickle

> I don't want the database to get too big and unwieldy

Database scaling is a big topic, but it still fits in the realm of optimization, which can be summed up with three simple rules:

  1. Don't
  2. Don't
  3. (Experts Only) Profile First!

What this means for your question is that you probably shouldn't optimize for the size of your data until you have a good idea of:

  • How much data do you really have?
  • What are the queries you execute regularly on that data, which queries are slow?
  • What can your database do natively to help?

What might seem at first blush to be a lot of data is often nothing to worry about. A good rule of thumb: if your dataset fits in memory, you don't have a big dataset.

Even if you do have a big dataset, it's often the case that only a small part of it (the non-"completed" rows, in your case) is relevant to your queries. You can make that work well just by creating the right combination of indexes, so that your database can easily find and operate on the rows you actually query.
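
For example, a sketch assuming a hypothetical `Record` model with a boolean `completed` flag (declarative API from SQLAlchemy 1.4+): you can declare the index right alongside the column you filter on.

```python
from sqlalchemy import Boolean, Column, Integer, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Record(Base):
    __tablename__ = "records"  # hypothetical table name
    id = Column(Integer, primary_key=True)
    completed = Column(Boolean, default=False, index=True)  # index the filter column

engine = create_engine("sqlite:///data.db")
Base.metadata.create_all(engine)
# A query like session.query(Record).filter_by(completed=False) can now
# find the "open" rows via the index instead of scanning the whole table.
```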

And it might be that you are using a database for the wrong thing. What you are describing (some data comes in, hangs around until it gets processed, and then gets archived) sounds suspiciously like a queue. Persistent and distributed queues are widely available (have a look at Celery for a Python framework built on queuing) and may be a better fit for the problem you are trying to solve.
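
A minimal sketch of what that looks like with Celery; the broker URL and the task body here are placeholders, not a recommendation:

```python
from celery import Celery

# Placeholder broker URL; Celery also supports RabbitMQ and others.
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def process_record(record_id):
    """Fetch the record, do the work, then archive or discard it."""
    ...

# A producer enqueues work instead of accumulating "pending" rows:
#     process_record.delay(42)
```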

SingleNegationElimination
  • Thanks for the detailed answer. I think my best option is to "do the reading". I worked with SQL because I have a small amount of experience with it. I will check out Celery because, to be honest, I have no idea what it is. Your reply has given me plenty to go on. – Ross Warwick Sep 04 '11 at 17:21