
For some reason, the INSERT statement is painfully slow.

By painfully slow, I mean I go from 400 updates a second to about 5.

This is the method in question:

@app.route('/sendData/<path:data>', methods=['POST'])
def receiveData(data): #Data = '0,2C7,2C6,1C8,2C2,1' always.
    def h(point):
        cur.execute("INSERT INTO points (x, y) VALUES({},{});".format(*point.split(",")))
        g.db.commit()

    cur = g.db.cursor()
    #list(map(h, data.split("C")))  # uncommenting this line triggers the slowdown
    return "Done"

When I uncomment that map, it becomes painfully slow. I also tried this other method; same result.

def receiveData(data): #Data = '0,2C7,2C6,1C8,2C2,1' always.
    cur = g.db.cursor()
    cur.execute("INSERT INTO points (x, y) VALUES({}, {});".format(*data.split("C",1)[0].split(",")))
    g.db.commit()
    if data.count("C") == 0:
        return "Done"
    else:
        return receiveData(data.split("C",1)[1])

I don't think I need to commit after every single insert, but moving the commit to the end of the function makes no difference.
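For reference, committing once for the whole batch looks like this. This is a minimal sketch using the standard sqlite3 module, with an in-memory database and parameterized executemany standing in for the app's g.db; the table name and the '0,2C7,2C6,1C8,2C2,1' format are taken from the question:

```python
import sqlite3

def insert_points(conn, data):
    # Parse '0,2C7,2C6,1C8,2C2,1' into (x, y) integer tuples.
    points = [tuple(map(int, p.split(","))) for p in data.split("C")]
    # One parameterized statement executed for all rows, then a single commit.
    conn.executemany("INSERT INTO points (x, y) VALUES (?, ?);", points)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (x INTEGER, y INTEGER);")
insert_points(conn, "0,2C7,2C6,1C8,2C2,1")
```

The parameterized `?` placeholders also avoid the SQL-injection risk of building the statement with `str.format`.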

1 Answer

I would suggest you collect all the data you want to insert and then build multi-row inserts, with 100 or more points inserted at a time (you could test which batch size performs best).
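A sketch of that batching idea, assuming a plain sqlite3 connection: the hypothetical PointBuffer class accumulates points and flushes them in one statement and one commit whenever the batch size is reached.

```python
import sqlite3

class PointBuffer:
    def __init__(self, conn, batch_size=100):
        self.conn = conn
        self.batch_size = batch_size  # tune this; test which size performs best
        self.pending = []

    def add(self, x, y):
        self.pending.append((x, y))
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        # One multi-row executemany and one commit per batch.
        if self.pending:
            self.conn.executemany(
                "INSERT INTO points (x, y) VALUES (?, ?);", self.pending)
            self.conn.commit()
            self.pending.clear()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (x INTEGER, y INTEGER);")
buf = PointBuffer(conn, batch_size=3)
for i in range(7):
    buf.add(i, i * 2)
buf.flush()  # flush whatever remains below one batch
```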

Unfortunately, depending on your SQLite version, there is no typical INSERT ... VALUES (...), (...), ... as we know it from most other DBMSs (multi-row VALUES was only added in SQLite 3.7.11), but there is a workaround, adapted from this Question:

INSERT INTO points (x, y)
      SELECT 0 AS x, 2 AS y
UNION SELECT 7, 2
UNION SELECT 6, 1
UNION SELECT 8, 2
UNION SELECT 2, 1;
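If you need to target an older SQLite, that UNION form can be generated from the point list. A sketch, with a hypothetical union_insert_sql helper; note that plain string formatting is only acceptable here because the values are integers, and that UNION deduplicates, so repeated points would be silently dropped:

```python
import sqlite3

def union_insert_sql(points):
    # Build the UNION SELECT form of a multi-row insert for older SQLite.
    # Only safe with trusted numeric values; UNION also drops duplicate rows.
    first = "SELECT {} AS x, {} AS y".format(*points[0])
    rest = "".join(" UNION SELECT {}, {}".format(x, y) for x, y in points[1:])
    return "INSERT INTO points (x, y) " + first + rest + ";"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (x INTEGER, y INTEGER);")
pts = [(0, 2), (7, 2), (6, 1), (8, 2), (2, 1)]
conn.execute(union_insert_sql(pts))
conn.commit()
```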
  • The problem is, I want it as fast and as real-time as possible. I might just use a list of points, given enough RAM, and save it to an SQL database if I need to. Maybe I'm using SQL for the wrong application. Oh yeah, I'll do that. Thanks. – Rusty Shackleford Apr 09 '15 at 22:01
  • @RustyShackleford then I would switch to in-memory tables (:memory:). They are not tremendously faster, but as your table grows, memory access will give better results than disk tables - here someone [benchmarked this] before – hexerei software Apr 09 '15 at 22:11
  • And if you really just have points - or always the same model of data with no varchar fields, but actually fixed-width fields - then you are probably better off just reading the whole table into a Python array and traversing that; surely faster than having Python use SQLite for this. I figure you don't need queries, table joins and all that anyhow, so a database should not be your choice! – hexerei software Apr 09 '15 at 22:14
  • I do need queries, and I use them just as much as I write data to the table. However, the queries don't have any measurable speed penalty. But there wouldn't be a problem with writing to the table every once in a while, once the list gets too big. – Rusty Shackleford Apr 09 '15 at 22:22
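The in-memory approach from the comments can still be persisted periodically, which fits the "write to the table every once in a while" idea. A sketch using the standard sqlite3 module; the snapshot function name and the snapshot file path are made up for illustration, and Connection.backup requires Python 3.7+:

```python
import sqlite3

def snapshot(mem_conn, path):
    # Persist the in-memory database to a file on disk using
    # Connection.backup (Python 3.7+). The destination is overwritten.
    disk = sqlite3.connect(path)
    mem_conn.backup(disk)
    disk.close()

mem = sqlite3.connect(":memory:")  # fast in-memory database for the hot path
mem.execute("CREATE TABLE points (x INTEGER, y INTEGER);")
mem.executemany("INSERT INTO points (x, y) VALUES (?, ?);",
                [(0, 2), (7, 2), (6, 1)])
mem.commit()
snapshot(mem, "points_snapshot.db")  # call periodically, e.g. every N inserts
```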