
I am currently working on a project in which I use the MariaDB connector to run queries.

I can't use an ORM, so I have to write raw queries.

In general the system works fine, as expected, but when I run somewhat 'big' queries I get a Too many connections error.

This has happened to me with both the MySQL and MariaDB connectors, but I mainly use MariaDB.

Example of my code (truncated / simplified):

import mariadb

def get_cursor():
    conn = mariadb.connect(
        user="user",
        password="pass",
        host="localhost",
        database="db")
    return conn, conn.cursor(named_tuple=True)

def get_duplicated_variants():
    results = []
    conn, cursor = get_cursor()
    try:
        cursor.execute("SELECT * FROM `db`.s_data;")
        columns = [column[0] for column in cursor.description]
        for row in cursor.fetchall():
            results.append(dict(zip(columns, row)))
        cursor.close()
        conn.close()
        return results
    except mariadb.Error as e:
        print(f"Error: {e}")

What I've tried:

show status like '%onn%';

And also: show variables like 'max_connections';

So Max_used_connections is 152 and Connections is 2503.

I also tried to execute the following query:

SELECT CONCAT('KILL ', id, ';')
FROM INFORMATION_SCHEMA.PROCESSLIST
WHERE `User` = 'user'
  AND `Host` = 'localhost'
  AND `db` = 'db';

As seen in this question.

But the number of connections is the same after running that query; it does not work.

How could I close the connections properly?

I don't understand why the connections remain active, since I call both cursor.close() to close the cursor and conn.close() to close the connection, yet the connections apparently stay open.

I know I can increase max_connections with something like: set global max_connections = 500; but I would like to close the connections from the backend after the queries are done.

Any idea?

Cheknov
  • @nbk `del` does not trigger garbage collection. See https://docs.python.org/3/reference/datamodel.html#object.__del__ . Calling `del` decreases the reference count of an object by 1, and potentially makes an object inaccessible by the variable name `del` was called on. This is largely separate from Python's garbage collector, which under CPython (which I assume Cheknov is using) is primarily used to clean up reference cycles. – Will Da Silva Jul 04 '21 at 18:24
  • wouldn't you want a `.close()` in a `finally:` clause? – danblack Jul 05 '21 at 23:32
  • As long as it kills the connections, sure. Could you provide a usage example? – Cheknov Jul 06 '21 at 00:37
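
One way to apply danblack's suggestion, as a rough sketch built on the get_cursor() helper from the question, is to move the cleanup into a finally: block so the cursor and the connection are closed even when the query raises:

def get_duplicated_variants():
    conn, cursor = get_cursor()
    try:
        cursor.execute("SELECT * FROM `db`.s_data;")
        columns = [column[0] for column in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]
    except mariadb.Error as e:
        print(f"Error: {e}")
        return []
    finally:
        # Runs whether the query succeeded or raised,
        # so the cursor and the connection are always released.
        cursor.close()
        conn.close()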

1 Answer


The API documentation for connection close() certainly is clear that the connection is closed.

I realize you said you truncated the code, but your comment about 2,503 connections in a single program certainly makes it seem like you are not sharing that connection and are instead creating a new one for each query. I would suggest inspecting the code you did not include to make sure you are storing and reusing the connection, which is expensive to keep recreating.
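
As a rough illustration of that reuse (the names _conn, get_connection, and run_query are made up for this sketch; they are not part of your code or of the mariadb API), you could create the connection once and only open and close cursors per query:

import mariadb

_conn = None  # single shared connection for the whole program (sketch only)

def get_connection():
    """Create the connection once and reuse it for every query."""
    global _conn
    if _conn is None:
        _conn = mariadb.connect(
            user="user",
            password="pass",
            host="localhost",
            database="db")
    return _conn

def run_query(sql):
    conn = get_connection()
    cursor = conn.cursor(named_tuple=True)
    try:
        cursor.execute(sql)
        return cursor.fetchall()
    finally:
        cursor.close()  # cursors are cheap to recreate; connections are not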

Finally, I would instead look at the connection state with something like netstat to see which connections are actually open and where they are coming from and going to. It is not entirely clear to me that you are excluding connections from other clients of the DB, or that the connection is not actually being destroyed; in short, I am somewhat unsure whether you are chasing a red herring here. I still think the >2000 connections figure is unexpected, and based on the code you provided you should first chase down why so many connections are being created in the first place.
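
If you prefer to check from the database side rather than with netstat, a small sketch like the following (reusing the hypothetical get_connection() from the previous snippet) groups the server's process list by user and host, so you can see who actually owns the open connections:

def count_connections():
    conn = get_connection()
    cursor = conn.cursor()
    try:
        cursor.execute(
            "SELECT user, host, COUNT(*) AS n "
            "FROM INFORMATION_SCHEMA.PROCESSLIST "
            "GROUP BY user, host;")
        for user, host, n in cursor.fetchall():
            print(f"{user}@{host}: {n} connection(s)")
    finally:
        cursor.close()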

Foghorn