
When I run large queries (queries returning many rows), I get a "Lost connection to MySQL server during query" error, and I cannot see what I am doing wrong. I use the "new" mysql driver from mysql.com (not the "old" MySQLdb) and the MySQL version bundled with MAMP, on Python 2.7. The table is not corrupted; analyze table nrk2013b_tbl; returns status OK. Here's an example that breaks:

#!/usr/bin/python2.7
# coding: utf-8

import sys
import mysql.connector # version 2.0.1

connection = mysql.connector.connect(
    unix_socket="/Applications/MAMP/tmp/mysql/mysql.sock",
    user="dbUsernam",
    passwd="dbUserPassword",
    db="nrk",
    charset="utf8",
    use_unicode=True)
cur = connection.cursor()
cur.execute("USE nrk;")


sql = """SELECT id FROM nrk2013b_tbl WHERE main_news_category = 'Sport'"""
cur.execute(sql)
rows = cur.fetchall()

print rows

sys.exit(0)

This results in the error I get most of the time:

Traceback (most recent call last):
  File "train_trainer_test.py", line 20, in <module>
    remaining_rows = cur.fetchall()
  File "/Library/Python/2.7/site-packages/mysql/connector/cursor.py", line 823, in fetchall
    (rows, eof) = self._connection.get_rows()
  File "/Library/Python/2.7/site-packages/mysql/connector/connection.py", line 669, in get_rows
    rows = self._protocol.read_text_result(self._socket, count)
  File "/Library/Python/2.7/site-packages/mysql/connector/protocol.py", line 309, in read_text_result
    packet = sock.recv()
  File "/Library/Python/2.7/site-packages/mysql/connector/network.py", line 226, in recv_plain
    raise errors.InterfaceError(errno=2013)
mysql.connector.errors.InterfaceError: 2013: Lost connection to MySQL server during query

Line 20 is the rows = cur.fetchall()

If I limit the query to return fewer results, e.g. SELECT id FROM nrk2013b_tbl WHERE main_news_category = 'Sport' LIMIT 10, all is well. But I do want to work with larger result sets. For some ad-hoc problem solving I have used LIMIT to break the data I wanted into smaller batches, but this keeps popping up as a problem.
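As a side note on the batching workaround: rows can be pulled in fixed-size chunks with the standard DB-API fetchmany() call instead of one big fetchall(), which both mysql.connector and PyMySQL support. A minimal sketch (sqlite3 is used here only as a stand-in driver so the snippet is self-contained; the helper works unchanged with a MySQL cursor):

```python
import sqlite3

def iter_rows(cursor, batch_size=1000):
    # Pull rows in fixed-size chunks instead of one huge fetchall();
    # each fetchmany() call reads only batch_size rows at a time.
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        for row in batch:
            yield row

# Demo with sqlite3 standing in for any DB-API 2.0 driver
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER)")
cur.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10)])
cur.execute("SELECT id FROM t ORDER BY id")
ids = [row[0] for row in iter_rows(cur, batch_size=3)]
print(ids)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```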

To take connect-timeout, max_allowed_packet, etc. into account, I have this my.cnf file at /Applications/MAMP/conf/my.cnf:

[mysqld]
max_allowed_packet = 64M
wait_timeout = 28800
interactive_timeout = 28800
connect-timeout=31536000

This does not seem to make any difference (I'm not even sure MySQL recognises these settings). When I run queries from the terminal or from Sequel Pro, everything works fine. It is only through the Python mysql.connector that I get these errors.
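To check whether the bundled server actually reads that file, one option (a sketch; the binary paths assume a default MAMP install and may differ on your machine) is to ask mysqld which option files it considers and then compare against the live values:

```shell
# List the option files mysqld reads, in order
/Applications/MAMP/Library/bin/mysqld --verbose --help | grep -A 1 "Default options"

# Compare against what the running server actually uses
/Applications/MAMP/Library/bin/mysql -u dbUsernam -p \
  -e "SHOW VARIABLES LIKE '%timeout%'; SHOW VARIABLES LIKE 'max_allowed_packet';"
```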

Any ideas?

PS: I've temporarily given up on this and switched to PyMySQL instead of the Oracle mysql.connector. With that change the problem seems to disappear (so I conclude for myself that the problem is in the Oracle MySQL connector).

import pymysql
conn = pymysql.connect(
    unix_socket="/Applications/MAMP/tmp/mysql/mysql.sock",
    user="dbUsernam",
    passwd="dbUserPassword",
    db="nrk",
    charset="utf8",
    use_unicode=True)
conn.autocommit(True)
cur = conn.cursor()
Eiriks
  • FYI: http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_connect_timeout – Shaeldon Oct 22 '14 at 14:39
  • 1
    That looks perfect, I just cannot seem to get it to work. When I run the queries from the terminal or from Sequel Pro, it works fine. It's only when I do it through the python mysql.connector it breaks. – Eiriks Oct 23 '14 at 11:09
  • Same issue working with python 3.6.5 on macOS. Changing from mysql.connector to pymysql did the trick. Thanks. – cucu8 Dec 27 '18 at 11:33

5 Answers


I also had to switch to PyMySQL. I am running pip 1.5.6 and Python 2.7.8, and tried mysql-connector-python 2.0.1.

I was able to run the query from within Sequel Pro with no problems, but my Python query would fail with the error described in the question after returning just a subset of results.

Switched to PyMySQL and things work as expected.

https://github.com/PyMySQL/PyMySQL

In the virtualenv:

pip install pymysql

In the code:

import pymysql

connection = pymysql.connect(user='x', passwd='x',
                             host='x',
                             database='x')

cursor = connection.cursor()

query = ("MYQUERY")

cursor.execute(query)

for item in cursor:
    print item

Definitely a bug in mysql-connector-python.

sheldonkreger

Try increasing your net_read_timeout (the default value of 30 seconds is probably too small for your scenario).

Ref:

net_read_timeout

and in general:

B.5.2.3 Lost connection to MySQL server
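If a larger net_read_timeout helps, it can be made permanent in the same my.cnf shown in the question (assuming MAMP's server actually reads that file); net_write_timeout is its counterpart for sending result rows to the client:

```ini
[mysqld]
net_read_timeout = 600
net_write_timeout = 600
```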

Cristian Porta
  • Thanks for the suggestion. Does not seem to make any difference though. I still think this has to do with the connector, not mysql itself or its settings. The queries work fine from terminal & Sequel Pro. It's just through the oracle connector this does not work. – Eiriks Oct 24 '14 at 11:50

I encountered similar problems too. In my case it was solved by creating the cursor with buffered=True, which makes the cursor fetch the entire result set immediately after execute instead of streaming rows on demand:

cur = connection.cursor(buffered=True)

Looks like a bug in MySQL Connector/Python: http://bugs.mysql.com/bug.php?id=74483

Should be fixed in 2.0.3, which is not yet released.

Che

Expanding on Cristian's answer: the timeout for read queries (SELECT) is set by net_write_timeout, because sending result rows is a "write" from the perspective of the server.

codebard
  • Actually not true `The number of seconds to wait for more data from a connection before aborting the read. When the server is reading from the client, net_read_timeout is the timeout value controlling when to abort. When the server is writing to the client, net_write_timeout is the timeout value controlling when to abort.` - from https://dev.mysql.com/doc/refman/5.5/en/server-system-variables.html#sysvar_net_read_timeout – papadp Feb 19 '19 at 16:28