I have the following code, and I'm trying to read a very big table (over 100M rows) on MariaDB. In theory, execute just sets up the cursor, and each row is fetched as I iterate over it, or at least that's what the docs say.
import pyodbc

cnxn = pyodbc.connect('DRIVER=/usr/lib/libmaodbc.so;socket=/var/run/mysqld/mysqld.sock;Database=101m;User=root;Password=123;Option=3;')
cursor = cnxn.cursor()
cursor.execute("select * from vat")
for row in cursor:
    print(row)
I tried the following versions of the code, but with no results.
import pyodbc

cnxn = pyodbc.connect('DRIVER=/usr/lib/libmaodbc.so;socket=/var/run/mysqld/mysqld.sock;Database=101m;User=root;Password=123;Option=3;')
with cnxn.cursor() as cursor:
    cursor.execute("select * from vat")
    for row in cursor:
        print(row)
import pyodbc

cnxn = pyodbc.connect('DRIVER=/usr/lib/libmaodbc.so;Server=127.0.0.1;Database=101m;User=root;Password=123;Option=3;')  # TCP instead of Unix socket
with cnxn.cursor() as cursor:
    cursor.execute("select * from 101m")  # another big table
    for row in cursor:
        print(row)
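Since the driver seems to buffer the whole result set at execute time, one workaround I have been considering is keyset pagination: fetch the table in bounded chunks so that no single statement materializes all 100M rows. A minimal sketch of the idea follows, shown against an in-memory SQLite table so it is self-contained; the `vat(id, amount)` schema is an assumption, and the same `WHERE id > ? ... LIMIT ?` SQL should run on MariaDB given an indexed integer key.

```python
import sqlite3

def iter_keyset(conn, table, key, batch=10000):
    """Yield all rows in chunks using keyset pagination on an indexed key column."""
    last = -1  # assumes the key values are non-negative integers
    while True:
        cur = conn.execute(
            f"SELECT * FROM {table} WHERE {key} > ? ORDER BY {key} LIMIT ?",
            (last, batch),
        )
        rows = cur.fetchall()
        if not rows:
            break
        yield from rows
        last = rows[-1][0]  # the key is assumed to be the first column

# demo: an in-memory SQLite table standing in for the MariaDB `vat` table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vat (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO vat VALUES (?, ?)", [(i, i * 0.5) for i in range(25)])
rows = list(iter_keyset(conn, "vat", "id", batch=10))
print(len(rows))  # → 25
```

Each round trip only ever holds `batch` rows in memory, at the cost of one indexed range scan per chunk.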
Update: even without the for loop, the execute itself takes a long time. What I'm ultimately trying to do is copy the data from the MariaDB server into an SQLite database.
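For the MariaDB-to-SQLite copy itself, the pattern I would aim for is streaming in batches rather than row by row: `fetchmany` pulls a bounded chunk from the source cursor and `executemany` writes it in one transaction per batch. A sketch of that pattern, demonstrated with two SQLite connections so it is runnable as-is; with pyodbc the source cursor exposes the same `fetchmany` API, and the `vat(id, amount)` schema is an assumption.

```python
import sqlite3

def copy_in_batches(src_cur, dst_conn, insert_sql, batch=10000):
    """Stream rows from src_cur into dst_conn in fixed-size batches."""
    copied = 0
    while True:
        rows = src_cur.fetchmany(batch)
        if not rows:
            break
        dst_conn.executemany(insert_sql, rows)
        dst_conn.commit()  # one transaction per batch keeps memory bounded
        copied += len(rows)
    return copied

# demo: both source and destination are SQLite here for self-containment
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE vat (id INTEGER PRIMARY KEY, amount REAL)")
src.executemany("INSERT INTO vat VALUES (?, ?)", [(i, float(i)) for i in range(1000)])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE vat (id INTEGER PRIMARY KEY, amount REAL)")

cur = src.execute("SELECT id, amount FROM vat")
n = copy_in_batches(cur, dst, "INSERT INTO vat VALUES (?, ?)", batch=128)
print(n)  # → 1000
```

This only helps on the write side, though; it does not fix the slow `execute` on the MariaDB side if the ODBC driver insists on buffering the full result set first.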