
I wrote a stored procedure in SQL Server that gets passed 4 parameters. I want to check the first parameter, @table_name, to make sure it uses only whitelisted characters, and if not, write the attempt to a table in a different database (I have DML permissions on both).

If the name is good, it works fine, but if not, Python returns

TypeError: 'NoneType' object is not iterable

(which is expected and fine by me, as there is no such table), but it doesn't write to the log table it was supposed to write to, and that table stays locked until I shut down the program.

When I run the stored procedure from SSMS with the same parameters, it works perfectly and writes to the log table successfully.

create_str = "CREATE PROC sp_General_GetAllData (@table_name AS NVARCHAR(MAX),\
              @year AS INT, @month AS INT, @pd AS INT) 
              AS 
              BEGIN 
                  SET NOCOUNT ON; 
                  DECLARE @sql NVARCHAR(MAX) # for later uses
                  IF @table_name LIKE '%[^a-zA-Z0-9_]%' 
                  BEGIN
                  # the 'log' table allows NULLs and the fields are used in other cases
                      INSERT INTO [myDB].[mySchema].[log]
                      VALUES (SUSER_SNAME(), NULL, NULL, NULL, NULL, NULL, 
                              'INVALID TABLE NAME: ' + @table_name, GETDATE()) 
                      RAISERROR ('Bad table name! Contact your friendly DBA for details', 0, 0)
                      RETURN
                  END 
              # do some things if the @table_name is ok...
              END"
cursor = sql_conn.execute(create_str)
cursor.commit()

# calling the SP from python - doesn't write to the log table, which then stays locked
query = "{CALL sp_General_GetAllData (?, ?, ?, ?)}"
data = pd.read_sql(query, sql_conn, params=['just*testing', 2019, 7, 2])

-- calling the SP from SSMS - works fine
EXEC sp_General_GetAllData 'just*testing', 2019, 7, 2

Because of the "*" inside the first parameter, the procedure is expected to insert a row into [myDB].[mySchema].[log]. That happens when I call the SP from SSMS, but not from Python. Why?

SOLUTION:
With some luck I found out that the problem was that when the SP was called from Python, the INSERT INTO statement was simply never committed; it just sat there waiting for a commit. The solution was to pass autocommit=True to the pyodbc.connect() call.
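
A minimal sketch of the fix (the driver, server, and database names here are placeholders, not from the original post):

import pyodbc
import pandas as pd

# autocommit=True makes pyodbc commit every statement as soon as it runs,
# so the INSERT into the log table is not left pending in an open transaction
sql_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myServer;DATABASE=myDB;Trusted_Connection=yes;",
    autocommit=True,
)

query = "{CALL sp_General_GetAllData (?, ?, ?, ?)}"
data = pd.read_sql(query, sql_conn, params=['just*testing', 2019, 7, 2])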

marc_s
Aryerez
  • Have you checked with the profiler from SSMS what exact command is sent to the database? – GuidoG Aug 21 '19 at 14:37
  • I don't think embedded python comments will work in a SQL Script but I could be wrong. I would try changing to `/* comment */` format or `-- comment` format. And it probably gets "stuck" because you're in the middle of a BEGIN / END block that never completes. – Jacob H Aug 21 '19 at 14:46

3 Answers


I had the same issue; however, adding autocommit=True to pyodbc.connect() didn't solve my problem. I solved it by adding the following command after the INSERT statement:

COMMIT
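
For reference, a sketch of where this COMMIT would sit inside the procedure from the question; the @@TRANCOUNT guard is my own addition, not part of this answer (it avoids error 3902 when the procedure runs with no open transaction, e.g. from SSMS):

IF @table_name LIKE '%[^a-zA-Z0-9_]%'
BEGIN
    INSERT INTO [myDB].[mySchema].[log]
    VALUES (SUSER_SNAME(), NULL, NULL, NULL, NULL, NULL,
            'INVALID TABLE NAME: ' + @table_name, GETDATE())
    IF @@TRANCOUNT > 0
        COMMIT -- commit the transaction the driver opened implicitly
    RAISERROR ('Bad table name! Contact your friendly DBA for details', 0, 0)
    RETURN
END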
Hasan Zafari

Try this link. Maybe your problem comes from the configuration of your MySQL connection. With SQLAlchemy, the MySQL connector, or other connectors, autocommit is disabled by default.

About MySQLdb conn.autocommit(True), Mika's Answer
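
A minimal sketch with MySQLdb, assuming that connector (host, user, and database names are placeholders):

import MySQLdb

conn = MySQLdb.connect(host="localhost", user="me", passwd="secret", db="mydb")
conn.autocommit(True)  # MySQLdb leaves autocommit off by default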

M E S A B O

I was using pyodbc with the pandas read_sql method and ran into this problem.

For me, the solution was to commit and close the pyodbc connection at the end of the Python script.

import pyodbc
import pandas as pd

conn = pyodbc.connect('your connection details')
df = pd.read_sql("EXEC YOUR_SP", conn)
conn.commit()
conn.close()
Beno