
I am trying to create a function that will accept a DataFrame and write its contents into a SQL Server table.
I am stuck as to what needs to go into the execute call for the values of the insert query below.

df - the DataFrame to insert
desttable - the destination table the DataFrame should be written into
tablecols - a list of the destination table's column names

    # Insert DataFrame to Table
    def InsertintoDb(self, df, desttable, tablecols):
        tablecolnames = ','.join(tablecols)
        qmark = ['?' for s in tablecols]
        allqmarks = ','.join(qmark)
        #rowappendcolname = ','.join(['row.' + s for s in tablecols])
        for index, row in df.iterrows():
            cursor.execute(
                '''INSERT INTO [Py_Test].[dbo].''' + desttable + ''' ( ''' + tablecolnames + ''')VALUES (''' + allqmarks + ''')''',
                )
        self.conn.commit()


Any help is much appreciated.
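The loop currently calls cursor.execute with only the SQL string, so nothing ever supplies the row values for the ? placeholders. A rough sketch of passing them as a parameter tuple (assuming the pyodbc cursor from the snippet above and that every name in tablecols is a column of df):

    for index, row in df.iterrows():
        # Build the parameter tuple in the same order as the column list
        values = tuple(row[col] for col in tablecols)
        cursor.execute(
            'INSERT INTO [Py_Test].[dbo].' + desttable +
            ' (' + tablecolnames + ') VALUES (' + allqmarks + ')',
            values)
    self.conn.commit()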

Hi, pandas has a function named df.to_sql; the only extra thing you need is SQLAlchemy. Here is a link to a question with a good example (https://stackoverflow.com/questions/25661754/get-data-from-pandas-into-a-sql-server-with-pyodbc). Try this, good luck – Evert Acosta Jun 09 '21 at 01:59

1 Answer


As suggested by the gentleman in the comment, I was able to do it using df.to_sql. Here is the working code:

    import sqlalchemy
    from urllib.parse import quote_plus as urlquote


    class DbOps:
        def __init__(self):
            self.username = ''
            self.password = ''
            self.ipaddress = 'localhost'
            # self.port = 5439
            self.dbname = ''

            # Build the SQLAlchemy engine from the SQL Server login details,
            # URL-encoding the password so special characters don't break the URL
            self.engine = sqlalchemy.create_engine(
                f"mssql+pyodbc://{self.username}:{urlquote(self.password)}@{self.ipaddress}/{self.dbname}"
                "?driver=SQL+Server+Native+Client+11.0")

        def InsertintoDb(self, df, desttable, tablecols):
            # to_sql builds and runs the INSERT statements itself, so tablecols is no longer used
            df.to_sql(desttable, self.engine, index=False, if_exists='append')
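A quick usage sketch; the credentials above and the table name and DataFrame below are placeholders rather than anything from the original post:

    import pandas as pd

    db = DbOps()
    sample = pd.DataFrame({'Name': ['Alice', 'Bob'], 'Age': [30, 25]})
    # to_sql creates the table if it does not exist and appends the rows otherwise;
    # tablecols is kept only so the original call signature still works
    db.InsertintoDb(sample, 'Py_Test_Employees', ['Name', 'Age'])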